By Simon Davies
I spent nearly twenty years as a university lecturer unsuccessfully trying to figure out a way to make authentication a fun and popular topic, and only now have I cracked the problem. It’s all about “pull” and “push”, and making people in call centres read out hilariously stupid words from their screen. I call it “negotiated authentication”, but more on that later. First, some background.
As a rule there’s far too much “pull” of personal information going on. A safer mutual authentication system that provides confidence to both parties will involve much more “push” of information, allowing the other party to confirm the accuracy of known data, rather than being required to disclose the information up-front. For example, why on earth should I be handing over my address details to an alleged health provider when surely it’s they that should be telling me my address so I know they’re not a fraudster? We live in strange times.
Good progress has been made on technologies for Web-based and electronic transactions, allowing the consumer (or the consumer’s machine) to be more certain that an organisation is what it claims to be. However, most phone-based interactions between organisations and customers still rely on instinct and trust, with individuals required to give out bits of their personal data to a purported company or agency. Even if the technology were available, it would be only a partial answer to the much larger problem of social engineering by fraudsters. As Bruce Schneier points out: “If you think technology can solve your security problems, then you don’t understand the problems and you don’t understand the technology.”
Keep an eye out for organisations that move from a pull to a push system. They’re often the ones that are trying to put their security promises into action. For example, Barclays Bank recently adopted a new push-based telephone security system. It reads out four possible dates of birth (month and year) before asking the customer to choose one and then supply a day of birth. The odds against a fraudster randomly guessing a correct combination of month and year are around 200 to one. The problem, of course, is that there are countless ways to reduce those odds – even by using publicly available demographic data. And once a customer starts giving out personal and identity information, more often than not they’ll continue giving it out.
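The arithmetic behind the 200-to-1 figure isn’t spelled out, but one plausible reading – purely illustrative, and the year window is my assumption, not Barclays’ published method – is a fraudster guessing a birth month and a birth year within a roughly seventeen-year band of plausible ages:

```python
# Illustrative arithmetic only -- an assumed reading of the "200 to one"
# figure, not Barclays' actual calculation.
months = 12
plausible_years = 17  # assumed window of likely birth years for the customer

combinations = months * plausible_years
print(combinations)  # 204 month/year combinations, i.e. odds of roughly 200 to 1
```

The point the column makes survives either way: demographic data (a voice that sounds sixty, a first name fashionable in a particular decade) shrinks that year window dramatically, and the odds collapse with it.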
A scarily large percentage of the population use the same password for all purposes, so asking a customer for a password for, say, a book retailer may provide the same password used for a bank.
But what if the customer were allowed to turn the tables by challenging the calling organisation to produce information that only a legitimate supplier would know? I recently tried such a method with T-Mobile, which has always made a habit of calling me from a variety of numbers if I’m a few days late with my mobile phone payment.
Having become heartily sick of arguing the toss over who should provide information first, I discussed the dilemma with a helpful call centre manager and we figured out a solution.
It turns out that nearly all call centre systems have a “special instructions” field that allows operators to add useful comments about calling times, payment instructions or special customer requirements. Into this field we placed a word. I must say it’s the most hilariously camp and silly word ever to grace an otherwise tedious financial management system, but it is memorable and unique. Now whenever I’m called by T-Mobile I demand that they read out the word before they get any information from me. The mutuality works for both of us, and we all get a giggle.
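The mechanism is simple enough to sketch in a few lines. This is a minimal illustration of the idea, not any real call-centre system; the record layout and the names (`CustomerRecord`, `special_instructions`, `caller_is_legitimate`) are mine, and the challenge word is an invented stand-in, not the one on my account:

```python
# Sketch of "negotiated authentication": the customer agrees a word with the
# organisation in advance, stored in the call-centre system's free-text
# "special instructions" field. On any inbound call, the operator must read
# the word back before the customer discloses anything.

from dataclasses import dataclass


@dataclass
class CustomerRecord:
    account_id: str
    special_instructions: str  # holds the agreed challenge word


def caller_is_legitimate(record: CustomerRecord, word_read_out: str) -> bool:
    """Return True only if the caller can produce the agreed word.

    The comparison ignores case and surrounding whitespace, because the
    word is spoken aloud over the phone, where casing is meaningless.
    """
    return word_read_out.strip().lower() == record.special_instructions.strip().lower()


record = CustomerRecord(account_id="12345", special_instructions="Flibbertigibbet")
print(caller_is_legitimate(record, "flibbertigibbet"))  # True: caller knew the word
print(caller_is_legitimate(record, "password"))         # False: hang up
```

The design choice worth noting is that the secret flows from the organisation to the customer – push, not pull – so a cold-calling fraudster learns nothing by asking, and the customer gives nothing away by refusing.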
It’s an infectious idea, and now my friends and I try to implement the system with every organisation that calls us. Sometimes we meet with success, sometimes not. HSBC, for example, refused point-blank to institute the idea, even though they have a special instructions field. Their reasoning is that any new field of data presents unknown security risks. They could well be right, but neither have they come up with an alternative to their current fallible system.
This is a small step to resolving a much larger problem, but it’s a potentially useful measure that will help overcome the current imbalance in the authentication arena. And importantly, it will help strengthen consumer trust by establishing a more genuinely mutual process.