Oldest ‘new’ technology is science of the future

People can get a little scared of new technologies. Mobile phones were supposed to give you cancer; they were an invasion of privacy because you were suddenly available when you were out walking – and now everyone carries one. Likewise biometrics: intrusive, based on science fiction and fantasy, downright sinister. And at first many people thought the technology was simply too complex.

To an extent that’s true; the first time many people saw a retinal scan in action was in the James Bond film Never Say Never Again, and to be honest things didn’t end well for the person being scanned. But is it really so new?

The very first time most people use biometrics to identify someone is when they are born: they start to recognise their parents by their faces and voices before they have developed enough to rationalise the process. For a more formal example of identification through the body, you could look to the palm prints and hand smears found in cave paintings; these are no less biometric than retinal scans or signatures.

Yes, signatures. Nobody talks about signatures as biometric, and indeed they are too easily forged to be particularly useful in very sensitive areas. But if someone can be identified from their gait or eye movements, then their signature, assuming there is no deliberate attempt to disguise it, has to count among the more basic examples of an individual marker produced by bodily gesture.

Biometric data in identification is in fact nothing new at all. The ancient Babylonians used fingerprints to sign business transactions in around 500 BC, although they could not have been aware of their complete uniqueness. Sherlock Holmes is famous for being among the first to use prints as an identifier, which is some achievement for an entirely fictional person (it is Sir Arthur Conan Doyle’s teacher, Dr Joseph Bell, who ends up with most of the credit for indirectly slotting them into the Holmes canon).

It would be glorious to think that this is where fingerprints started as a means of identification; real life is more prosaic. It is a documented fact that Dr Henry Faulds published a paper on the individuality of prints in 1880 and offered the theory to the police in 1886, only to have it rejected. Holmes first came into print in 1887.

Using, say, retinal scans is more sophisticated and safer because nobody can cover their retina with a prosthetic as they might a finger. The principle, however, remains the same: nothing says you are “you” more eloquently than, well, you. We may come to wonder whether the 20th century will be remembered as the time when, instead of looking at someone to confirm their identity, people were made to carry around images or bits of plastic – images and cards which are, of course, easier to forge than an actual human body.

The human species is, in fact, massively diverse, and we’re only just beginning to realise in how many ways this is true. We have gone as far as DNA profiling, which can identify an individual from the smallest traces, but only now are we starting to look at the walk, the glance, the nervous tic, and to realise that these can be as distinctive as any other identifier a person happens to have.

How to distil these into provable elements is the challenge the scientific community is starting to overcome. One thing is certain, though: the distinctiveness of a person’s body and habits is not new technology in itself. It’s as old as the human race, and therefore shouldn’t be scary in the least.