The idea of wearable technology is nothing new and can be traced back to the 19th century, as Paul Lamkin reports.
Academia is littered with forward-thinkers predicting a communication revolution centred on devices worn on the self, while the popular media have been throwing up ideas of how these gadgets might look and operate for decades.
But we’re finally at a point where these prophecies are ringing true. The recent unveiling of the Apple Watch is just an early mainstream breakthrough for an industry that may well be in its infancy, but one that has been bubbling away for more than 60 years and is set to explode like no technological revolution before it.
A recent report from CCS Insight forecast that the wearable market was set to expand from 9.7 million device shipments in 2013 to 135 million in 2018; and IT analysts Canalys gave weight to those forecasts by revealing that the wearable smartband market had rocketed an incredible 684 per cent, year-on-year, for the first half of 2014.
But while these figures suggest that we’re reaching a tipping point for the wearable tech industry, where novelty gives way to necessity, the signs of a genuine paradigm shift for the connected self haven’t just sprung up in the last couple of years since the likes of Google and Apple realised the potential.
Computing for the masses
In the late 1980s, Mark Weiser, chief scientist at Xerox PARC, was credited with coining the phrase “ubiquitous computing”. He suggested that a third wave of computing was beginning to take shape: a movement that would see personal technology escaping the confines of the desktop and receding into the background of our lives.
In 2006, American author Adam Greenfield suggested that Mr Weiser’s intent was that technology could extend functions such as information-sensing, processing and networking to things not considered hi-tech at that point, such as clothing. It’s a suggestion Don Norman, a cognitive science academic, had put forward in 1999, when he said ambient technology would become more personal and wearable, and that new devices would allow us to interact unconsciously with embedded environmental technology.
For anyone who’s ever worn a Misfit Shine or a Fitbit Flex for days on end without ever really noticing it was there, while it recorded an array of personal information, it’ll seem obvious that the time is now.
It was around this era that visions of wearable technology were being widely sketched out in the mass media. And it’s surprising just how accurate some of those forecasts turned out to be.
Glimpses of the future
Watch an episode of Knight Rider and you’ll see Michael Knight interacting with his sensor-laden wristband, which packs in voice-control functionality and is not all that different from the Sony SmartBand Talk. Or check out Back to the Future Part II and you’ll witness Marty McFly Junior donning a JVC-branded virtual reality (VR) headset not dissimilar to the Gear VR device Samsung recently unveiled to the market.
It’s entirely possible that the perception of wearable tech, and with it the public’s appetite for its potential, was fuelled by these eighties and nineties media portrayals, and that TV and cinema essentially shaped the culture of wearable devices in the real world. After all, many of the engineers currently employed in the R&D labs of Silicon Valley and beyond would have grown up with these images engraved in their subconscious.
But you have to go back further than the 1980s for examples of wearables in both real life and the media. The 1927 Plus Four Wristlet Route Indicator would help you navigate using moveable scroll map cartridges, almost 90 years before Google Now started directing people on Android Wear smartwatches. And the 1960 Telesphere Mask was patented by Morton Heilig some 53 years before Oculus attempted to take his idea of a virtual reality headset to the masses.
The idea of a smartwatch for making and receiving calls predates even that. The wrist-radio worn by Dick Tracy, one of his comic strip’s most recognisable icons, was first seen in 1946. And on American television, The Jetsons were streaming live TV on their watches as early as 1962.
Right place at right time
The driving force behind the boom in wearable technology, of course, is the advancement in hardware, both in terms of affordability and practicality. Earlier this year, ARM Holdings chief executive Simon Segars told tech website CNET that even small companies could create hit products now because the sensors and low-power processors required for wearable devices were so readily available. “Because it is inexpensive to put some of these products together, it does open the door for new companies,” he said.
The current crop of wearable technology genres – fitness trackers, smart bands, smartwatches, VR headsets and augmented reality (AR) spectacles – isn’t necessarily new. In fact, many of these genres, and some of their biggest-selling products, are many generations old.
The difference now is that there is an appetite in the mass market for them, coinciding with a point in time where the hardware is making media and academic prophecies a much more realistic proposition. As Mr Weiser suggested, ubiquitous computing would only succeed once devices were unobtrusive enough to recede into the background of our lives.
Moving with the times
Of course, the very definition of wearable technology is changing all the time. The term is evolving at an incredible pace as a seemingly never-ending conveyor belt of new form factors emerges. Surely hearing aids, which have been around since 1898, and headphones, popularised in the 1980s during the Walkman boom, are both examples of existing, successful wearable technology?
Of course they are, and it’s pleasing to see these old masters of the wearable world influencing the new breed of connected devices. Soundhawk, a Cupertino-based neighbour of Apple, recently unveiled its smart-listening device – a hearing aid designed even for people with good hearing, boasting adaptive audio processing capable of cutting through background noise and elevating just the sounds you need to hear.
And the Avegant Glyph has taken the basic premise of audio headphones and given it a visual twist for the 21st century. The Glyph is a head-mounted display that promises a visual experience of a realism and clarity you’ve never had before. There’s no screen; pictures are projected directly on to your eyes from a low-powered LED, presenting smooth, pixel-free visuals – a far cry from the tinny headphones Sony boxed with its original Walkmans.
The idea of wearable technology is in no way a new one, but we’re finally arriving at a point where years of research and development are paying dividends; where the hardware is both accomplished and economically viable enough for the visionaries’ conceptions to enter mass production, and for the results to permeate the mainstream consumer consciousness.
So while you may be hearing a lot more about wearable technology of late, it’s worth remembering that what we’re seeing now is merely the first fruits of a cultural and social shift that’s been a long time coming.
And you thought wearables were just a fad.