Artificial intelligence could bring speed and awareness to healthcare

Olivia is patient and tireless, with soft brown eyes and a gentle bedside manner. She wears the light blue scrubs and fob watch of an NHS nurse, which in fact she is. But Olivia is also software: a virtual avatar, accessed through a smartphone app.

Currently on trial with NHS Dudley Clinical Commissioning Group in the West Midlands, one of the vanguard groups working on the NHS’s non-emergency 111 service, Olivia will check your symptoms, give advice on treatment and help schedule an appointment, whether that’s in English, Spanish or Dutch; she speaks all three, as well as Czech and Japanese.

Olivia, developed by Sensely, is able to schedule GP appointments, check symptoms and give advice on treatment

Is it the same as talking to a live nurse? “We would all like that attention,” says Ivana Schnur, co-founder of Sensely, the Californian startup behind the app. “But the sad truth is our systems can’t afford it.”

Until recently, computers weren’t good at this kind of messy interaction. They could respond to commands, but only in the exact form they were programmed to understand. Now that’s beginning to change. New techniques in applied mathematics and computer science are being used to create machines that can learn without being explicitly programmed – that can, in effect, think for themselves. Applied to healthcare, it’s an insight with potentially revolutionary consequences.
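
To make the contrast concrete, here is a minimal sketch of “learning without being explicitly programmed”: a classifier that infers the mapping from free-text complaints to advice categories from labelled examples, rather than from hand-written rules. The toy data, labels and scikit-learn pipeline below are assumptions for illustration, not any vendor’s real system.

```python
# Minimal sketch: a model learns the text-to-advice mapping from
# labelled examples instead of hand-coded rules. Toy data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

examples = [
    ("i have a pounding headache and feel sick", "see_gp"),
    ("my throat is sore and i keep sneezing", "self_care"),
    ("crushing pain in my chest and left arm", "emergency"),
    ("slight cough for two days, otherwise fine", "self_care"),
    ("sudden weakness on one side of my face", "emergency"),
    ("dull headache that won't go away after a week", "see_gp"),
]
texts, labels = zip(*examples)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)  # the "learning" step: no rules written by hand

# Classify a phrasing the system has never seen before.
print(model.predict(["tight pain in my chest when climbing stairs"]))
```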

Take the initial assessment of symptoms, known in the profession as triage. When most people feel unwell or injure themselves, they take one of three steps: they check the internet, call 111 or turn up at Accident and Emergency. The first is unreliable; the second and third are costly. Hence the appeal of automated healthcare.

“The most expensive part of healthcare is the human being,” says Ali Parsa, chief executive of Babylon. “The only way to solve the supply-and-demand issues so many health services face is to leverage artificial intelligence, or AI.”

Speed and accuracy

Babylon’s AI-powered health app checks patients’ symptoms and medical background against a vast database of diseases. The service is currently being used in two hospitals in Essex, with what Mr Parsa claims are human-beating results. “We’re 91 per cent accurate,” he says. “That makes us about 17 per cent more accurate in tests than a nurse and 14 per cent more accurate than a GP.”
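
Babylon has not published how its engine works, but the underlying idea of checking reported symptoms against a disease database can be sketched with a simple overlap score. The disease table and scoring rule below are invented for illustration and bear no relation to Babylon’s actual system.

```python
# Hypothetical illustration: rank candidate conditions by how much of
# each condition's symptom profile the patient reports.
DISEASES = {
    "common cold": {"cough", "sneezing", "sore throat", "runny nose"},
    "influenza":   {"fever", "cough", "aches", "fatigue", "sore throat"},
    "migraine":    {"headache", "nausea", "light sensitivity"},
}

def rank_conditions(reported):
    """Score each condition by the fraction of its symptoms reported."""
    reported = set(reported)
    scores = {
        name: len(reported & symptoms) / len(symptoms)
        for name, symptoms in DISEASES.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Influenza ranks first (3 of its 5 symptoms reported).
print(rank_conditions({"fever", "cough", "sore throat"}))
```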

This is the promise of AI – speed and accuracy at scale. For patients, that means remote check-ups, even in areas with relatively little healthcare provision; Babylon is working on making its system available in Rwanda. For doctors, it brings much-needed assistance in keeping track of records and making clinical decisions. IBM Watson – the supercomputer that crushed human champions at Jeopardy! in 2011 – is being used by clinicians at the Memorial Sloan Kettering Cancer Center in New York, where it is fed with papers and patient records related to cancer. The resulting data is then made available to hospitals and clinics around the world.

Health app solutions desired by UK consumers

In June, Watson diagnosed a 60-year-old woman’s rare form of leukaemia after oncologists at the University of Tokyo had puzzled over the illness for months. Watson sifted through 20 million research papers to come up with the correct diagnosis. The entire process took ten minutes.

Data too complex for humans

What makes these machine-learning systems so powerful is their ability to find patterns in datasets too large and complex for human brains to comprehend. In healthcare, where data is plentiful in volume but in practice fragmented and poorly connected, the range of possibilities is tremendous. Add in data from digital interactions and it becomes positively dizzying.

DeepMind is working with Moorfields Eye Hospital to develop an AI system to spot sight-threatening conditions in OCT scans

One recent project from Microsoft Research Labs showed that search engine logs could be used to anticipate diagnoses of pancreatic cancer before they were made. Another, from the startup HealthRhythms, uses AI in smartphones to track signifiers of bipolar disorder, monitoring everything from movement to speed of typing. “AI is perfect for making sense of the thousands of data points our app collects each day,” says Mark Matthews, HealthRhythms’ co-founder.
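
A heavily simplified sketch of this kind of monitoring, assuming daily typing-speed averages and an arbitrary two-sigma threshold (HealthRhythms’ actual models are not public):

```python
# Flag days where a behavioural signal, here typing speed, deviates
# sharply from its recent baseline. Threshold and window are arbitrary.
from statistics import mean, stdev

def flag_shifts(daily_speeds, window=14, threshold=2.0):
    """Yield (day_index, speed) where speed deviates more than
    `threshold` standard deviations from the trailing baseline."""
    for i in range(window, len(daily_speeds)):
        baseline = daily_speeds[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(daily_speeds[i] - mu) > threshold * sigma:
            yield i, daily_speeds[i]

# Two weeks of steady readings, then an abrupt jump on day 14.
speeds = [41, 39, 40, 42, 38, 40, 41, 39, 40, 42, 41, 40, 39, 41, 58]
print(list(flag_shifts(speeds)))  # [(14, 58)]
```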

Medical imagery is especially amenable to machine learning. In July, Moorfields Eye Hospital in London announced that it was working with Google’s AI research division, DeepMind, to develop an AI system to spot sight-threatening conditions in digital scans of the eye.

“We’ve millions of scans,” says Pearse Keane, a consultant ophthalmologist at Moorfields, who is leading the study. “But the amount of imaging data is far outstripping our ability as clinicians to derive the maximum benefit from it.” DeepMind’s system will be trained on Moorfields’ library. The study is in its early stages, but Dr Keane is optimistic it will soon be able to identify cases which warrant a referral.
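
A study like this typically trains a supervised image classifier on labelled scans. The sketch below, using PyTorch, shows the general shape of such a pipeline; the folder layout, labels, network and hyperparameters are all assumptions for illustration, not DeepMind’s actual system.

```python
# Minimal sketch of training an image classifier on labelled scans.
import torch
import torch.nn as nn
from torchvision import datasets, transforms

# Hypothetical layout: oct_scans/<label>/<image>.png,
# e.g. labels "urgent_referral" and "routine".
data = datasets.ImageFolder(
    "oct_scans",
    transform=transforms.Compose([
        transforms.Grayscale(),          # OCT scans as single-channel
        transforms.Resize((128, 128)),
        transforms.ToTensor(),
    ]),
)
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

model = nn.Sequential(                   # small convolutional network
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, len(data.classes)),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                   # fit the model to the labels
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```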

DeepMind, which was acquired by Google in 2014 for a reported $500 million, came to international attention when its AlphaGo AI defeated the world champion at the game of Go in March 2016. Now it is turning its attention to healthcare, with the launch in February of DeepMind Health. “In many areas of healthcare there are front-line clinicians saving lives every day using technologies that were developed a decade ago,” says Mustafa Suleyman, DeepMind’s co-founder and head of applied AI. “The margin for impact is unlike any other sector.”

Consumer attitudes towards virtual healthcare

The firm’s first venture into the field – a collaboration with London’s Royal Free NHS Trust to detect acute kidney injury – was marked by controversy, after complaints that the firm had access to private patient data. Mr Suleyman says the row stemmed from a “lack of understanding” about the role DeepMind was playing. “We’re basically at this point plumbers. We’re processing data on behalf of the controller,” he says.

Basic plumbing might seem an odd use of DeepMind’s capabilities, but Mr Suleyman says a great deal of it is necessary before the NHS is ready to reap the benefits of AI outside research projects. “The precursor to that is to build super-secure infrastructure, the likes of which haven’t been seen in healthcare,” he adds.

So when will the machines be able to take over? Mr Suleyman frowns. “For as long as I can see forward – more than ten years, maybe twenty years – there is no way a machine-learning algorithm is going to be making those decisions without human oversight,” he says. “In most other areas, humans combined with an algorithm is always the best combination. That’s the paradigm – human-assisted decision-making.” Everyone, meet Olivia. You’ll be spending a lot of time together.