How a team at Google is using AI to help doctors prevent blindness in diabetics.
Google research scientist Varun Gulshan was looking for a project that would meet a few criteria.
The project would utilise Gulshan’s background in developing artificial intelligence (AI) algorithms and stimulate his interest in science and medicine. And ideally, the project would help people in Gulshan’s home country of India.
He fired off an email to Phil Nelson, Director of Google Accelerated Science (GAS), asking if there was such a project in the works.
A few weeks later, Gulshan was opening a digital drive containing hundreds of anonymised retina scans from a hospital in India. Nelson thought that he had a project for Gulshan, but first he needed to know: could an artificial intelligence model learn to identify which of these images showed signs of a specific cause of blindness, a disease called diabetic retinopathy?
'This was like a perfect skill-set match for me', says Gulshan, whose background was working with AI to recognise hand gestures. 'When I looked at these images, I could tell that, OK, deep learning is kind of working well enough', he says. 'We could really use it on these problems.'
A Growing Concern
Home to 70 million people with diabetes, India has a growing problem with diabetic retinopathy. The disease creates lesions in the back of the retina that can lead to total blindness, and 18 per cent of diabetic Indians already have it. With 415 million diabetics at risk of blindness worldwide (the United States, China and India have the most cases), the disease is a global concern.
But the good news is that permanent vision loss is not inevitable. For those who are diagnosed early enough, medications, therapies, exercise and a healthy diet are highly effective at preventing further damage.
Awareness is a huge issue with diabetic retinopathy. Many diabetic patients assume that early signs of the disease are simply minor vision problems, according to Dr. Rajiv Raman, a retina surgeon at Sankara Nethralaya Eye Hospital in Chennai, India. With no word in Hindi for 'retina', just talking about the disease is a challenge. 'For cataracts we have a word, for glaucoma we have a word in Hindi as well as in Tamil, but diabetic retinopathy is – there is no translational word', Dr. Raman says.
But while an ophthalmologist can explain the disease and how regular exams will monitor its progress, the real difficulty is getting at-risk patients a retinal exam in the first place. For rural communities worldwide, the prevalence of late-stage diabetic retinopathy has more to do with infrastructure than medicine. The journey from home to the nearest specialist can be long, and keeping multiple appointments is often very difficult.
Patients living in poverty, often with dependents to support, can rarely put their own care first. Instead, they carry on until the effects of diabetic retinopathy become too severe to ignore, by which point it is often too late. 'Many of the rural patients have an advanced stage of diabetic retinopathy, but they don’t know they are diabetics', says Dr. Sheila John, head of teleophthalmology at Sankara Nethralaya. 'They are losing sight. In some cases they have lost vision in one eye, [and] the other eye we have to save.'
Assembling the Team
The biggest challenge with diagnosing diabetic retinopathy, however, is the sheer number of cases. India alone has 70 million diabetics who must be screened, and there just aren’t enough trained clinicians to review their retinal scans.
But it simply isn’t feasible for specialists to open practices in rural areas where only a few patients may reside, according to Dr. R. Kim, chief medical officer at Aravind Eye Hospital in Madurai, India. 'We need to screen them early on, when their vision is still good. So how do we do that?' Dr. Kim asks. 'Because it’s not humanly possible to screen these 70 million.'
If Google’s artificial intelligence could help make diagnosing diabetic retinopathy easier by accurately interpreting retinal scans, perhaps the eyesight of millions could be saved.
The tricky part was creating a data set for the AI model to learn from – a task which involved scoring and labelling all the scans one by one for different grades of severity. Solving that problem would eventually require a large team of ophthalmologists whose scoring of the scans would inform the AI model.
But the team would need more high-quality data if it was going to teach the AI model the nuances of truly reading a retinal scan.
Teaching the Model
At the outset, ophthalmologists at Aravind and Sankara Nethralaya helped the team label the retina images. Within a few short months, the model was trained to identify key markers of diabetic retinopathy, such as nerve tissue damage, swelling and haemorrhaging. And with a larger data set, Gulshan was sure that they could make the model even more accurate.
Enter Dr. Jorge Cuadros, head of the Eye Picture Archive Communication System (EyePACS), a telemedicine network connecting patients in rural areas across the United States to ophthalmologists for diabetic retinopathy scans. But patients seen by EyePACS still have to wait weeks for a graded scan, and Dr. Cuadros was happy to help any effort towards a faster diagnosis.
The data EyePACS shared covered a wide range of patients and amounted to a hundred times what the AI team had gathered to that point. That meant a huge labelling workload, because each image had to be graded multiple times to compensate for the biases of individual graders. 'The model learns what things they always did consistently', says Dale Webster, a software engineer at Google. 'This tends to result in something that’s a bit less biased and a bit more robust.'
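The idea of combining several graders' scores per image can be sketched in a few lines of Python. This is a minimal illustration, not the team's published method: the `aggregate_grades` function and the choice of the median as the aggregation rule are assumptions made here for clarity.

```python
from statistics import median

def aggregate_grades(grades_by_image):
    """Combine several ophthalmologists' grades per image into one label.

    Taking the median damps the influence of any single grader who is
    systematically harsher or more lenient than the rest.
    """
    return {image_id: median(grades)
            for image_id, grades in grades_by_image.items()}

# Three graders score two retina scans on the 1 (no signs) to 5 (extreme) scale.
labels = aggregate_grades({
    "scan_001": [2, 3, 2],   # graders mostly agree on mild signs
    "scan_002": [4, 5, 5],   # graders agree the disease is advanced
})
```

With a million-plus grades from dozens of graders, even a simple consensus rule like this tends to wash out individual quirks, which is the robustness Webster describes.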
To date, close to 100 ophthalmologists have rendered more than 1 million grades for the AI model.
How the AI Works
1. Over 50 ophthalmologists have manually reviewed more than 1 million anonymised retina scans, rating each for the level of diabetic retinopathy present.
2. Each scan is reviewed multiple times and graded manually on a scale of 1 (no signs of diabetic retinopathy present) to 5 (extreme signs present).
3. The graded images are then fed into an image recognition algorithm. Given thousands of graded images, the algorithm learns to recognise signs of diabetic retinopathy much as an ophthalmologist would.
4. Once the algorithm has been trained, it can power an application called the Automated Retinal Disease Assessment (ARDA), which lets a user upload a retina scan for instant analysis of diabetic retinopathy.
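The shape of that final step can be sketched as follows. Everything here is a hypothetical illustration: the article does not describe ARDA's model or its referral rules, so `fake_trained_model` is a trivial stand-in for the real deep network and the grade-to-recommendation thresholds are invented for the example.

```python
def fake_trained_model(scan_pixels):
    # Stand-in for the trained image recognition model. The real ARDA uses
    # a deep network over retina images; here we fake a 1-5 grade from the
    # mean intensity of a flat list of 0-255 pixel values.
    mean = sum(scan_pixels) / len(scan_pixels)
    return min(5, max(1, round(mean / 51)))

def assess(scan_pixels, model=fake_trained_model):
    """ARDA-style flow: grade an uploaded scan, then suggest a next step."""
    grade = model(scan_pixels)
    # Hypothetical referral policy keyed to the 1 (no signs) - 5 (extreme) scale.
    if grade <= 1:
        action = "no signs detected; rescreen at next routine visit"
    elif grade <= 3:
        action = "signs present; refer to an ophthalmologist"
    else:
        action = "severe signs; refer urgently for treatment"
    return grade, action
```

The point of the sketch is the division of labour the article describes: the model returns a grade instantly, and only the scans that need a specialist's attention go on to one.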
From Model to Device
For all the team members, the idea that they could turn this model into an actual Automated Retinal Disease Assessment (ARDA) device was the main reason for their involvement.
The key to that was another Google team member, Lily Peng. Trained as a medical doctor, Peng, like the rest of the Ophthalmology team, is driven by the prospect of creating an actual clinical impact.
'I saw that we had a lot of big ideas – a lot of promises, right?' she asks. 'But why do some of these never make it to the bedside?'
Peng had a vision that the ARDA could be used in a clinical setting – but getting to that point required trials and regulatory approval. To do this, the team focused on two goals: conducting a clinical trial to begin testing the ARDA in the real world, and writing a paper about the results for the Journal of the American Medical Association (JAMA).
'We wanted to go to JAMA because JAMA is about the practice of medicine', says Nelson. 'We didn’t just want to show that we could do this. We wanted to get on the map with doctors.'
Another part of getting the ARDA device on the map was presenting their work to the Food and Drug Administration (FDA). With Nelson at her side, Peng gave a 'virtuoso performance' on the virtues of AI. Peng was a key advocate and translator between the different communities involved in bringing the ARDA to life.
'She can speak all languages', Gulshan says, 'so she could talk to us and understand the technical complexities of what we were doing, and also what the doctors were speaking, and what is relevant in terms of impact. Lily brought that and made it into something that we can now think of putting into a clinic.'
A New Kind of Thermometer
No one on the Google team had any experience actually creating a medical device, so the team turned to Verily, a healthcare-focused Alphabet company (Alphabet also owns Google), to navigate the regulatory and clinical demands of getting the ARDA technology approved as a medical device.
Accepted into the FDA’s recently announced pre-certification pilot programme – one of only nine companies selected out of hundreds that applied to participate – Verily is using its expertise to help usher the ARDA through clinical trials in India. And so is Gulshan, who moved back to India to help doctors and nurses use the device.
'Getting regulatory approval is important', Peng says, 'but more important is that the clinicians working with us feel confident about what they’re doing and feel good about using the software. And so it’s not just about safety and effectiveness; it’s whether or not this is actually going to be helpful to them.'
In a recent clinical trial, the ARDA was used to grade the retinal images of 3,000 diabetic patients at two hospitals in India. Those grades were compared with doctors’ assessments, and the results confirmed the 2016 study reported in JAMA: the model performed on par with the healthcare workers already screening patients.
For Dr. Cuadros, the key benefit of the ARDA is simple maths. He notes that the percentage of people with diabetic retinopathy in the United States is going down, indicating that preventative treatment is working. But because the rate of diabetes is increasing, the overall number of diabetic retinopathy patients holds steady. So the number of people who need screening keeps rising, while the demand for treatment expertise stays flat.
And ophthalmologists feel the pinch.
In such conditions, inserting expertise into primary care is a huge benefit. 'If ARDA could be used in the primary care doctor’s office, it would make a huge difference, because you will be screening more patients', says Dr. Kim. 'So the ophthalmologist can focus on treating only those with retinopathy.'
In fact, Dr. Raman imagines a device that’s as common as a thermometer or even a glucometer, a diagnostic tool that diabetics already use to monitor their blood sugar. 'My job is not to screen for diabetic retinopathy', he says. 'My job is to do lasers, to do injections, to give – to really do surgeries and help them alleviate their blindness.'
But no matter the vector of diagnosis, all agree that awareness is key to health. In fact, getting a diabetic retinopathy diagnosis can lead to better outcomes overall. 'If you detect retinal disease at an early stage when they don’t need treatment', Dr. Cuadros says, 'it’s still an opportunity for the patient to understand that diabetes is beginning to affect their body. Hopefully that would motivate them to control their blood sugar better.'
A Diagnostic Advance
More studies are underway, including ongoing clinical trials in India – the first time that screenings will be performed at this level. And the Google and Verily teams are optimistic about the possibilities even beyond diabetic retinopathy. 'Since [the JAMA article] we have made even more progress', says Nelson. 'We recently published a paper in Nature Biomedical Engineering showing that from a retina image we can predict not only several cardiovascular health risk factors but also your risk of a significant cardiovascular event.'
One day, diagnosing serious diseases may be as easy as taking a temperature or checking blood pressure. But in the near term, millions of diabetics could keep their vision thanks to an AI algorithm helping doctors quickly spot diabetic retinopathy.