In his book ‘21 Lessons for the 21st Century’, historian Yuval Noah Harari talks about running many alternative algorithms on the same network, so that a patient in a remote jungle village can access through her smartphone not just a single authoritative doctor but a hundred different AI (artificial intelligence) doctors, whose relative performance is constantly being compared. “You don't like what the IBM doctor told you? No problem. Even if you are stranded somewhere on the slopes of Kilimanjaro, you can easily contact the Baidu doctor for a second opinion,” he writes. The benefits for human society are likely to be immense. Harari argues that AI doctors could provide far better and cheaper healthcare for billions of people, particularly those who currently receive no healthcare at all. Thanks to learning algorithms and biometric sensors, he says, a poor villager in an underdeveloped country might come to enjoy far better healthcare via her smartphone than the richest person in the world gets today from the most advanced urban hospital.
That future may not be far off, as many tech firms, including Google, are scaling up their efforts in this direction. Google Health has expanded its research and applications to focus on improving the care clinicians provide and enabling care to happen outside hospitals and doctors’ offices.
At its recently held event ‘The Check Up’, Google Health shared new areas of AI-related research and development, and how it is providing clinicians with easy-to-use tools to help them better care for patients. These included the potential of smartphone cameras and microphones to protect cardiovascular health and preserve eyesight.
One of the innovations is recording and interpreting heart sounds with smartphones. Google shared a new area of research that explores how a smartphone’s built-in microphones could record heart sounds when the phone is placed over the chest. Listening to someone’s heart and lungs with a stethoscope, known as auscultation, is a critical part of a physical exam. It can help clinicians detect heart valve disorders such as aortic stenosis, which is important to catch early. Screening for aortic stenosis typically requires specialised equipment, like a stethoscope or an ultrasound, and an in-person assessment.
“Our latest research investigates whether a smartphone can detect heartbeats and murmurs,” says Greg Corrado, Google’s Head of Health AI. “We're currently in the early stages of clinical study testing, but we hope that our work can empower people to use the smartphone as an additional tool for accessible health evaluation.”
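Google has not published the details of its detection method, but the generic signal-processing idea behind picking heartbeats out of a chest recording — tracking the amplitude envelope of the audio and flagging bursts of sound energy — can be sketched as follows. The sample rate, threshold factor and synthetic ‘lub-dub’ audio here are illustrative assumptions, not Google’s algorithm.

```python
import random

SAMPLE_RATE = 4000  # Hz; an assumed microphone sampling rate

def moving_average(xs, window):
    """Smooth a rectified signal into an amplitude envelope."""
    out, acc = [], 0.0
    for i, x in enumerate(xs):
        acc += x
        if i >= window:
            acc -= xs[i - window]
        out.append(acc / min(i + 1, window))
    return out

def detect_events(audio, window=200, k=3.0):
    """Return sample indices where the envelope rises above k x its mean."""
    envelope = moving_average([abs(x) for x in audio], window)
    threshold = k * (sum(envelope) / len(envelope))
    events, above = [], False
    for i, e in enumerate(envelope):
        if e > threshold and not above:
            events.append(i)  # rising edge = start of a heart sound
            above = True
        elif e <= threshold:
            above = False
    return events

# Demo: two seconds of low-level noise with two synthetic "lub-dub" beats,
# each beat being a pair of 25 ms bursts roughly 0.1 s apart.
random.seed(0)
audio = [0.01 * random.gauss(0, 1) for _ in range(2 * SAMPLE_RATE)]
for start in (1000, 1400, 5000, 5400):
    for i in range(start, start + 100):
        audio[i] += 0.5
events = detect_events(audio)
print(len(events))  # the four injected heart-sound bursts
```

A real system would of course work on genuine microphone input and use a trained model rather than a fixed threshold, but the envelope-and-threshold step is a common first stage in heart-sound segmentation.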
Google had previously shared how mobile sensors combined with machine learning can democratise health metrics and give people insights into their daily health and wellness. The feature that allows users to measure heart rate and respiratory rate with a phone’s camera is now available on over 100 models of Android devices, as well as on iOS devices.
Google is also partnering with Northwestern Medicine to apply AI to improving maternal health. Ultrasound, a noninvasive diagnostic imaging method, uses high-frequency sound waves to create real-time pictures or videos of internal organs and other tissues, such as blood vessels and foetuses. Research shows that ultrasound is safe for use in prenatal care and effective in identifying issues early in pregnancy. However, more than half of all birthing parents in low-to-middle-income countries do not receive ultrasounds, in part due to a shortage of expertise in reading them.
“We believe that Google’s expertise in machine learning can help solve this and allow for healthier pregnancies and better outcomes for parents and babies,” says Corrado. “We are working on foundational, open-access research studies that validate the use of AI to help providers conduct ultrasounds and perform assessments.”
For this, Google has partnered with Northwestern Medicine to further develop and test these models so they generalise across different levels of experience and technologies. With more automated and accurate evaluations of maternal and foetal health risks, Google hopes to lower barriers and help people get timely care in the right settings.
One of Google’s earliest Health AI projects, ARDA, aims to help address screenings for diabetic retinopathy — a complication of diabetes that, if undiagnosed and untreated, can cause blindness.
“Today, we screen 350 patients daily, resulting in close to 100,000 patients screened to date,” says Corrado.
In addition to diabetic eye disease, Google had previously shown how photos of the eye’s interior (the fundus) can reveal cardiovascular risk factors, such as high blood sugar and cholesterol levels, with the help of deep learning. Google’s recent research tackles detecting diabetes-related diseases from photos of the exterior of the eye, using existing tabletop cameras in clinics. Given the promising early results, Google is planning clinical research with partners, including EyePACS and Chang Gung Memorial Hospital (CGMH), to investigate whether photos taken with smartphone cameras can help detect diabetic and non-diabetic diseases from external eye photos as well. “While this is in the early stages of research and development, our engineers and scientists envision a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own homes,” says Corrado.
Google is also helping people in Brazil, India and Japan discover local, authoritative health content on YouTube. YouTube is adding health source information panels on videos to provide context that helps viewers identify videos from authoritative sources, and health content shelves that more effectively highlight videos from these sources when people search for specific health topics. “These context cues help people easily navigate and evaluate credible health information,” says Dr. Karen DeSalvo, chief health officer, Google.
Many young tech firms are also betting big on using smartphones to address cardiovascular and chronic respiratory diseases, hypertension and diabetes. Bengaluru-based health-tech platform MFine has brought blood pressure (BP) and glucose monitoring to the smartphone. In an industry-first breakthrough, MFine added BP and glucose monitoring to the suite of self-check health tools available on its app, eliminating the need for any external devices to measure and track these vitals. MFine is working on various next-generation AI technologies that convert any smartphone into a rich diagnostics and vitals-monitoring device. The company has built a proprietary algorithm that measures blood pressure by obtaining a photoplethysmogram (PPG) signal from the user’s fingertip when it is placed on the smartphone camera. By observing subtle changes in colour across the red and blue wavelengths of the PPG signal, it predicts BP and glucose levels. MFine says its machine learning algorithm, trained on data from thousands of patients, makes the prediction accurate: it can measure BP with close to 90 per cent accuracy.
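MFine’s algorithm itself is proprietary, but the first stage of any camera-based PPG pipeline — turning per-frame colour averages into a pulse waveform and extracting a heart rate — can be sketched as below. The frame rate, the synthetic waveform and the simple peak-counting heuristic are illustrative assumptions; estimating BP or glucose from such a signal requires the kind of trained model the company describes.

```python
import math

FPS = 30  # assumed camera frame rate (frames per second)

def synthetic_ppg(duration_s=10, bpm=72):
    """Toy PPG trace: each sample stands in for the mean red-channel
    brightness of one camera frame -- a cardiac pulse plus slow drift."""
    n = duration_s * FPS
    f = bpm / 60.0  # cardiac frequency in Hz
    return [0.5 * math.sin(2 * math.pi * f * t / FPS)
            + 0.05 * math.sin(2 * math.pi * 0.2 * t / FPS)  # baseline drift
            for t in range(n)]

def estimate_bpm(signal, fps=FPS):
    """Count local maxima above the signal mean and convert to beats/min."""
    mean = sum(signal) / len(signal)
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > mean
             and signal[i] > signal[i - 1]
             and signal[i] >= signal[i + 1]]
    duration_min = len(signal) / fps / 60.0
    return len(peaks) / duration_min

print(estimate_bpm(synthetic_ppg()))  # close to the simulated 72 bpm
```

Real fingertip signals are far noisier, so production systems filter the trace and validate signal quality before estimating vitals; the point here is only how a camera can yield a pulse waveform at all.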
“By enabling vitals monitoring through smartphones, MFine aims to make basic health assessments universal, easy and free to use for millions of people in India,” says Ajit Narayanan, chief technology officer, MFine.
Another tech company, Qure.ai, uses deep learning and AI tools to make healthcare more accessible and affordable to patients around the world. Qure.ai’s solutions, which automatically interpret X-rays, CT scans and ultrasounds, focus on the diagnosis of pulmonary diseases and traumatic brain injuries (TBI). The technology improves both the accuracy and the efficiency of the radiology function in traditional hospital settings, and also makes leading-edge medical imaging services accessible to smaller, remote medical care teams with little or no in-house radiology capability.
“Every year our technology helps more than four million people across 50 countries,” says Prashant Warier, CEO and founder, Qure.ai. Warier says the company’s extensively tested and proven algorithms show how the power of data-driven, AI-based technology can save hundreds of thousands of lives across a variety of healthcare settings.
“We’re on the cusp of an exciting quantum leap forward in radiology and medical imaging,” says Warier. “Not only can we increase clinical accuracy, but we can also improve radiology department efficiencies by aiding in the triage process. In many cases, this equates to the difference between life and death, especially in the emergency room.”