Big Tech has already turned the humble telephone into a combination computer, camera, gaming system and library that fits in the palm of your hand. Now, Google is aiming to make smartphones even smarter by infusing them with medical diagnostic technology.
At The Check Up, its second annual health event held this week, Google Health introduced new research projects that would use artificial intelligence to turn smartphones into stethoscopes and other at-home disease-screening tools, Greg Corrado, the company’s head of health AI, wrote in a blog post.
The launch of these projects comes on the heels of the European debut of Google’s Derm Assist, a CE-marked mobile app introduced last year that uses AI to analyze three photos of a skin, hair or nail concern, along with answers to a self-reported survey, to provide a list of possible conditions.
One of the new programs builds on an early Google Health foray into medical AI, which analyzed clinical photos of the eye’s interior to screen for diabetic retinopathy and for cardiovascular risk factors such as elevated blood sugar and cholesterol levels.
Now, the company’s researchers will attempt to develop AI that can spot signs of illness, whether diabetes-related or otherwise, by looking only at external photos of the eye captured with a patient’s own smartphone camera, allowing people to be screened from the comfort of their own homes.
The other smartphone-based AI project introduced at The Check Up is exploring how the built-in microphones on a mobile device could be used to analyze heart and lung sounds when the device is placed over the chest. That research is already in the early stages of clinical testing, with an initial focus on developing algorithms that could detect heartbeats and murmurs.
Eventually, according to Corrado, the technology could help detect major heart valve disorders like aortic stenosis, replacing the specialized and often costly equipment, such as stethoscopes and ultrasound machines, that is currently required for diagnosis.
That research also builds on Google Health’s past work, including a feature added to the Google Fit app last year that assesses a selfie video of a user’s head and upper torso to check their heart and respiratory rate.
Another AI model unveiled at the event doesn’t repurpose smartphone technology, but it shares the other projects’ aim of making high-quality healthcare more accessible to all. In studies already underway in partnership with Northwestern Medicine, researchers are developing AI to read prenatal ultrasounds, targeting low- and middle-income countries where there is a shortage of healthcare professionals trained to interpret imaging data.
“With more automated and accurate evaluations of maternal and fetal health risks, we hope to lower barriers and help people get timely care in the right settings,” Corrado wrote of the project.