Google this week announced that it was working on plans to allow users to monitor their health using their smartphones. The tech giant said it wanted to find out whether allowing users to capture their heartbeats and eyeball images would help them detect medical issues from home.
According to Google’s head of health AI Greg Corrado, the company wants to investigate whether the smartphone’s built-in microphone can help detect heartbeats and murmurs when the device is placed over the chest. He said the readings could help detect heart valve disorders early.
“It’s not at the level of diagnosis but it is at the level of knowing whether there is an elevated risk,” Corrado said, noting questions remained about accuracy.
The eye research focuses on recognizing disorders from images, such as diabetes-related diseases. Google claimed it had seen “early positive results” in clinics using tabletop cameras and would now look into whether smartphone photographs would work as well.
Corrado added that his team could envision a future where people, with the help of their doctors, can better understand and make decisions about health conditions from their own home.
Google also intends to see whether its AI can analyze ultrasound scans performed by less-skilled clinicians who follow a predetermined pattern. This could help address shortages of highly skilled personnel and allow birthing parents to be assessed from their homes.
The developments come after Google introduced measurement of heart and breathing rates using smartphone cameras. The features, which were announced last year, are now available on a number of devices through the Google Fit app.
While Google has long pushed to apply its technical expertise to health care, the company has been tight-lipped about whether its initiatives are yielding significant income or usage.
According to Corrado, the capabilities being launched are “a major step” that will take time for users to embrace.
“When you think about breathing and heart rate, whatever level of adoption we see today only scratches the surface,” he said.