
AI could turn your phone into a mobile health lab

Your phone might soon get AI-assisted upgrades that benefit your health.

Google Health has introduced research projects that promise to turn smartphones into disease-screening tools. One promising avenue involves using a phone’s onboard microphone as a stethoscope to detect circulatory irregularities like heart murmurs. The innovations could be deployed through telehealth, sparing patients the time and expense of traveling to a doctor.

“At Google, we’re focused on unlocking the potential of everyday devices to support people’s health and wellness,” Greg Corrado, head of Health AI at Google, told Digital Trends in an interview. “Such developments are becoming possible thanks to improvements in sensors on mobile devices, as well as advances in AI, and we want to be at the forefront of these developments.”

Diagnosis through your phone

Steth IO heart rate monitoring. Steth IO

Google says it sees “early promising results” in using existing clinical cameras to detect diabetic eye disease. It plans to fund more clinical trials on the use of smartphones to do the same thing. The company uses Automated Retinal Disease Assessment, an AI engine, to process the images.

Corrado says mobile sensors combined with machine learning can give people insights into their daily health and wellness. The feature that lets you measure your heart rate and respiratory rate with your phone’s camera is now available on Android and iOS devices.
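
Camera-based heart rate measurement rests on a simple idea: each heartbeat slightly changes how much light a fingertip pressed against the lens absorbs, so the average brightness of the red channel in each video frame traces a pulse waveform. The sketch below is a minimal illustration of that principle, not Google’s implementation; the video filename is assumed, and it uses OpenCV and NumPy to find the dominant pulse frequency.

```python
# Minimal photoplethysmography (PPG) sketch: estimate heart rate from a
# fingertip video by tracking average red-channel brightness per frame.
# Illustrative only -- not Google's implementation. Assumes a pre-recorded
# clip ("fingertip.mp4") shot with the finger covering the camera lens.
import cv2
import numpy as np

def estimate_heart_rate(video_path: str) -> float:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    brightness = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV stores frames as BGR; channel 2 is red.
        brightness.append(frame[:, :, 2].mean())
    cap.release()

    samples = np.asarray(brightness) - np.mean(brightness)
    # Look for the dominant frequency between 0.7 Hz (42 bpm) and 3 Hz (180 bpm).
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # beats per minute

if __name__ == "__main__":
    print(f"Estimated heart rate: {estimate_heart_rate('fingertip.mp4'):.0f} bpm")
```

A production feature would add motion rejection, exposure locking, and validation against reference sensors; this sketch only shows why an ordinary camera can act as a pulse sensor at all.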

Three screenshots of the Google Fit app

Another area of research explores how a smartphone’s built-in microphone could record heart sounds when the phone is placed over the chest. Listening to someone’s heart and lungs with a stethoscope can help clinicians detect heart valve disorders. Screening for aortic stenosis typically requires specialized equipment, like a stethoscope or an ultrasound, and an in-person assessment.

“This newest area of research explores how a smartphone’s built-in microphone can record heart sounds when placed over the chest, and possibly detect heartbeats and murmurs,” Corrado said. “It’s important to note that we’re still in the beginning stages of research and development, but our hope is to provide doctors and patients with an additional tool for assessment and care.”
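
At a high level, finding heartbeats in a chest recording means locating the periodic “lub-dub” energy bursts in the audio; detecting murmurs would require a trained model on top of that. The sketch below illustrates only the beat-finding step, under assumptions of my own (a WAV file named chest_recording.wav and SciPy for filtering and peak picking); it is not based on Google’s research code.

```python
# Rough sketch: count heartbeats in a chest recording by band-pass filtering
# the audio to the range where heart sounds live (~20-200 Hz) and picking
# peaks in the signal envelope. Illustrative only; "chest_recording.wav" is
# an assumed input, and murmur detection would need a trained model on top.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt, find_peaks, hilbert

def estimate_bpm_from_heart_sounds(path: str) -> float:
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:                      # mix stereo down to mono
        audio = audio.mean(axis=1)
    audio = audio.astype(np.float64)

    # Band-pass to the low frequencies that carry the S1/S2 heart sounds.
    b, a = butter(4, [20 / (rate / 2), 200 / (rate / 2)], btype="band")
    filtered = filtfilt(b, a, audio)

    # The envelope of the filtered signal shows each beat as a burst of energy.
    envelope = np.abs(hilbert(filtered))
    # Enforce minimum spacing so the two sounds of one beat count roughly once.
    peaks, _ = find_peaks(envelope, distance=int(0.4 * rate),
                          height=envelope.mean() + envelope.std())
    duration_s = len(audio) / rate
    return 60.0 * len(peaks) / duration_s

if __name__ == "__main__":
    print(f"Estimated: {estimate_bpm_from_heart_sounds('chest_recording.wav'):.0f} bpm")
```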

Connor Landgraf, the CEO of Eko, which created the first AI stethoscope, told Digital Trends in an interview that it’s not completely clear yet how Google plans to roll the stethoscope feature out to consumers.

The Eko DUO ECG + Digital Stethoscope. Eko

“But a guess would be that Google plans to enable patients with some risk for heart disease to use the microphone on their smartphone to listen to their heart sounds and to share the auscultation data with their primary care physician,” Landgraf added. “This could allow clinicians to gather a patient’s cardiac data, which is mostly available during an in-person exam, remotely, from anywhere in the world.”

Landgraf said that having a stethoscope on a smartphone could make the tool more widely available. “It would be straightforward to reach a large number of patients through these apps,” he added. “However, it would also likely trade off some accuracy since the smartphone is not an ideal stethoscope. It will also likely require a Food and Drug Administration (FDA) review of the software, and the agency may consider the smartphone itself a medical device.”

Smartphone stethoscopes could also be a way to give patients more awareness and knowledge of their overall heart health, Landgraf said. A significant number of patients with heart valve disease are not diagnosed early enough by their primary doctor, and many patients are never treated.

“There is a tsunami wave of patients with undiagnosed cardiac conditions, and this technology could play a role in helping provide more knowledge to these patients,” Landgraf said.

Ultrasounds for everyone

The Butterfly iQ ultrasound device. Butterfly Network

Corrado said AI could also help make ultrasound more accessible to expectant parents in low- and middle-income countries. Ultrasound uses high-frequency sound waves to create real-time pictures or videos of internal organs or other tissues, such as blood vessels and fetuses. But more than half of all birthing parents in low-to-middle-income countries don’t receive ultrasounds due to a shortage of expertise in reading them.

Google is working on using AI to help providers conduct ultrasounds and perform assessments. “With more automated and accurate evaluations of maternal and fetal health risks, we hope to lower barriers and help people get timely care in the right settings,” Corrado said in a blog post.

Hila Goldman-Aslan, the CEO of DiA Imaging Analysis, a provider of AI-powered ultrasound analysis solutions, told Digital Trends in an interview that AI can act as a second set of eyes for overworked doctors or provide diagnostic power that is simply impossible for humans.

“In our area of ultrasound tests, visually analyzing ultrasound images is subjective, error-prone, and highly dependent on the operator’s experience,” Goldman-Aslan said.


Elad Walach, the CEO of Aidoc, a provider of health care AI solutions, said that for medical imaging, AI addresses the challenges of labor shortages and physician fatigue. AI can flag the positive cases in the radiologist’s workflow so that patients can get timely treatment.

“Radiologists are facing larger case volumes, which can lead to burnout,” Walach said. “More importantly, radiologists typically work chronologically down a worklist on a first-come, first-served basis, without necessarily always knowing if there are more urgent cases down the list that need addressing first.”
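
The workflow change Walach describes is essentially a shift from a first-come, first-served queue to a priority queue: studies the AI flags as likely urgent jump ahead of routine cases. The toy sketch below shows that reordering idea; the study IDs, scores, and threshold are invented for illustration, and this is not Aidoc’s software.

```python
# Toy sketch of AI-assisted worklist triage: instead of reading studies in
# arrival order, sort them so cases a model flags as likely urgent come first.
# Study IDs, scores, and the threshold are made up for illustration only.
from dataclasses import dataclass

@dataclass
class Study:
    study_id: str
    arrival_order: int     # position in the first-come, first-served queue
    ai_urgency: float      # model score in [0, 1]; higher = more likely urgent

URGENT_THRESHOLD = 0.8     # assumed cutoff for flagging a case

def triaged_worklist(studies: list[Study]) -> list[Study]:
    """Flagged cases first (highest score first), then everything else FIFO."""
    flagged = sorted((s for s in studies if s.ai_urgency >= URGENT_THRESHOLD),
                     key=lambda s: s.ai_urgency, reverse=True)
    routine = sorted((s for s in studies if s.ai_urgency < URGENT_THRESHOLD),
                     key=lambda s: s.arrival_order)
    return flagged + routine

if __name__ == "__main__":
    queue = [
        Study("CT-001", 1, 0.12),
        Study("CT-002", 2, 0.93),   # flagged: reads before earlier routine cases
        Study("CT-003", 3, 0.05),
        Study("CT-004", 4, 0.86),
    ]
    for s in triaged_worklist(queue):
        print(s.study_id, s.ai_urgency)
```

The point of the reordering is turnaround time for the sickest patients, not replacing the radiologist’s read: every study still gets reviewed, just in a different order.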

But Ronald Dixon, a physician and the CEO of CareHive, told Digital Trends in an interview that he doesn’t think that the Google health tools will have much of an impact on patients.

The focus of the Google effort has “really been on how you use technology to better diagnose or better manage patients without necessarily an understanding of the value of what that technology might provide,” he said.

Dixon said he’s concerned the use of AI could raise the costs of health care for consumers.

“If the cost goes up because you’re using technology to try to solve a problem, you’re not actually helping the health care system because you’re actually hurting it in that way,” he added. “You’re causing us to spend more money to likely get to the same outcome. And the reason why they do that is that they’re technology companies. So it’s a ‘technology-first’ strategy as opposed to a ‘clinical need-first’ strategy.”
