Google’s AI Can Predict Potential Heart Disease By Looking at the Eyes

The eyes are windows to the soul – or to heart disease, according to Google. Using artificial intelligence (AI) and machine learning, experts at Verily (Google’s health-tech subsidiary) have developed a way of assessing heart disease risk by analyzing scans of the back of patients’ eyes.

Initial tests show the AI’s accuracy is roughly on par with the leading methods in use today. As well as predicting the likelihood of heart disease, the AI can estimate a patient’s risk of suffering a heart attack or other major cardiac event. No blood tests are required, which is another big bonus. However, the AI still has a way to go before it reaches a clinical setting.

To train the AI, scientists from Google and Verily analyzed the medical data and eye scans of more than 300,000 patients. Neural networks then set to work looking for patterns, learning what the signs of cardiovascular risk look like in these scans. Looking into the eyes for signs of poor health is not a new practice and draws on a large body of established research.
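
The article gives no implementation details, but the broad approach it describes, training a neural network on retinal scans labeled with patient outcomes, can be sketched roughly as follows. This is a minimal illustration in Python with TensorFlow/Keras; the directory layout, image size, backbone network, and binary risk labels are assumptions made for demonstration, not the actual Google/Verily pipeline.

```python
# Minimal sketch: train a convolutional network on retinal fundus images to
# predict a binary cardiovascular-risk label. All paths, sizes, and labels
# here are hypothetical placeholders, not the Google/Verily setup.
import tensorflow as tf

IMG_SIZE = (299, 299)  # assumed input resolution
BATCH_SIZE = 32

# Assumes images organized as fundus_scans/high_risk/*.png and
# fundus_scans/low_risk/*.png so labels can be inferred from folder names.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "fundus_scans",
    label_mode="binary",
    image_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
)

# A pretrained InceptionV3 backbone stands in for whatever architecture
# the researchers actually used.
backbone = tf.keras.applications.InceptionV3(
    include_top=False,
    weights="imagenet",
    pooling="avg",
    input_shape=IMG_SIZE + (3,),
)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # scale pixels to [-1, 1]
    backbone,
    tf.keras.layers.Dense(1, activation="sigmoid"),  # predicted risk probability
])

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.AUC(name="auc")],
)

model.fit(train_ds, epochs=5)
```

A real system would of course need held-out validation data, careful labeling of outcomes, and far more training than this handful of epochs.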

Google’s algorithm already has an accuracy of 70 percent, which is not bad considering that SCORE, the method currently used to predict heart disease risk, has an accuracy of 72 percent. The company is receiving plenty of praise and support for the new AI.
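
The article does not say how those percentages were computed; one common way to compare risk-prediction models is the area under the ROC curve, which measures how well a model separates patients who went on to have a cardiac event from those who did not. A toy sketch with invented outcomes and risk scores (not the study’s data) looks like this:

```python
# Toy comparison of two risk models using ROC AUC. The outcomes and risk
# scores below are invented for illustration only.
from sklearn.metrics import roc_auc_score

# 1 = patient later suffered a major cardiac event, 0 = patient did not
outcomes = [0, 0, 1, 0, 1, 1, 0, 1, 0, 1]

# Hypothetical predicted risks from two different models
risk_model_a = [0.10, 0.20, 0.80, 0.30, 0.65, 0.90, 0.25, 0.55, 0.40, 0.70]
risk_model_b = [0.15, 0.35, 0.60, 0.20, 0.70, 0.85, 0.45, 0.40, 0.30, 0.75]

print("Model A AUC:", roc_auc_score(outcomes, risk_model_a))
print("Model B AUC:", roc_auc_score(outcomes, risk_model_b))
```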

Alun Hughes, a professor of Cardiovascular Physiology and Pharmacology at UCL in London, said the approach is credible but that further testing is needed before it is ready for use with the public. “There’s a long history of looking at the retina to predict cardiovascular risk,” he said.

Instead of simply replicating existing diagnostic tools, Google’s AI finds new ways to analyze the medical data it is presented with. Now it’s largely a matter of collecting more data, and Google is already on to that with its Project Baseline Study, which aims to gather the medical records of 10,000 individuals over the next four years. Until then, it’s unlikely we’ll see any form of diagnosis that doesn’t require some kind of human involvement. But that doesn’t mean it won’t be here one day.

Source: The Verge