Google outlines its plans to harness AI in healthcare
Having recently rebranded its Google Research division as Google AI, the technology giant used its 2018 I/O developer conference to reveal its ongoing plans to harness artificial intelligence and machine learning across a multitude of consumer technologies. Health technology is gaining particular traction, and the company continues to expand apace in this space.
Through a recent partnership with Fitbit, the company will seek to use its Cloud Healthcare API, built on the Fast Healthcare Interoperability Resources (FHIR) standard, to connect user data with electronic medical records (EMRs).
Following on from Fitbit’s recent acquisition of Twine Health (now rebranded as the Fitbit Health Platform), the collaboration will also open doors for the duo to provide solutions for those with chronic, long-term health conditions.
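The Cloud Healthcare API exchanges data as FHIR resources, so a wearable reading such as a heart rate measurement would typically arrive as a FHIR Observation. The sketch below shows roughly what that could look like in Python; the endpoint path, project names, patient reference and access token are placeholder assumptions for illustration, not details of Fitbit’s or Google’s actual integration.

```python
import json
import requests

# Hypothetical FHIR endpoint for a Cloud Healthcare API FHIR store;
# the project, dataset and store names below are placeholders.
FHIR_BASE = (
    "https://healthcare.googleapis.com/v1/projects/my-project"
    "/locations/us-central1/datasets/my-dataset/fhirStores/my-store/fhir"
)

# A minimal FHIR-style Observation for a resting heart rate reading,
# as a wearable device might report it.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{"system": "http://loinc.org", "code": "8867-4",
                    "display": "Heart rate"}]
    },
    "subject": {"reference": "Patient/example-patient-id"},
    "effectiveDateTime": "2018-05-09T07:00:00Z",
    "valueQuantity": {"value": 62, "unit": "beats/minute"},
}

# An OAuth 2.0 access token would normally come from Google Cloud's auth
# libraries; a placeholder string stands in for it here.
headers = {
    "Authorization": "Bearer ACCESS_TOKEN",
    "Content-Type": "application/fhir+json",
}

resp = requests.post(f"{FHIR_BASE}/Observation",
                     headers=headers, data=json.dumps(observation))
print(resp.status_code)
```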
“At Google, our vision is to transform the way health information is organised and made useful,” explained Gregory Moore MD, PhD, Vice President, Healthcare, Google Cloud.
With this in mind, the company has unveiled new consumer health apps and a new smartwatch, all of which will integrate with healthcare organisations’ EMR systems.
"Artificial intelligence and machine learning are not just technologies of the future. I can assure you that the best artificial intelligence and machine learning in the world is already in your pockets," noted Greg Corrado, Principal Scientist and Director of Augmented Intelligence Research at Google.
In a recent blog post, the company explained how it is working to recognise patterns, or “signals”, in EMRs, using AI to drive data-driven health outcomes, though this remains a complex feat. Working in partnership with a number of healthcare providers, including UC San Francisco, Stanford Medicine and the University of Chicago, Google has sought to apply deep learning models to de-identified electronic medical records in order to make key predictions about patients in hospital.
“For each prediction, a deep learning model reads all the data-points from earliest to most recent and then learns which data helps predict the outcome. Since there are thousands of data points involved, we had to develop some new types of deep learning modelling approaches based on recurrent neural networks (RNNs) and feedforward networks. We engineered a computer system to render predictions without hand-crafting a new dataset for each task, in a scalable manner,” Google explained.
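As a rough illustration of the kind of sequence model described in that quote, the sketch below reads a patient’s coded EMR events in time order with a recurrent layer and passes the final state through a small feedforward head to produce an outcome probability. The architecture, vocabulary size and dimensions are assumptions made for illustration (here in PyTorch), not Google’s published model.

```python
import torch
import torch.nn as nn

class EMRSequenceModel(nn.Module):
    """Reads a patient's EMR events in time order and predicts one outcome."""

    def __init__(self, vocab_size=5000, embed_dim=64, hidden_dim=128):
        super().__init__()
        # Each coded event (diagnosis, lab result, medication...) gets an embedding.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Recurrent layer reads the events from earliest to most recent.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Feedforward head turns the final hidden state into a single logit.
        self.head = nn.Sequential(
            nn.Linear(hidden_dim, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, event_codes):
        # event_codes: (batch, sequence_length) integer event IDs
        x = self.embed(event_codes)
        _, last_hidden = self.rnn(x)           # (1, batch, hidden_dim)
        logits = self.head(last_hidden.squeeze(0))
        return torch.sigmoid(logits)           # probability of the outcome

# Toy usage: a batch of two de-identified patient timelines, 10 events each.
model = EMRSequenceModel()
fake_timelines = torch.randint(0, 5000, (2, 10))
print(model(fake_timelines))
```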
Additionally, by using the area under the receiver operating characteristic curve (AUROC) to measure the accuracy of its predictions, Google has been able to gauge how well the models separate patients who go on to experience a given outcome from those who do not. However, the company has been keen to stress that such technology is not set to replace medical professionals, but rather to enhance the patient experience.
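For context, the metric referred to above, the area under the receiver operating characteristic curve, scores how well a model’s predicted probabilities rank positive cases above negative ones, with 1.0 being perfect and 0.5 no better than chance. The labels and scores in the snippet below are invented purely to show the calculation.

```python
from sklearn.metrics import roc_auc_score

# Known outcomes (1 = event occurred, 0 = it did not) and the model's
# predicted probabilities for a handful of de-identified patients.
y_true = [0, 0, 1, 1, 0, 1]
y_score = [0.10, 0.35, 0.80, 0.62, 0.20, 0.91]

# 1.0 means positives are always ranked above negatives; 0.5 is chance level.
print("AUROC:", roc_auc_score(y_true, y_score))
```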
“The model is more like a good listener rather than a master diagnostician,” it adds.