Google outlines its plans to harness AI in healthcare

By Catherine Sturman

Having recently rebranded its Google Research division as Google AI, the technology giant revealed at its 2018 I/O developer conference its ongoing plans to harness artificial intelligence and machine learning across a multitude of consumer technologies. Health technology in particular is gaining traction, and the company continues to grow apace in this space.

Recently partnering with Fitbit, the company will seek to use its Cloud Healthcare API, built on Fast Healthcare Interoperability Resources (FHIR), to connect user data with electronic medical records (EMRs).
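To make the FHIR connection concrete, the sketch below builds a minimal, hypothetical FHIR "Observation" resource for a wearable heart-rate reading, of the kind such an integration might push alongside a patient's record. The field values are illustrative assumptions; a real integration would follow the full FHIR specification and the Cloud Healthcare API's own documentation.

```python
import json

# Hypothetical FHIR Observation for a single heart-rate reading from a
# wearable. Values are illustrative, not from Google's or Fitbit's systems.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "8867-4",            # LOINC code for heart rate
            "display": "Heart rate",
        }]
    },
    "valueQuantity": {
        "value": 72,
        "unit": "beats/minute",
        "system": "http://unitsofmeasure.org",
        "code": "/min",
    },
    "effectiveDateTime": "2018-05-08T10:30:00Z",
}

print(json.dumps(observation, indent=2))
```

Because every vendor emits the same resource shape, an EMR system can ingest wearable data without bespoke per-device parsing, which is the interoperability FHIR is designed to provide.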

Following on from Fitbit’s recent acquisition of Twine Health (now rebranded as the Fitbit Health Platform), the collaboration will also open doors for the duo to provide solutions to those with chronic, long-term health conditions.

“At Google, our vision is to transform the way health information is organised and made useful,” explained Gregory Moore MD, PhD, Vice President, Healthcare, Google Cloud.

With this in mind, the company has unveiled new consumer health apps and a new smartwatch, all of which will integrate with healthcare organisations’ EMR systems.


"Artificial intelligence and machine learning are not just technologies of the future. I can assure you that the best artificial intelligence and machine learning in the world is already in your pockets," noted Greg Corrado, Principal Scientist and Director of Augmented Intelligence Research at Google.

In a recent blog, the company explained how it is working to recognise patterns, or “signals”, in EMRs, utilising AI to drive data-driven health outcomes. This has, however, remained a complex feat. Working in partnership with a number of healthcare providers, such as UC San Francisco, Stanford Medicine and the University of Chicago, Google has sought to implement deep learning models that make essential predictions about patients in hospital using de-identified electronic medical records.

“For each prediction, a deep learning model reads all the data-points from earliest to most recent and then learns which data helps predict the outcome. Since there are thousands of data points involved, we had to develop some new types of deep learning modelling approaches based on recurrent neural networks (RNNs) and feedforward networks. We engineered a computer system to render predictions without hand-crafting a new dataset for each task, in a scalable manner,” Google explained.
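The quoted approach, a recurrent model reading a record's data points from earliest to most recent before scoring an outcome, can be sketched in miniature. This is not Google's model; it is a toy recurrent network with made-up dimensions and untrained weights, purely to show the "read the whole timeline, then predict" pattern the quote describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: each de-identified record is a time-ordered
# sequence of event vectors (labs, vitals, notes encoded as features).
n_features, hidden_size = 8, 16

# Randomly initialised weights stand in for a trained model.
W_in = rng.normal(scale=0.1, size=(hidden_size, n_features))
W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
w_out = rng.normal(scale=0.1, size=hidden_size)

def predict_risk(events: np.ndarray) -> float:
    """Read all data points from earliest to most recent,
    then return a probability-like risk score for the outcome."""
    h = np.zeros(hidden_size)
    for x in events:                       # earliest -> most recent
        h = np.tanh(W_in @ x + W_h @ h)    # recurrent state update
    logit = w_out @ h
    return float(1.0 / (1.0 + np.exp(-logit)))  # sigmoid -> (0, 1)

# A toy patient record: 20 time steps of 8 features each.
record = rng.normal(size=(20, n_features))
print(predict_risk(record))
```

The key design point the quote makes is scalability: the same network reads whatever events a record happens to contain, so no hand-crafted dataset is needed per prediction task.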

Additionally, by utilising the area under the receiver operating characteristic curve (AUROC) to measure the accuracy of its predictions, Google can gauge how well its models distinguish patients who go on to experience a given outcome from those who do not. However, the company has been keen to stress that such technology is not set to replace medical professionals, but rather to enhance the patient experience.

“The model is more like a good listener rather than a master diagnostician,” it adds.

 
