Cognitive health care in 2027

Harnessing a data-driven approach in personalized health care

“Precision medicine,” or care that is highly personalized to each person’s genome, is likely to revolutionize the health care of the future. Cognitive technologies will play a pivotal role, as handling the enormous amounts of data—one of the imperatives of cognitive health care—requires much more than “artisanal” analytic capabilities.

Introduction

The next 10 years will likely see a revolution in the use of cognitive technologies for health care. Admittedly, the industry has not been a leader in the use of data and analytics. Multiple disconnected systems, poor data quality, and difficult-to-change patient and provider behaviors have long been among the challenges of health care information. “Imprecision medicine” has generally been the rule. But there are clear signs of change in the $3 trillion US health care industry1 that we believe will come to fruition over the next decade. And from an information technology perspective, cognitive technologies are probably the only resource that can make that revolutionary vision for personalized health care possible.

More precision for providers, payers, and life sciences firms

“Precision medicine” is the shorthand term—adopted by the US government and many others—for care that is highly personalized to each individual’s genome, behavior, and social and environmental factors. For the federal government (specifically the National Institutes of Health and the Centers for Medicare & Medicaid Services), the primary focus is a data set of 1 million or more individuals whose information on all of these factors is captured and analyzed. Other precision medicine initiatives are underway for specific groups (such as the “Million Veterans” initiative for military veterans) and diseases (such as the Oncology Precision Network for cancer), and the United Kingdom has a “100,000 Genomes Project” to sequence that many patient genomes. These initiatives focus primarily on health care providers, helping them develop the treatment approaches that are most effective for individual patients. Genomics-driven medicine has already had some initial successes in fields like cancer.2

There are similarly transformational efforts underway among payers and life sciences companies. Payers for health care—both governmental and private-sector—also have incentives to change their business models. Precision health care rests on value-based care, rewarding outcomes rather than the volume of treatments, which should ultimately lead to better results. And not only acute and chronic care but also disease prevention plans can become personalized and more effective. Some payers are already working on individualized care plans, and some have formed new business units to create and market them.

Precision medicine will likely be just as revolutionary for life sciences companies, which have already developed some drugs that are designed for specific genome profiles. Many of these companies are also analyzing genomic, metabolic, and clinical data to identify biomarkers that can facilitate early diagnosis of diseases and indicate whether a particular drug will prove effective on a particular individual. One consortium of hospitals, researchers, and a startup, for example, is conducting “Project Survival” to identify effective biomarkers for pancreatic cancer.3 In other firms, real-world data sources are being used to identify molecules that might be particularly effective (or ineffective) in clinical trials. Another key focus is on using machine learning to quicken drug development processes and help predict the most fruitful molecules and compounds. Some AI-based startups are developing their own new drugs based on extensive clinical data analysis; others are partnering with “big pharma” firms or university researchers.

What’s making all this possible?

As in other domains of data science, the availability of digital data is key to precision medicine. Each human genome, for example, contains about a gigabyte of data before compression; proteome and biome data would increase that amount dramatically. Over a million humans have had their genomes sequenced, and the number is growing rapidly as the price falls. Medical devices, mobile phones, activity trackers, and health sensors generate data continuously. Electronic medical records (EMRs) accumulate patient data over a lifetime. The availability of all these data makes it possible—albeit still difficult—to personalize diagnoses and treatment plans to the individual level, and necessitates new data aggregation, storage, and modeling approaches. Traditional “artisanal analytics” cannot power precision medicine effectively.
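To make the scale concrete, a back-of-envelope sketch of genome storage is easy to compute. The figures below are illustrative approximations (roughly 3.2 billion base pairs per human genome, two bits per base A/C/G/T before compression), not authoritative constants:

```python
# Back-of-envelope estimate of raw genome storage.
# Assumed figures (illustrative): ~3.2 billion base pairs per genome,
# 2 bits per base (A/C/G/T) before compression.
BASE_PAIRS = 3_200_000_000
BITS_PER_BASE = 2

def genome_bytes() -> int:
    """Approximate uncompressed size of one genome, in bytes."""
    return BASE_PAIRS * BITS_PER_BASE // 8

def cohort_terabytes(n_people: int) -> float:
    """Approximate storage, in terabytes, for a cohort of sequenced genomes."""
    return n_people * genome_bytes() / 1e12

per_genome_gb = genome_bytes() / 1e9             # ~0.8 GB per genome
million_cohort_tb = cohort_terabytes(1_000_000)  # ~800 TB for a 1M-person data set
```

Even before adding proteome, biome, device, and EMR data, a million-person cohort is already in petabyte-adjacent territory, which is why traditional single-analyst tooling falls short.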

Machine learning is one of the most common techniques for dealing with large volumes of rapidly changing data. It accommodates a variety of statistical algorithms, can involve a large number of highly granular models, and can quickly generate new models for new data. It can be used to predict (disease onset, for example), to detect patterns in data (a drug’s effects on populations or individuals, for example), or to classify (patient subpopulations, for example). Machine learning can also be used for the prosaic but important task of combining data across disparate sources. To create a “Patient 360” view, for example, a “predictive matching” approach can aggregate records for the same patient (with slightly different names and addresses) that may be federated across multiple databases.
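A minimal sketch of that predictive-matching idea follows. The field weights and threshold here are hypothetical, and a production system would use trained probabilistic models rather than raw string similarity, but the mechanism—score candidate record pairs and group those above a threshold—is the same:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec_a: dict, rec_b: dict) -> float:
    """Weighted similarity across identifying fields (weights are illustrative)."""
    weights = {"name": 0.5, "address": 0.3, "dob": 0.2}
    return sum(w * similarity(rec_a[f], rec_b[f]) for f, w in weights.items())

def link_records(records: list, threshold: float = 0.85) -> list:
    """Group records that likely describe the same patient (naive O(n^2) pass)."""
    groups = []
    for rec in records:
        for group in groups:
            if match_score(rec, group[0]) >= threshold:
                group.append(rec)
                break
        else:
            groups.append([rec])
    return groups

# Hypothetical records federated across two source systems.
db = [
    {"name": "Jon A. Smith", "address": "12 Oak St", "dob": "1970-01-02"},
    {"name": "John A Smith", "address": "12 Oak Street", "dob": "1970-01-02"},
    {"name": "Mary Jones", "address": "99 Elm Ave", "dob": "1984-07-30"},
]
groups = link_records(db)  # the two Smith records land in one group
```

Real master-patient-index systems add blocking (to avoid comparing every pair), field-specific comparators for dates and addresses, and learned match probabilities, but this conveys why the task is statistical rather than a simple key join.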

While generating complex models, some machine learning approaches allow for some degree of transparency—information on why the model is suggesting a certain course of action—which is often important in prescriptive or diagnostic models in health care. Patients and physicians are unlikely to accept “black box” recommendations. Less transparent forms of machine learning like neural networks and deep learning also have a role in cognitive health care. Deep learning-based image recognition, for example, is being used to identify potentially cancerous lesions in medical imaging, and to identify abnormalities and pathologies in cells and tissues.4 Since the identification is often a first-level screening, the interpretability of outcomes is typically less of an issue than in some areas of medicine.
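To illustrate why transparency matters, here is a toy points-based risk score in which every recommendation can be traced back to the factors that produced it. The factors, weights, and threshold are entirely hypothetical and are not a validated clinical instrument:

```python
# Hypothetical, illustrative risk factors and point weights -- not a
# validated clinical instrument of any kind.
RISK_FACTORS = {
    "age_over_60": 2,
    "smoker": 3,
    "family_history": 2,
    "abnormal_lab": 4,
}

def score_patient(patient: dict) -> tuple:
    """Return (total score, contributing factors) so the recommendation
    can be explained rather than delivered as a black box."""
    contributing = [(f, w) for f, w in RISK_FACTORS.items() if patient.get(f)]
    return sum(w for _, w in contributing), contributing

def explain(patient: dict, threshold: int = 5) -> str:
    """Produce a human-readable recommendation along with its reasoning."""
    total, factors = score_patient(patient)
    reasons = ", ".join(f"{name} (+{pts})" for name, pts in factors) or "no flagged factors"
    action = "refer for screening" if total >= threshold else "routine follow-up"
    return f"score {total} ({reasons}): {action}"

msg = explain({"age_over_60": True, "abnormal_lab": True})
```

A neural network might score more accurately, but it cannot produce the `reasons` string above without an added explanation layer, which is exactly the trade-off clinicians and patients weigh when deciding whether to trust a model.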

Among payers, there is growing interest in technologies that address member and patient engagement. These approaches employ some form of natural language processing (NLP). NLP-based “chatbots” or intelligent virtual agents can be used to answer common patient questions, issue reminders about treatments and appointments, and capture subjective patient conditions. Among some providers, NLP is also being used to extract meaning from unstructured text, such as clinical notes or research articles.
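The core loop of such a virtual agent can be sketched in a few lines. The intents and canned answers below are hypothetical, and a production system would use a trained NLP model with clinically reviewed content, but intent matching plus a safe escalation path is the essential pattern:

```python
import difflib

# Hypothetical intents and canned answers -- a production virtual agent would
# use a trained NLP model and clinically reviewed response content.
INTENTS = {
    "refill prescription": "To refill a prescription, use the patient portal or call the pharmacy.",
    "appointment reminder": "Your next appointment details are available in the portal.",
    "side effects": "Common side effects are listed on the medication leaflet; call us if they are severe.",
}

def answer(question: str, cutoff: float = 0.3) -> str:
    """Match a patient question to the closest known intent; escalate otherwise."""
    match = difflib.get_close_matches(question.lower(), list(INTENTS), n=1, cutoff=cutoff)
    return INTENTS[match[0]] if match else "Let me connect you with a nurse."

reply = answer("How do I refill my prescription?")
```

Note the fallback: when no intent matches with sufficient confidence, the agent hands off to a human rather than guessing, a design choice that matters far more in health care than in retail chatbots.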

What’s standing in the way?

While the technical feats we have discussed are possible today—at least as pilots or proofs of concept—it’s not easy to execute them at scale, and it’s likely even more difficult to deploy them in mission-critical care settings. First, cognitive technologies are still evolving and pose real technical challenges. Some organizations have found it very difficult, for example, to use them for ambitious medical objectives like predicting cancer or creating customized treatment pathways. Still, we expect that many of these technical barriers will be sufficiently addressed within a decade.

Second, a substantial amount of integration is required with existing systems, particularly in health care provider organizations. Cognitive solutions should be embedded into providers’ EMRs and revenue cycle systems, and those systems themselves will have to evolve and become more open to accommodate new cognitive functionality effectively.

Another long-term challenge to be addressed by the life sciences and health care industry is collaboration and integration of data. There is no consensus within the industry about who owns the data about a patient, for example, and who can do what with them. The industry should consider more collaborations and partnerships across sectors and organizations. To some degree, this is already happening. Project Survival, for example—an effort to find a pancreatic cancer biomarker—involves collaboration among a big data drug development startup (Berg Health), an academic medical center (Beth Israel Deaconess in Boston), a nonprofit (Cancer Research and Biostatistics), and a network of oncology clinicians and researchers (the Pancreatic Research Team). It seems likely that this form of cancer will eventually succumb to such a collaborative effort.

Many health care organizations will likely face the same talent shortages with cognitive technologies that organizations in other industries encounter. There is a general lack of skills to engineer cognitive solutions. Costs are falling for software and hardware, but experienced developers and system architects still tend to be expensive and difficult to hire. This problem is particularly pronounced among providers, many of which lack the resources to attract top data science talent. In addition to the talent that develops cognitive applications, we’ll also need a generation of clinicians who are comfortable with smart machines augmenting their knowledge.

All of these challenges are surmountable. Over the next decade, universities will churn out more talented people, more collaborations will be created, and cognitive software is expected to become more capable and easier to use. We see a bright future for leaders and organizations wishing to harness a data-driven approach to help treat and cure disease in human populations. It will be an exciting and constructive time to be in the field of cognitive health care.

Written by Tom Davenport and Ravi Kalakota

*This article was originally published in Deloitte University Press on September 12, 2017.

Endnotes

  1. Robert Pear, “Health spending in U.S. topped three trillion last year,” New York Times, December 2, 2015.
  2. National Institutes of Health, “Impact of cancer genomics on precision medicine for the treatment of cancer,” NIH Cancer Genome Atlas website, accessed August 23, 2017.
  3. Beth Israel Deaconess Medical Center, “Project Survival: Finding a pancreatic cancer biomarker,” BIDMC website, accessed August 23, 2017.
  4. S. Kevin Zhou et al., eds., Deep Learning for Medical Image Analysis (Academic Press, 2017).
Comments

Kaneshwari Patil
Marketing Specialist at Data Dynamics
1 year ago

It's encouraging to see the healthcare sector evolving towards precision medicine. The integration of genomic, behavioral, and environmental data holds promise for tailored treatments and improved patient care. Exciting times indeed!

Chad Erickson

Practice Executive | Special Projects | Strategy | Simplicity in Healthcare for everyone

7 years ago

First, Tom, I loved the book Competing on Analytics! What a wonderful journey through the last 20 years of analytic growth that the everyman can consume and understand. Second, I have been playing in healthcare for a while, and one of the biggest challenges is that where healthcare starts, at the PCP and second-level specialist, is not where the most healthcare dollars are being spent, and it is the place least likely to even have decision support (didn't know that terminology until I read your book). But this is where early detection and intervention can make the most difference. Third, there are a whole host of laws, both state and federal, that keep physicians from being able to collaborate. Fourth, physicians are very independent by nature, and resistant to change. This is my every day; I am out talking to these doctors and their administrators. But like you, I believe the next big jump in efficiency is going to come from not replacing the physician but empowering them. I fear the groups that don't in the next ten years will go the way of Blockbuster.

Fred Dempster

**RETIRED *** Helping People Succeed

7 years ago

Yes - Addressing the big and little in healthcare - exponentially more complex than any other industry I've touched.

Tiran Dagan

Strategy, Transformation & Alliances Executive | Sales Management & Revenue Optimization | Partner & Alliance Management | Strategic & Financial Planning | Offering & Product Lifecycle Management

7 years ago

Tom: I believe the role for cognitive computing is not just in hard-to-crack diagnostic services. It has applications in the most fundamental aspects of healthcare delivery. Detecting billing anomalies for fraud, or helping a private practice get a handle on its billing, is one area that creates a waterfall effect on the entire fee-for-service delivery model. Using machine learning to understand billing patterns and detect whether an EOB decline was unreasonable, or (on the payer side) to combat fraud, are areas of huge impact. That is why your point about the integration of data and services is key: every EMR developer believes they have the end-all solution for EMR, and they offer very limited access APIs or sharing of their data sets. I took one look at the raw data from a private practice recently and immediately uncovered a $1M+ gap in billing that was not detectable through the regular menus of that popular web-based EMR. I have been exploring the use of cognitive computing for experience design, including work I did analyzing employee sentiment for Verizon Wireless that you might find interesting: https://www.dhirubhai.net/pulse/from-employee-reviews-quantitative-sentiment-cognitive-tiran-dagan/ Keep up the good work! Tiran
