How Healthcare AI Can Elevate the Patient Experience

What’s Trending: A Growing Number of Effective AI Applications Focus on Patient Engagement

Healthcare organizations are increasingly seeing the strategic benefits of AI applications that personalize the patient experience. These applications can create differentiated interactions, stronger engagement, and better health outcomes. And they are a good starting point for organizations early in their AI journeys—as long as leaders can successfully navigate the risks.

Why It Matters

Patient engagement use cases for AI span a wide range of interactions, with a similarly broad array of benefits for both the organization and the patients it serves. What do these AI tools look like in action? During a roundtable discussion earlier this month, panelists shared some examples:

  • Creating more frequent, tailored communications. “AI tools can build a stronger, stickier patient relationship over time,” said Jon Freedman, Partner in Chartis Digital. “Healthcare organizations are strapped for time and people, and it’s difficult to engage patients as frequently and smartly as they’d like.” AI tools can accelerate how often organizations communicate with patients and improve the quality of those messages. Organizations can tailor engagement to patient language preferences, reading and health literacy levels, and more. “This tailored engagement can enable patients to better adhere to their care plans and medications, and help health systems more proactively reach out to patients to address their care needs,” said Freedman. “It can empower the patient to be an active part of their care team and participate in the dialogue.”
  • Capturing important points from patient conversations. AI tools can distill conversations into key points—and that can go both ways. The patient can receive key takeaways from what their care team shares, and the care team can easily identify important points the patient raises, facilitating a more natural and meaningful conversation that captures the most important information. “There’s a natural language processing (NLP) opportunity for extracting important points from what the patient says in the dialogue, both during live interactions and submitted via unstructured text,” said Tom Kiesau, Chartis Chief Innovation Officer and Head of Chartis Digital. (A brief sketch of this idea follows the list below.)
  • Streamlining patient intake. “The patient intake process has been grounded in traditional rules and guidelines that require asking patients questions over and over again,” said Jody Cervenak, Chartis Informatics and Technology Practice Leader. “But so much information doesn’t need to be re-asked every time the patient enters a health system. Re-asking the questions can be a frustrating waste of time for both patients and the care team.” AI tools can help validate the patient history and current health state so the patient can have a meaningful discussion with their care team about what has changed since their last visit, and the care team can ask targeted questions to update or confirm existing information. “Having an up-to-date patient history also communicates an important message to the patient: We care about you, we care about your time, and we want to make sure that we’re talking to you about relevant things,” said Cervenak.
  • Balancing clinical advice with compassion. “Doctors are using AI to communicate with greater empathy,” said Kevin Phillips, Co-Founder and Chief Operating Officer of Jarrard Inc. “They’re even using it to help them better deliver bad medical news.” AI tools can also help translate complex medical jargon and concepts into messages that are easy for patients to understand. “Used with an empathy angle, AI can really help in patient engagement,” said Phillips.
  • Helping patients understand benefits and cost estimates. While health plan benefits can be challenging to interpret even for the most sophisticated healthcare consumers, AI tools can help patients understand the benefits unique to their plans. Similarly, these tools can help create simplified messages to clearly communicate cost estimates for services. Leveraging this technology can “empower patients to take advantage of benefits that are ‘no cost’ (to them), like preventative care,” to help them stay as healthy as possible, said Freedman.
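
To make the NLP idea above concrete, here is a minimal sketch of how a team might prototype key-point extraction from a visit transcript using an off-the-shelf summarization model. The model choice and sample transcript are illustrative assumptions, not a clinical-grade or vendor-specific implementation, and any real deployment would sit behind the guardrails discussed below.

```python
# Minimal sketch: distilling a patient-visit transcript into key points
# with an off-the-shelf summarization model. The model name and the
# sample transcript below are illustrative assumptions, not a
# recommendation for clinical use.
from transformers import pipeline

# Load a general-purpose summarization model (hypothetical choice).
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

transcript = (
    "Patient reports intermittent chest tightness after climbing stairs, "
    "worse over the past two weeks. Currently taking lisinopril 10 mg daily. "
    "Denies shortness of breath at rest. Family history of coronary artery "
    "disease. Clinician recommends a stress test and a follow-up visit in "
    "two weeks, and reviews warning signs that should prompt an ER visit."
)

# Condense the conversation into patient- and clinician-facing key points.
summary = summarizer(transcript, max_length=60, min_length=20, do_sample=False)
print(summary[0]["summary_text"])
```

In practice, a health system would add clinician review, de-identification, and formal evaluation before any AI-generated summary reaches a patient or the medical record.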

What’s Next?

For healthcare leaders who pursue AI tools to elevate the patient experience, three actions will be critical:

1. Cultivate patient trust with proactive reassurance and transparency. When healthcare organizations use AI tools with the appropriate guardrails and governance in place, they can confidently reassure patients that this AI use will only make their experience better.

It starts with reminding patients that AI is already in use in ways that people often don’t consider as AI (such as scheduling, appointment reminders, and medication management). “Assure them that AI doesn’t mean their doctor will be a robot next week,” said Phillips. “AI is a complementary mechanism (such as double-checking diagnoses and imaging reports, almost like a second opinion), and their human care team will still be primary. If you are up front about that, it can help ease consumers’ concerns.”

And organizations need to follow through with transparency about how and when they are using new AI tools. “Patients want to know that they’re getting a better experience and the highest quality of care and outcomes,” said Cervenak. “While we’re not seeing this so much in other industries, the very human nature of healthcare makes it critical for organizations to figure out how they will appropriately highlight their use of AI.”

Just as important as transparency will be communicating about that AI use in a way that patients can easily understand, without getting so far into the weeds that the message is muddied by opaque or confusing explanations.

2. Establish a disciplined study of AI performance and impact. Much as the healthcare industry already employs robust methodologies to study the effectiveness of new drugs and clinical interventions, it should apply the same discipline to studying AI applications. A true pre-AI baseline should be established, followed by objective measurement of performance and impact after implementation.

Such study will not only inform application, optimization, and investment decisions on the organization’s side but also help patients understand why the organization is using this technology and what safeguards are in place.
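
As a simplified illustration of what such a pre/post study could look like, the sketch below compares a hypothetical engagement metric, an appointment no-show rate, before and after an AI rollout using a standard two-proportion z-test. All figures are assumptions for illustration only.

```python
# Minimal sketch: comparing a hypothetical engagement metric (appointment
# no-show rate) before and after an AI rollout. All numbers are
# illustrative assumptions, not real benchmarks.
from math import sqrt
from statistics import NormalDist

def no_show_rate(no_shows: int, appointments: int) -> float:
    """Share of scheduled appointments that ended in a no-show."""
    return no_shows / appointments

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int):
    """Two-sided z-test for a change in rate between two periods."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical figures: pre-AI baseline of 1,200 no-shows out of 10,000
# appointments vs. 950 out of 10,000 after rollout.
z, p = two_proportion_z_test(1200, 10_000, 950, 10_000)
print(f"Baseline no-show rate: {no_show_rate(1200, 10_000):.1%}")
print(f"Post-AI no-show rate:  {no_show_rate(950, 10_000):.1%}")
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

The same structure, a defined baseline period, a defined post-implementation window, and an agreed-upon test, applies whether the metric is no-show rate, message response time, or care-plan adherence.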

“Consumers don’t want to be the one for whom the system fails, regardless of how good it is overall,” said Phillips. “People are fascinated by AI, but fear is involved as people don’t understand how AI functions and hear examples of wrong outputs.”

He noted the example of Tesla’s Autopilot feature failing to see the car in front of it and crashing into it. People become fearful, despite the fact that, in aggregate, driving with the Autopilot feature may result in fewer crashes per highway mile driven than unassisted human driving.

“Human failures abound, currently—whether in causing car accidents or in engaging patients,” said Cervenak. “Has your organization assessed its current state without AI intervention? How many ‘crashes’ are you having every day that you just don’t track? How many wrong diagnoses, incorrect responses, or (even worse) no responses at all are happening in your organization today?” Healthcare leaders need to know so they can measure their improvement.

3. Prepare a plan of action for when things veer off course. The potential risks of AI use span the field—from data privacy and security issues, to built-in or amplified bias, to missed diagnoses and errors. That’s why it’s critical for each organization to establish a process for refining algorithms and associated data to keep them up to date, and to cultivate AI-specific guidelines.

“These defined guidelines will need to cover AI use, transparency, and communication,” said Kiesau. “Organizations will need a process for how they respond when instances of AI use slam into those guardrails. They need to know how they will review, advance, and revise guidelines—and how they will communicate when something changes or goes wrong. It’s a complicated planning exercise that every health system needs to go through.”

“Patients are unlikely to care much about how an organization is using AI when things are going well,” said Freedman. “But they will care very much the moment things seem amiss or the AI just doesn’t work seamlessly, the way it should.”

Understanding that things will not always go as intended, and having a process in place to identify and respond, will be important when those situations inevitably arise. Organizations need to explicitly consider the impact of possible failures and be prepared with the appropriate response—including explaining what happened and taking responsibility.

AI tools hold tremendous opportunity to elevate the patient experience. But doing so requires a focus on empathy and integrated human oversight, defined AI guidelines, process transparency, and clear communications about that AI use. Healthcare organizations that can bring these critical elements together will be able to realize meaningful benefits for their organizations and patients alike.


ABOUT CHARTIS

Chartis is a comprehensive healthcare advisory firm dedicated to helping clients build a healthier world. We work across the healthcare continuum with more than 600 clients annually, including providers, payers, health services organizations, technology and retail companies, and investors. Through times of change, challenge, and opportunity, we advise the industry on how to navigate disruption, pursue growth, achieve financial sustainability, unleash technology, improve care models and operations, enhance clinical quality and safety, and advance health equity. The teams we convene bring deep industry expertise and industry-leading innovation, enabling clients to achieve transformational results and create positive societal impact. Learn more.

Want more fresh perspectives to help you think about, plan, and execute strategies for what’s next in healthcare? Subscribe to our latest thinking and check out our weekly blog, Chartis Top Reads.

Daniel Coulton Shaw

I represent a handpicked collection of top private medical facilities offering some of the most successful treatments worldwide. If you think I can help you, send me a message. I’d be happy to help.

10 months

Great article! With your permission, I'd love to cover some of your key points at https://www.drarti.ai/ in the next edition - I'll cite your article, of course.

GerriAnne H.

Strategic Operations Wizard | Program Management | Project Management | Grant Writing | Change Agent Extraordinaire | Making Impact that Sticks

1 year

This topic has been on my mind as of late.... Can it elevate the patient experience, and should it? An article written in June by Ryan Levi and Dan Gorenstein, titled "AI in medicine needs to be carefully deployed to counter bias – and not entrench it," speaks to this. We have created the biased data sets, and now the algorithms, that contribute to existing health disparities for specific populations. One of the biggest hurdles in healthcare is collecting accurate data based on race, ethnicity, gender, age, linguistics, or other demographic factors. Levi and Gorenstein rightfully pointed out in their article that "These powerful new tools can perpetuate long-standing racial inequities in how care is delivered." The large data sets we have curated over the years have issues and have already created differences in how we treat patients. For example, a 2019 landmark study published in Science showed that algorithms used to predict healthcare needs were biased against Black patients. Another article I came across noted that, in reviewing clinical vignettes, AI got the diagnosis right 73% of the time. I didn't calculate the error rate.

Thushara Urumbil

Senior Analytics Engineering Leader @ Huron | Health Insights & Digital Products

1 year

I read it with "patient trust" in capital letters.

CHESTER SWANSON SR.

Next Trend Realty LLC./wwwHar.com/Chester-Swanson/agent_cbswan

1 year

Thank you for Sharing.
