More empathetic: Doctors or ChatGPT?

Generative artificial intelligence (AI) continues to make impressive strides in medicine.

In March, Google’s Med-PaLM 2 shocked the profession by scoring an “expert-level” 86.5% on U.S. medical licensing exam questions, a 20-point jump over AI’s previous best. Then, in July, a study found that ChatGPT writes clinical notes so well that independent reviewers could no longer distinguish the AI’s notes from those written by humans.

Many of the skeptics who panned AI earlier in the year, arguing that large language models could never replace most of what writers, educators or doctors do, have since changed their outlook on the technology’s potential.

As generative AI systems continue getting stronger and smarter (pulling from an ever-larger corpus of knowledge), people increasingly recognize that ChatGPT can match our cognitive abilities. What’s now uncertain is whether there’s anything left about our basic humanity that generative AI cannot emulate and even improve upon.

In healthcare, for example, clinicians insist that chatbots will never match their levels of compassion, empathy or trustworthiness. Medical professionals view these interpersonal skills as distinctly human, foundational to the doctor-patient relationship. Patients value these personal connections, as well. According to one survey, patients ranked “compassion as more important than cost” when rating physicians.

But new research indicates machines are rapidly gaining ground in these areas, too.

AI now boasts strong EQ

At the University of Texas at Austin, behavioral therapy treatments were failing to help patients who abuse alcohol.

So, the chair of internal medicine asked a team to write a script that clinicians could use to speak more compassionately and better engage with patients. A week later, no one had taken the assignment seriously, so the department head asked ChatGPT to do the job. It complied, masterfully.

Not only was the letter excellently written—sincere, considerate, even touching—but it was also devoid of “doctor speak,” which frequently gets in the way of patients adhering to treatment plans. Social workers at the university then asked the generative AI app to rewrite the communication at a fifth-grade reading level, and then to translate it into Spanish. The result was greater clarity and appropriateness in both languages.
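
For anyone curious how such a workflow could be reproduced outside the chat window, the two steps (simplify, then translate) map onto a couple of API calls. The sketch below is a minimal, hypothetical illustration using OpenAI’s Python SDK; the model name, prompt wording and file name are assumptions for illustration, not the university team’s actual setup (they worked in the ChatGPT interface directly).

```python
# Minimal sketch of the simplify-then-translate workflow described above.
# Hypothetical: model name, prompts and file name are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def rewrite(text: str, instruction: str) -> str:
    """Ask the model to transform `text` according to `instruction`."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model; substitute whichever is available
        messages=[
            {"role": "system",
             "content": "You rewrite patient communications clearly and warmly."},
            {"role": "user", "content": f"{instruction}\n\n{text}"},
        ],
    )
    return response.choices[0].message.content

draft = open("engagement_letter.txt").read()  # hypothetical input file

# Step 1: remove the "doctor speak" and target a fifth-grade reading level.
plain = rewrite(draft, "Rewrite this at a fifth-grade reading level, "
                       "in a sincere, considerate tone, with no medical jargon.")

# Step 2: translate the simplified version into Spanish at the same level.
spanish = rewrite(plain, "Translate this into Spanish, preserving the "
                         "reading level and tone.")

print(plain)
print("---")
print(spanish)
```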

Other clinicians who’ve used chatbots to script more empathetic remarks for patients found themselves equally impressed.

In a recent review, one doctor told The New York Times that the results of using ChatGPT “blew me away.” Other clinicians added, “I wish I would have had this when I was in training” and “you’d be crazy not to give it a try.”

How doctors learn (and unlearn) empathy

Emotional responses like empathy and compassion have long been considered biological. In support of that theory, scientific evidence demonstrates that these traits are inborn, although they can be fostered and expanded over time.

Kindness, sympathy and the desire to care for those in need are precisely the kinds of heartfelt traits that draw people into medical careers. In fact, when medical school applicants are asked, “Why do you want to become a doctor?” the most common responses include:

  • To help people
  • To make connections with others
  • To improve lives
  • To help the underserved

Most doctors pursue medicine for kindhearted reasons. But by the time they finish medical school and residency, they emerge with a different set of priorities.

In 2021, I published a book about the unseen and unspoken forces that shape doctors. That book, “Uncaring: How the Culture of Medicine Kills Doctors & Patients,” explains how medical culture erodes compassion and empathy over a decade of clinical training, fundamentally reshaping the attitudes, beliefs and behaviors of once-idealistic medical students.

Through careful observation of their professors and attending physicians, young doctors learn which emotions and behaviors are rewarded and which are dismissed as unimportant.

For example, a resident will rarely (if ever) witness an attending physician take time to learn non-clinical details about a patient’s life or connect with concerned family members about anything medically irrelevant. Trainees come to view these interpersonal activities as a waste of time when compared to reading textbooks and mastering technical skills. After a decade of disuse, their “softer skills” atrophy.

The reality of medical practice

We know that physicians value the doctor-patient bond. However, the realities of healthcare today make it difficult to invest time in that relationship.

The practice of medicine for most physicians resembles running on a care-delivery treadmill—one that spins ever-faster with each passing year. As economic pressures grow, physicians are forced to see more and more patients each day just to maintain their income.

That is why, on average, physicians spend only 17.5 minutes with each patient. And, given the pressure to move quickly, physicians interrupt patients after just 11 seconds to eliminate “wasted time.” Of course, doctors don’t rush their exams or hijack conversations with the intent to be rude. They truly care about people. They’re just busy. And they’ve learned that taking control allows them to complete the visit more efficiently.

But these rapid-fire exchanges can leave patients feeling uncared for. In fact, nearly three-quarters of patients surveyed reported having seen a doctor who failed to be compassionate. A similar percentage said they always or often felt rushed by physicians.

How tech bests humans emotionally

While the healthcare industry has been grappling with anecdotal reports of ChatGPT’s superior soft skills, a recent study published in JAMA Internal Medicine provides hard evidence.

Researchers compared doctor and AI responses to nearly 200 medical questions submitted by patients via social media. The answers were read by a team of healthcare professionals who didn’t know whether the author was a doctor or a bot.

The team judged 80% of the AI-generated responses to be more nuanced, accurate and detailed than those shared by physicians. But most surprising was ChatGPT’s bedside manner. According to a write-up in U.S. News, “While less than 5% of doctor responses were judged to be ‘empathetic’ or ‘very empathetic,’ that figure shot up to 45% for answers provided by AI.”

ChatGPT is far from perfect. Current versions are tied to medical data published before September 2021. And, on occasion, AI will hallucinate, providing seemingly expert answers that are dead wrong along with references that don’t exist. Clearly, current versions of generative AI aren’t “ready for prime time” when it comes to diagnosing, treating or caring for patients.

But these large language models are vastly better at “learning” than any AI that has come before. Thus, anything that can be taught—such as demonstrating compassion—can be learned and mastered by generative AI. As they become faster, smarter and more powerful, they will become not only more accurate, but also more empathetic.

Today, most patients (60%) are uncomfortable relying on technology over doctors for medical care. Given the choice, they’ll consistently pick a physician over AI.

But our nation is facing a worsening physician shortage at the same time it’s experiencing an AI revolution. It now takes 31 days on average to be seen by an OB-GYN and 35 days for a dermatologist.

I predict that when people struggle to access timely medical care, they’ll turn to ChatGPT for help. When the answers they get are accurate and compassionate, they’ll turn to AI again the next time they need medical expertise. Over time, people will care less (or not at all) whether the assistance and advice come from a carbon-based life form or a silicon chip.

Already, a growing number of doctors are comfortable using generative AI to assist with everyday healthcare tasks—from writing letters to insurers and transcribing notes to double-checking diagnoses and populating medical records.

But if doctors can’t demonstrate empathy, sympathy and respect in ways that foster patient trust, generative AI will fill that gap. Once this process begins, humans will play an ever-smaller role in the provision of medical care.

Dion Lampe

Chief Marketing Officer

1y

Fascinating topic! The intersection of AI and empathy raises thought-provoking questions about the future of healthcare. While AI shows promise, there are still critical nuances of human connection and healthcare that remain irreplaceable.

Jacqueline (Jacki) Bishop

Physician Liaison | Strategic Leader in Community Outreach & Healthcare Product Launches | Over a Decade of Healthcare Industry Excellence | Driving Patient Care Outcomes & Market Growth

1y

On a personal note, ChatGPT has been instrumental in communicating effectively with an ex-husband, so I understand the benefit. For a medical professional lacking social-emotional support or mindfulness, or one attempting to communicate with a patient while sleep-deprived or overburdened with 20 other issues occurring at the same time, this technology is beneficial.

Jeff D Kopelman MD, MS

Otolaryngology Consultation Service Mount Sinai South Nassau

1y

When we start relying on computers to replace human emotions that we have become inept at performing, it’s not a cause for celebration. It’s an indictment of the medical training process. When we no longer have time to be empathetic and pass it off to a computer like just another lab test, there is something radically wrong. If your partner said they preferred to get their affection from a computer instead of you, it might arouse feelings of inadequacy. But as a healthcare provider, when patients begin asking for the computer to console them instead of you, why would this be acceptable? “Oh, the computer is now taking over responsibility for providing empathy, so it frees me up for more important things.” What’s more important than connecting with other people in a time of need? Is that where medicine is headed? Patients are human, not computers. When they would rather call for the computer with its algorithms to console them, why aren’t we outraged and trying to enhance emotional intelligence as much as artificial intelligence?

Kloe Korby

Founder & Podcast Media Producer at United Senior Association USA

1y

This isn’t even about doctors or healthcare; it’s about something even more prehistoric, tied to the foundation and eventual conception of Darwinism: survival of the fittest. Right now the war is the mass consumption of technology, and it’s as basic as it sounds. When we were playing with nuclear weapons and wanted to show off, we did something that will haunt history forever; now it’s about AI, which is almost as scary, because from these posts and the others I am reading, it has already influenced a cultural shift and evolutionary paradigm. It’s disruptive innovation, but this margin beats out the competition, and not the way Amazon did it. After years of speculation, this is an android task force with hardware connected to work at such an impossible level that it has already determined the market. So this discussion is irrelevant, because you’re all talking about issues that are already solved, and it’s not in our control anymore.
