Artificial intelligence is a great tool, but no replacement for human intelligence
Brian Murphy
I enhance and elevate careers of mid-revenue cycle healthcare professionals. Published author, podcast host. Former ACDIS Director.
Is artificial intelligence (AI) coming for your jobs, CDI and coding professionals?
The answer is no. Not in any foreseeable future, at least.
AI has been billed by some as a panacea. It’s not. Smart, clinically minded CDI and coding professionals with critical thinking skills are still needed to ensure full and complete documentation and the accurate reporting of medical codes that reflect hospital quality and support accurate reimbursement.
That’s not to say AI-powered tools like prioritization, computer-assisted coding, and computer-assisted physician documentation are worthless. Far from it. These tools offer real value.
“It gives you that big picture. It allows you to go in and give you your set diagnoses, tell you where they’re located, and allow you to say, ‘Oh yeah, I do agree this is met here, let’s go ahead and put it in the encoder,’” says Sandra Love, BSN, RN, CCDS, CCDS-O, CPC, senior manager of CDI for Norwood, of AI-powered tools. “You don’t always need to read straight from the medical record.”
But more often than not, AI-powered support tools only land a CDI or coding professional in the broad ballpark. While we don’t have reliable, independently verified statistics, some users report only about a 20% accuracy rate when the tool queries for a specific diagnosis.
AI is a pattern-detection tool that augments narrow bands of tasks typically performed by humans. AI cannot reason.* It draws inferences from data and takes actions and/or offers suggestions based on that data.
Given poor data, missing context, or no data at all, AI makes mistakes.
AI gets better over time and with use (i.e., machine learning). But in an era of copy and paste, outdated problem lists, and harried physicians entering non-specific diagnoses, the data these machines draw upon for inferences is often unreliable, which leads to false positives or to de-prioritization of cases that have opportunity.
New variables are not AI’s friend. Deep pools of clean, reliable data are, and AI does not always have access to them.
Better to think of AI products as tools, not fire-and-forget machines or robots you wind up and turn loose. They require oversight, along with human auditing and validation of results.
Following are two tips for using these tools effectively with your (human) CDI team.
Tip 1: Reduce auto-suggested diagnoses by validating keywords
AI can elevate diagnoses for review or auto-assignment but miss the context of the surrounding language, resulting in false positives. Terms AI can fail to recognize include “resolved,” “ruled out,” and “no change.” AI might read the diagnosis that follows or precedes these terms, but not the modifier, resulting in an unwarranted auto-query. A simple context check, sketched below, illustrates the idea.
“We want to make sure we reduce unnecessary autosuggested query opportunities,” Love says. “When we are seeing medical diagnoses, we want to make sure they are chosen and interpreted appropriately. And when the AI asks us for a specific diagnosis, the technology should be able to query for POA status—or we (CDI) need to do it.”
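To make this concrete, here is a minimal, hypothetical sketch (in Python) of the kind of keyword-context check a CDI team might run when auditing auto-suggested diagnoses. The term list, the window size, and the function name are illustrative assumptions, not part of any vendor’s product.

```python
import re

# Modifiers that should suppress an auto-suggested query when they appear
# near the diagnosis term (from the article: "resolved," "ruled out," "no change").
# The list is illustrative and would be extended by the CDI team.
SUPPRESSING_TERMS = ["resolved", "ruled out", "no change"]

def should_suppress(note_text, diagnosis, window=6):
    """Return True if a suppressing modifier appears within `window` words
    before or after the diagnosis term, suggesting a likely false positive."""
    words = re.findall(r"[a-z']+", note_text.lower())
    dx_words = diagnosis.lower().split()
    for i in range(len(words) - len(dx_words) + 1):
        if words[i:i + len(dx_words)] == dx_words:
            start = max(0, i - window)
            end = min(len(words), i + len(dx_words) + window)
            context = " ".join(words[start:end])
            if any(term in context for term in SUPPRESSING_TERMS):
                return True
    return False

# Example: the modifier "ruled out" follows the diagnosis, so the
# auto-suggested sepsis query would be flagged for human review.
note = "Sepsis ruled out; patient afebrile with normal lactate."
print(should_suppress(note, "sepsis"))  # True
```

A check like this does not replace a human read of the chart; it simply routes questionable auto-suggestions to a CDI specialist instead of letting them fire as queries.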
Tip 2: Understand and review for opportunities bypassed by AI
AI is not always programmed to review certain areas of the medical record, and it may deprioritize particular diagnoses that are perceived as less meaningful or of lesser impact, such as chronic conditions.
“If you’ve got someone with congestive heart failure, and they’re coming in for a bacterial pneumonia, and your congestive heart failure is not considered for moderate to high prioritization, to me that puts you at risk, especially if the patient goes into respiratory failure due to fluid overload and might need BiPAP,” Love says. “To me, that’s very important to have those chronic conditions in there. People look at those as minor issues, but they are very important; it places the patient at a higher risk of mortality.”
Some AI does not review anesthesia notes (pre- and post-operative) or pathology results, fails to pick up surgical procedures (cardiac catheterizations, cardioversions, and colonoscopies), and does not recognize scanned documents (paramedic notes, code blue notes) or nursing and clinician telephone notes. These review gaps lead to missed CDI and coding opportunities.
“I use nursing notes all the time when I’m looking at query opportunities,” Love says. “I’ll see what a physician wrote, and then I’ll go back and look at the nursing notes, and sometimes I’ll see, ‘oh my gosh, the nurse stated a diagnosis,’ and I’ll include that information in the query.”
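For teams that can see which note types their AI tool actually reviewed on a case, a simple audit can surface these coverage gaps. The sketch below is hypothetical: the note-type names and the reviewed-types input are assumptions about how a team might log its tool’s coverage, not a description of any specific product.

```python
# Sources the article flags as commonly bypassed by AI.
HIGH_VALUE_SOURCES = {
    "anesthesia note",
    "pathology result",
    "nursing note",
    "telephone note",
    "scanned document",   # e.g., paramedic notes, code blue notes
}

def coverage_gaps(ai_reviewed_types):
    """Return the high-value note types the AI did not review on this case,
    so a human can check them for missed query opportunities."""
    reviewed = {t.lower() for t in ai_reviewed_types}
    return HIGH_VALUE_SOURCES - reviewed

# Example: the tool only looked at the H&P, ED report, and progress notes.
reviewed = {"H&P", "ED report", "progress note"}
for gap in sorted(coverage_gaps(reviewed)):
    print(f"Manually review: {gap}")
```

Even a crude checklist like this makes the tool’s blind spots explicit, so the human review time goes where the AI did not.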
Case examples
Let’s bring this all home with a couple of case examples that demonstrate AI’s successful use, as well as its ultimate limitations and failures.
Example 1: AI finding of acute respiratory failure
Information from the autosuggested query for acute respiratory failure includes the following:
ED Report:
Information the CDI found to support the diagnosis:
Nurse’s Note: Respiratory status continues to decline since yesterday. Patient was placed directly on BiPAP since O2 sats 88% and using extreme accessory muscles.
Verdict
In this example we agree with the findings and the AI suggestion for CDI to query for acute respiratory failure. However, even in this case the AI only flagged the BiPAP and the pCO2; it did not read the nursing notes, which provided important clinical support for the respiratory failure diagnosis. It’s important to include this information as validation and to strengthen the case in the event of an audit.
Example 2: AI finding of pneumonia
Information from the autosuggested query for pneumonia includes the following:
H&P:
Patient presents with nasal congestion, rhinorrhea, and shortness of breath. Patient appears to have wheezing associated with a viral lower respiratory tract infection; since there has been no fever, pneumonia is lower on the differential.
Assessment and Plan:
Verdict
We disagree with the AI suggestion; the pneumonia query is inaccurate. The patient did not have pneumonia: it was low on the differential, and upon CDI review the CXR showed no opacities and no antibiotics were given. The patient is 72 years old and had been admitted previously with a lower respiratory infection and associated wheezing. Non-smoker, treated for asthma.
These two cases (real AI examples) demonstrate why your critical thinking skills as a CDI specialist are still needed today. They also demonstrate the importance of auditing your technology for compliant use.
* AI is often confused with artificial general intelligence (AGI). AGI refers to machines that are as intelligent as humans and can perform the same intellectual tasks with the same or superior results. AGI is not currently on the horizon. If we ever develop and apply AGI, CDI and coding jobs would be in jeopardy, but fully deployed AGI would transform the entire world economy, including clinical medicine. Just know that today, your skills are needed more than ever.
About Sandra Love
Sandra Love, BSN, RN, CCDS, CCDS-O, CPC, is senior manager of CDI for Norwood. Sandra has extensive experience in CDI, including but not limited to pediatrics and outpatient. Contact her at [email protected].
About the author
Brian Murphy is the founder and former director of the Association of Clinical Documentation Integrity Specialists (2007-2022). In his current role as Branding Director of Norwood he enhances and elevates careers of mid-revenue cycle healthcare professionals. Comment on this story here.