When Person-Centered Practices and Evidence-Based Claims Collide
Over the past several decades, calls for evidence-based practice (EBP) in psychotherapy have grown steadily louder, and this has inevitably led to a kind of sorting: models that have not been quantitatively validated are consigned to the historical dustbin of shame, while those backed by research data and influential partners are ushered into managed care. What we are seeing nationwide, and even on a global scale, are turf wars vying for a share of the market.
EBPs are important, but they are useful only when matched to practitioners who gain scope or depth of practice from them, particularly when a therapist's developmental level indicates the need for a limited focus, clear parameters for practice, and a semi-scripted methodology. I have concerns, however, about a culture change in the field marked by increasingly blind assumptions of research validity and expanding regulation tied to EBP.
A number of evidence-based practice claims in the marketplace remain insufficiently tested, and too many of these remain insufficiently challenged. I do not intend this as a wholesale critique of EBP research design, nor of EBP-utilizing therapists, but as a critique of the widely held assumption that therapists trained and certified in particular EBPs are implementing those methodologies in practice with a level of fidelity comparable to that of the therapists who participated in the original research studies.
The reality is that if therapists implement an EBP's methodology but not to a satisfactory level of fidelity, their practice is not evidence-based; yet very broad allowances are being made in how EBPs are coded within managed care. This is a response to a growing appetite for "value-based" health care economies: systems of care that better reward "quality of care" on the basis of quantitative measures. In primary care, quantitative outcome measures such as standardized population health metrics are abundant. In behavioral health, where standardized outcome measures are harder to come by, one option within the "value-based" milieu is to measure the level of EBP implementation by how many sessions demonstrate fidelity to, and are coded with, a particular EBP. I fear that increasingly strict regulatory and contract requirements for levels of EBP implementation may result in a net reduction in the depth and quality of psychotherapy practice rather than an increase in fidelity to effective psychotherapy intervention.
Many EBPs rigidly structure for therapists, and thereby for clients, systems of levers to pull should the client's esteem tip this way or the client's fears tip that way. In my experience, evidence-based practice cadres often take little interest in the personal agency of the client: in the client's capacity to choose, and in the innate strengths and resilience that can emerge given the right kind of supportive conditions. While the spirit and principled mindset of evidence-based practice is appropriately postured to mitigate potentially negligent and dangerous practices, far more often than is acknowledged, EBP implementation takes the form of naive acceptance of poorly tested interventions and, in effect, may not ultimately ensure better therapy.
Donald Berwick, a Harvard-based quality-improvement expert, himself noted for employing evidence-based methods in medicine, wrote in 2005 that we had “overshot the mark” and turned evidence-based practice into an “intellectual hegemony that can cost us dearly if we do not take stock and modify it” (p. 315). In 2009, advocating for “patient-centered care,” he declared that “evidence-based medicine sometimes must take a back seat” (p. 561). His sentiment applies to the field of psychotherapy as well, in my view.
A common critique by EBP skeptics, in light of researchers' claims of tightly controlled studies, goes: "If your effect is so fragile that it can only be reproduced under strictly controlled conditions, why do you think it can be reproduced consistently by practitioners operating without such active monitoring or controls?" If fidelity to a manualized modality cannot be ensured beyond the randomized controlled trials that stamped it "evidence-based," how do we know, in the marketplace, that it remains so? These, in my view, are valid concerns. Reminiscent of Hans Christian Andersen's brief, illuminating tale of the two weavers who promise an emperor a new suit of clothes they claim is invisible, many EBPs may be wearing no clothes.
Research findings based on the application of treatment manuals have led to endorsements of treatment brands on the assumption that the brands are practiced in a manner consistent with the research treatment manuals. Very often, they are not. In effect, endorsing a brand-name treatment is a shortcut to defining de facto clinical practice guidelines and gaining a market monopoly.
The American Psychological Association unveiled a policy in 2005 recognizing that research findings alone are insufficient for practicing from an evidence base. The policy characterizes evidence-based psychological practice (EBPP) as integrating evidence about treatment with expert opinion and an appreciation of client characteristics. These three components, evidence for treatment, expert opinion, and patient characteristics, are essential to writing clinical practice guidelines and thereby enhancing the delivery of evidence-based treatments (APA Task Force on Evidence-Based Practice, 2006).
The APA policy stated, "A central goal of EBPP is to maximize patient choice among effective alternative interventions" (p. 284). Many practices claiming to work from an "evidence base" in fact minimize client choice.
There is no wholesale dismissal of evidence here, only of the error of blindly accepting a widely criticized and underperforming field of EBP research that has oversold the merits of many findings to a nonscientific public.
Robert McNamara was the U.S. Secretary of Defense from 1961 to 1968. McNamara saw the world in numbers. He spearheaded a paradigm shift in strategy at the Defense Department, implementing large-scale metric tracking and reporting that he contended would help minimize individual bias among department experts. A core metric he used to inform strategy and evaluate progress was body-count data. “Things you can count, you ought to count,” argued McNamara. His focus created a problem, however, because many important variables could not be counted, and so he largely ignored them. This thinking led to wrongheaded decisions by the U.S. and contributed to the eventual withdrawal from the Vietnam conflict.
Daddis (2009) observed, "While McNamara contended that factual data had not supplanted judgment based on military experience or intuition, senior uniformed officials perceived their expertise being minimized as systems analysis took hold within DoD" (p. 56). Social scientist Daniel Yankelovich (1972) coined the term "McNamara fallacy," pointing out the human tendency to undervalue what cannot be measured and warning of the dangers of removing the measurably quantitative from the complexity of its qualitative context.
Sociological researcher William Bruce Cameron (1963) declared, "Not everything that counts can be counted, and not everything that can be counted counts" (p. 13). Clinicians, agencies, and entire systems of mental health care are beginning to identify themselves with particular EBP brands and to hold increasingly rigid methodological expectations that drive skillful, humanistic practitioners to the fringes. This, in my view, is a problem. Yet this is also a time bursting with opportunity for person-centered practitioners to communicate the value of their approach to the public, to the profession, and to managed care. Person-centered practice is not necessarily at odds with evidence-based practice; in fact, as EBP researchers advance qualitative design and as we transition into new, progressive, value-based health care economies, I see their convergence as not only possible but ideal.
The preceding is an adapted excerpt from my chapter, "The Empathor's New Clothes: When Person-Centered Practices and Evidence-Based Claims Collide," in the book Re-visioning Person-Centred Therapy (2018, Routledge). Reprinted with permission.
The book is available to order at Amazon.com.
Blake Griffin Edwards is a licensed marriage and family therapist, clinical fellow in the American Association for Marriage and Family Therapy, behavioral services director, and statewide behavioral health leader for the American Academy of Pediatrics in Washington state whose writing has been featured by the American Academy of Psychotherapists, the Association for Family Therapy and Systemic Practice in the UK, the Association for Humanistic Psychology in Great Britain, and the American Association for Marriage and Family Therapy.
References
APA Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271–285. doi:10.1037/0003-066X.61.4.271
Berwick, D. M. (2005). Broadening the view of evidence-based medicine. Quality and Safety in Health Care, 14, 315–316.
Berwick, D. M. (2009). What 'patient-centered' should mean: Confessions of an extremist. Health Affairs, 28(4), 555–565.
Cameron, W. B. (1963). Informal sociology: A casual introduction to sociological thinking. New York: Random House.
Daddis, G. A. (2009). No sure victory: Measuring U.S. Army effectiveness and progress in the Vietnam War (Unpublished doctoral dissertation). University of North Carolina-Chapel Hill, Chapel Hill, North Carolina.
Yankelovich, D. (1972). Corporate priorities: A continuing study of the new demands on business. Stamford, CT: D. Yankelovich Inc.
Comments
Counsellor, Play Therapist and Family Therapist in the Voluntary Sector, the NHS and in Private Practice
5y: This feeds my soul; how powerful and significant. Thank you.
Private Practice at Solutions Therapy
6y: Don't forget that every meta-analysis of head-to-head comparisons between models of therapy reveals no significant differences in effect size. When controlled for allegiance factors, all bona fide models are equal. The real differences are between the therapists who get better results and those who do not.
Technology, Data, & Analytics Leadership | Ethical & Trustworthy AI | Quality Improvement | Policy & Evaluation | Strategic Consultation
6y: I really appreciate the balanced view you provide, acknowledging and validating the co-existing values of being person-centered and of knowing that what we are doing as an intervention works. Both are important, and it can be challenging to find real-world evidence of interventions' impacts, as you express so well. I'll take the side of deferring to more person-centered care, although we also need to be more creative and intentional about evaluating the efficacy of our work. Part of that is also in what we measure as outcomes; all too often it's only symptom reduction (which is what the EBPs are generally studied against). We need person-centered outcomes as well! With the rise of new technology, we can also better leverage qualitative data and other data sources to investigate impact beyond our traditional outcomes.
Owner Principal at New Insights Counseling
6y: I have yet to see any "evidence" in the research that shows long-term efficacy for any EBP. Rarely have I seen follow-ups with study participants 2, 5, or 10 years later to validate the short-term benefits of an EBP. Nor have I seen EBPs prove long-term efficacy across populations with regard to the ethnic, cultural, or socioeconomic status of the patient. Most of what I have seen is either academia promoting EBPs that were tested in very limited studies that are difficult to apply in the real world, or social service agencies trying to validate their worth to funders using EBPs while unable to do long-term efficacy follow-up because of the populations they serve and the limited funds available for it.
Founder and CEO of the DBT Institute of Michigan and DBT-Linehan Board of Certification Certified Clinician
6y: Hate to say this, but more and more managed care organizations and private payers are requiring providers to deliver evidence-based treatments and are moving to performance-based care. The days of doing what "feels right" with patients are coming to an end. Patients and their loved ones are also looking for providers who are science-driven in their approach and are becoming more aware of paying for treatments that are evidence-based versus not. I believe the pushback against evidence-based treatments by some providers might be due to not wanting to be held to a specific standard when it comes to performance, outcomes, and expectations. You might have to accept that the field is becoming more science-driven whether you agree with it or not. Your organization's survival might depend on getting on board with it.