
TIPS & TOPICS
Volume 5, No. 6
October 2007

In this issue
— SAVVY and STUMP THE SHRINK
— SATIRE
— SOUL
— SHAMELESS SELLING
— Until Next Time

Thanks for joining us for the October edition of TIPS and TOPICS (TNT). Welcome to all the new readers.

SAVVY and STUMP THE SHRINK

In the July/August edition of TNT, the focus was on Evidence-Based Practice (EBP). A number of readers sent me comments on that issue. At the risk of boring you with more discussion about EBPs, we’ll share comments from two readers. The first response I am calling a “Stump the Shrink” because it got me thinking about how best to explain my views on EBPs and their relationship to the therapeutic alliance.

With Dr. Brian Miller’s permission, here are his comments, with my response following each section of his full message:

“Dr. Mee-Lee:

Comment:
I hope you can stand just a few more thoughts about evidence-based practices. I had a strong reaction to your July/August Tips, both positive and negative, and wanted to reply:

I am hopeful that the paradigm wars between the orthodoxy of the “nothing but EBPs” advocates and the “nothing but the relationship matters” advocates are reaching their acme.

My Response:
I re-read the July/August TIPS and TOPICS to see what it was you read that made it seem that Scott Miller and I are advocating “nothing but the relationship matters”. I did not see where we said that. In fact, Scott said: “For the record, our team has never said that the therapeutic relationship is the only important variable in psychological treatments. It is, however, the most evidence-based finding in the literature, with over 1000 findings published to date. Study after study shows that it contributes 4-8 times more to treatment outcome than treatment approach. As such, its absence in most professional discourse and the EBP movement is nothing short of stunning.” If you read his section again and the SKILLS section, I was explicit that EBPs have a role to play. Our concern is that research, training funds and fidelity efforts have been centered on what contributes less to the outcome (EBPs), with far fewer resources spent on helping clinicians, programs and systems focus on measuring the quality of engagement and tracking real-time outcomes.

Comment:
I feel that a new dialectic is becoming clear that synthesizes the sides into what, I hope, will be a new paradigm. This new paradigm distinguishes between evidence-based practice and the evidence-based practices.

My Response:
I agree and hope you are right. Scott referred to the new APA definition of EBP, which takes this in the right direction.

Comment:
In your and Miller’s deconstruction of the evidence-based practices, you create the straw man argument in the form of the practitioner who is rigidly adhering to a treatment manual without any consideration of relationship or client factors.

My Response:
I didn’t see where either Scott or I created such a straw man, but maybe I missed something. It would help if you could paste in where we said anything that sounded like practitioners are “rigidly adhering to a treatment manual without any consideration of relationship or client factors.” Perhaps you are referring to the comments of Norman G. Hoffmann, Ph.D., in TNT. But again, on re-reading his section, I still didn’t see where any of us created such a straw man. I remain open to your feedback on any sections, if you wouldn’t mind including those in a response back.

Comment:
Such practitioners may exist, but no one I know is arguing that what they are doing is “evidence-based practice.” Evidence-based practice (again, distinguished from the evidence-based practices) is best defined by Eamon Armstrong in the context of evidence-based medicine: “Evidence-based medicine is a process of problem-based learning and informational mastery that enables physicians to keep up to date while improving their clinical behavior and patient outcomes.”

My Response:
I am comfortable with this definition and the APA one. In fact, “improving their clinical behavior and patient outcomes” is precisely what Scott Miller and his associates, as well as I, are advocating. Scott Miller and Barry Duncan’s Session Rating Scale (SRS) focuses on engagement and the quality of the alliance, so that the clinician can change the style, method and fit of their treatment approach to better meet the desires and direction of the client. The Outcome Rating Scale (ORS) specifically measures patient outcomes in real time, to improve outcomes through a change in approach and treatment plan negotiated with the client. This is akin to the management of hypertension, where the outcome is measured in real time and the clinical behavior changes to adjust treatment to be more effective, acceptable and engaging. You can see these instruments on www.talkingcure.com. I have now seen a number of programs using these instruments and am impressed by how client- and clinician-friendly they are, and how they help clinicians focus on outcomes, engagement and individualized treatment. I hope Utah will consider the ORS and SRS rather than, as I understand it, Mike Lambert’s OQ 45 and the Clinical Support Tools, which from my observation are not as clinician- and client-friendly. By the way, I have no financial stake in the ORS or SRS or the ASIST software. If you know of programs that are using real-time feedback tools to track outcome and alliance that are as client- and clinician-friendly, efficient, effective and time-feasible, please give me contact information. I would like to contact them and learn what they are doing. I would be happy to connect you with both mental health and addiction programs that are using ASAM multidimensional assessment in conjunction with ORS/SRS, or ORS/SRS alone.

Comment:
Nothing in that definition suggests rigid adherence to a manual or fidelity measures. Rather, the emphasis in evidence-based practice is upon the scientific method, wherein everything is open to question as the evidence appears. Who can argue against the emphasis upon problem-based learning (another way of describing attention to client-level data) and informational mastery? Thus, the new paradigm is not one of rigid adherence to anything, but emphasizes objective evidence and the scientific method over traditional views and theory, and places the emphasis on what works rather than on defending what we have always done. Viewed this way, evidence-based practice is flexible and open, but non-evidence-based practice is rigid adherence to what we already think we know, including the belief that only the treatment relationship matters.

My Response:
I agree with you on this paragraph. Again, neither Scott nor I believe that “only the treatment relationship matters”. We are concerned about the lopsided emphasis placed on the EBPs, as you say, and about the stunning neglect of helping clinicians measure and manage the therapeutic relationship and patient outcomes. A doctor manages and changes the treatment in real time based on measurement of outcome. More training and resources need to be focused on helping clinicians change their behavior based on real-time, during-treatment feedback on the quality of the alliance, engagement and patient outcomes.

Comment:
What is missing in both sides’ arguments is that, although it is true that no one model has shown clear superiority in clinical trials against any or all other models, there is good evidence that there is a difference between high-quality and low-quality treatment. Miller’s MATCH study, often used to argue against the evidence-based practices, also made clear that it is well-implemented treatment that is effective. What is the basic stuff of “well-implemented” treatment? When “fidelity measures” are nothing more than arbitrary rules that define the “religion” of a particular model, they are meaningless and may be counterproductive. When, however, the fidelity measures define the essential elements of a well-implemented program, they become something more: they become definitions of quality treatment.

My Response:
I agree. In the SKILLS section, I said: “By all means learn about EBPs and become proficient enough to confidently include a greater variety of methods and techniques in your clinical toolkit. But don’t let self-consciousness over fidelity to a model dilute any natural and effective style that engages people in a good working alliance.”

Comment:
In our rush to defend against the evidence-based practices, we must be careful to realize that the genuine benefit of effectiveness research is not to prove which model has God on its side. The real value of this research is in beginning to learn what quality in treatment means. There is a difference in whether you have 8 or 28 clients in your treatment group. There is a difference in whether a peer advocate is available only in the clinic during working hours, or is available 24/7 anywhere she is needed. There is a difference in whether the counselor is confrontational or accepting, and there is a difference between quality and poor treatment. This difference shouldn’t be lost in the academic argument about the proportion of variation that can be attributed to model versus relationship factors.

My Response:
I agree we should look at what quality treatment is. The evidence is that we already know a lot about what contributes most to effective treatment outcomes. Rather than focus on chart audits to track compliance with process measures that have a limited relationship to effective outcomes (e.g., to what degree clinicians are adhering to a specific model; whether there are signed consents documenting that clients had their medications explained to them; or whether the treatment plan is worded with the goals and objectives the auditor feels are individualized enough, etc.), I’d suggest that we focus on tracking client outcomes in real time; decrease drop-out rates by engaging clients more effectively; and audit whether clinicians measure outcomes and do anything different to improve them. You might be interested in “Making Treatment Count,” which we wrote and which can be accessed on www.talkingcure.com. It outlines in more detail what we are proposing. What is more fascinating to me is that in programs taking this approach, we are seeing impressive clinician and client acceptance, and far better individualized, client-directed care than I have seen in all my years of training people on individualized treatment.

Comment:
I hope I am right in suggesting that evidence-based practice, as defined by Armstrong, will be the new paradigm, and that the argument between the old paradigms will become a unified focus on learning what works. Then we can focus on information mastery and applying that information in a problem-solving process with each individual client.

My Response:
Yes, I agree. We need to focus on what works for this client at this point in time, in this program, with this clinician, using what methods, working on what goals, towards what outcome. The emphasis, then, would be on real-time, specific client feedback on alliance, engagement and treatment outcome. By having a range of EBPs in one’s toolkit, the clinician can move quickly and flexibly to change approaches if the outcomes and client preferences require a different approach.

Brian C. Miller, Ph.D.
Salt Lake County Mental Health Director
Salt Lake County Government Center
2001 South State Street
Salt Lake City, UT 84190-2000
(801) 468-2186

SATIRE

Just for this edition of TIPS and TOPICS, we’ve added a new “S” category that will stand in for the SKILLS section this month. I think you will enjoy a couple of comments from two readers. In the first, I know Bonnie Malek from Oregon is very supportive of 12 Step programs, so take this in the light-hearted manner it was intended. In the second comment, Bill Garrett from Alabama shares some humorous thoughts on what might be the real purpose of documentation (who would have thought documentation had anything to do with flies?).

Hi David:
It’s been a LONG time since we’ve been in contact and this issue inspired me to touch base. I thought you might get a kick out of the 12 Steps of Evidence Based Practice. Feel free to share it if you’re so inclined.

THE 12 STEPS OF EVIDENCE BASED PRACTICE

1. We admitted we were powerless over *Senate Bill 267 and that our information technology (IT) needs had become unmanageable.

2. Came to believe that the right set of manuals could restore us to pre-morbid functioning.

3. Made a decision to turn our program development and training resources over to the Substance Abuse and Mental Health Services Administration (SAMHSA) before we understood why.

4. Took the inventories of everyone that voted for this bill (and in some cases their mothers and their dogs).

5. Admitted to the Office of Mental Health and Addiction Services (OMHAS) and the Oregon legislature that for the past 70 years, we’ve been running on sweat equity, imagination and rubber bands.

6. Grudgingly agreed to do some reading and to keep an open mind.

7. Swore all the way to the dumpster with our favorite handouts and films.

8. Made a list of all the practices that made sense to us and became willing to check at least some of them out.

9. Agreed to learn one new thing this year as long as it didn’t add to our caseloads or paperwork.

10. Continued to work on doing the impossible with no new resources and dreamed of deleting databases when no one was looking.

11. Sought through outcomes data and selective serotonin reuptake inhibitor (SSRI) medication to improve our conscious contact with the legislature, praying only to prove that treatment works and we’re truly not sleeping at our desks.

12. Having had a rude awakening as the result of these steps, we vowed to share our retention data with programs that were still pre-contemplative and to practice fidelity in all of our affairs.

By Bonnie Malek, MS, QMHP III, CDS III
E-mail: bmalek@co.marion.or.us

* Overview of Oregon Senate Bill 267

The 2003 Legislature passed ORS 182.525 (Senate Bill 267). This bill requires that increasing amounts of Oregon state funds be focused on Evidence-Based Practices (EBP). For 2005-07, the statute requires that at least 25 percent of state funds used to treat people with substance abuse problems who have a propensity to commit crimes be used for the provision of Evidence-Based Practices. The statute also requires that 25 percent of state funds be used to treat people with mental illness who use or have a propensity to use emergency mental health services. In 2007-09, the percentage of funds to be spent on EBPs increases to 50 percent, and in 2009-2011 to 75 percent.

The shift to the delivery of services based on scientific evidence of effectiveness is a major one for both the mental health and addiction treatment systems. It includes a focus on lifelong recovery for persons with mental illness as well as those with substance abuse disorders.

As part of an effort to meet requirements outlined in ORS 182.525 (Senate Bill 267) from the 2003 Legislative Session, the Office of Mental Health and Addiction Services (OMHAS) developed an operational definition of evidence-based practices. The definition was developed with broad community input before being officially adopted by the office.

http://www.oregon.gov/DHS/mentalhealth/ebp/main.shtml

Comment:

Dr. Mee-Lee:

Your September ’07 Tips & Topics focus on rote documentation to please the overseers was dead on. The issue to me seems to be to look at the purpose of any particular paperwork and how it is facilitating patient care. Often in site visits I’ll come upon a form or some other sort of spurious documentation and ask, “What’s it for?” and be told, “Don’t know. On our last survey they said to do it, so we just do it and they don’t bother us about that anymore.” I started calling this superstitious documentation.

Once on a bike ride I noticed some baggies of water stapled to the rafters of a porch at a rural store. I was staring up at them when the owner came out, and I asked, “What are those for?” He replied, “Keeps the flies away.” I asked how. He responded, “We don’t know, but it works. Earl has some at his BBQ and they don’t never get pestered by ’em, so we put ’em up.”

I think putting up baggies of water is what many of us as treatment providers are doing in our documentation in order to keep the flies away, or in our case, not be pestered by surveyors. Hanging baggies of water and filling out meaningless forms both seem to qualify as superstitious behaviors. Then again, maybe not, as they are both effective in keeping the flies away.

Bill Garrett, MPH
Shoals Treatment Center
Muscle Shoals, Alabama

SOUL

I was listening to a report about the Tokyo Auto Show and some of the concept cars Honda, Nissan and others were showing. Nissan featured a car with a little robot on the dashboard which speaks to you and keeps you company. Apparently, people have fewer accidents and are less likely to fall asleep or have other tragic mishaps. The robot uses several video cameras to check the driver’s eyes, head movements, body posture, etc., and employs “mood recognition technology”. The robot processes this data and “senses” whether you are falling asleep at the wheel, or are building up to “road rage” or some other potentially dangerous mood.

I guess this is not all that ridiculous. I just started using one of those Global Positioning System (GPS) gadgets in the car. It calculates and plots out every turn to get you to your destination, and “holds your hand” all the way until you arrive safely. It not only anticipates and alerts you to every turn of the way, but lovingly, without scolding you, adjusts if you make a wrong turn and gets you back on track. None of “I told you so”; or “You just don’t listen”; or “Dummy, what do you think you are doing?”. No wonder the guy in the funny car rental TV ads tries to date and hook up with the “woman” in the GPS dashboard gadget. He is just so impressed with how she is consistently there to guide his every step, calmly speaking to him at every turn.

Perhaps it is a sorry commentary on our lives that we can relate to talking robots and machines even better than to a talking, real live human being. But then again, it would really be nice to get a real live person to answer the phone sometimes, especially if an intoxicated and/or psychotic person calls to try to get some help. That “your call is very important to us” message doesn’t seem very inviting or genuine.

I don’t know what the right balance of technology and human contact is. I would rather use an ATM to deposit and get money than stand in line to complete my transaction with a live bank teller. But then it is really frustrating to have to go through ten voice mail prompts in order to talk to a real person. As I said, I don’t know what the right balance is, and I don’t have time to ponder that any more deeply now; I’ve got to go and check my voice mail and e-mail.

SHAMELESS SELLING

One more time —
If you want or need ten hours of continuing education credit, plus a permanent in-service training DVD and new-staff orientation on the ASAM Patient Placement Criteria, check out the latest offering from Hazelden in their Clinical Innovators Series.

“Applying ASAM Placement Criteria”: a 75-minute DVD and a 104-page manual that expands on the DVD, with a Continuing Education test (10 CE hrs)
David Mee-Lee (DVD) and Kathyleen M. Tomlin (DVD manual)

Don’t miss out. Check it out.

Click here for Hazelden DVD

Until Next Time

See you for the November issue.
David