Predictors of Clinical Decision Support Success


There are a variety of factors that may predict clinical decision support (CDS) success. An early outline of recommendations was supplied by Bates et al. in 2003 in "Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-based Medicine a Reality."[1]

"Ten Commandments" Introduction

This paper collected the wisdom gained from years of optimizing clinical decision support at Brigham and Women's Hospital in Boston, MA. The clinicians recognized a discrepancy between optimal patient care and actual practice and cited a variety of examples of suboptimal care: only 50% of eligible patients received beta blockers; only 27% of anti-epileptic drug level monitoring was clearly indicated, and half of those levels were drawn at incorrect times; and 68% of vancomycin orders did not follow Centers for Disease Control and Prevention (CDC) guidelines. These instances provide the impetus for improvement. As stated in the paper, "We believe that decision support delivered using information systems, ideally with the electronic medical record as the platform, will finally provide decision makers with tools making it possible to achieve large gains in performance, narrow gaps between knowledge and practice, and improve safety." The authors have refined their clinical decision support tools through both successes and failures.

The authors point out that there is a gradient in the degree to which computing can assist decision making: at one end, the computer doesn't help at all; at the other, the computer makes all decisions without human input. While clinicians may fear loss of autonomy, there is room for improvement over the current state of CDS in health care. Ultimately, neither extreme is appropriate; somewhere in between is best, and the level of computer intervention may not even be the same for all types of encounters.

The "Ten Commandments"

1. Speed Is Everything

Speed is a very important priority for clinicians and should be considered "a primary goal," a conclusion supported by survey results. It is important to keep in mind that this may differ from the priorities of operations and administrative staff.

2. Anticipate Needs and Deliver in Real Time.

The decision support must be available and delivered at the right time. Furthermore, it should try to predict future care needs. When these "latent needs" were addressed by the EHR, there was increased likelihood of a desired action.

3. Fit into the User's Workflow.

It is important to apply decision support that is integrated into a user's workflow. A great suggestion for care will have little impact if users don't know to look for it or if it takes too much time to find.

4. Little Things Can Make a Big Difference.

One doesn't need huge interventions to change outcomes. The authors cite an example regarding the input field type for a diagnosis: changing a structured field to free text turned a coded data element into variable text. Even though this was an isolated input change in the EHR, it had a big downstream impact when the system later tried to provide feedback about the diagnosis.

5. Recognize that Physicians Will Strongly Resist Stopping.

One must be aware when designing support systems that there is a strong desire to continue with the original plan, especially when no alternative is provided. The authors recommend allowing providers to override CDS reminders; they cite an example of clinicians finding ways to "game the system" even when faced with mandatory stops.

6. Changing Direction Is Easier than Stopping.

In contrast to commandment #5, the ability to alter orders is significantly easier to implement, especially if providers don't feel strongly about the element being changed. The paper cites examples such as the dose, route, or frequency of medications, or the number of views needed for certain imaging studies.

7. Simple Interventions Work Best.

Try to keep CDS simple. The authors suggest that guidelines should easily fit on a single screen, and that any tool requiring multiple inputs from the provider may be abandoned before the support information is delivered.

8. Ask for Additional Information Only When You Really Need It.

This is related to rule #7 in that CDS should strive to limit the amount of inputted information from clinicians. The authors state, “our experience has been that the likelihood of success in implementing a computerized guideline is inversely proportional to the number of extra data elements needed.”

9. Monitor Impact, Get Feedback, and Respond.

It is important to be cognizant of the number and quality of reminders clinicians encounter. Poor-quality or overly frequent interventions may lead to more reminder dismissals, even when the reminders are important. CDS designers should collect feedback from end users about low-quality alerts along with suggestions for improvement. Of note, the authors mention an empiric threshold of about 60% for positive responses to reminders.
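
The monitoring described above can be sketched as a small script that computes per-alert acceptance rates from a response log and flags alerts falling below the roughly 60% threshold the authors mention. The log structure, alert names, and function names here are all invented for illustration; only the threshold figure comes from the paper.

```python
from collections import Counter

# Hypothetical alert log: (alert_id, response) pairs, where response is
# "accepted" or "dismissed". Data and identifiers are illustrative.
ALERT_LOG = [
    ("renal-dose-adjust", "accepted"),
    ("renal-dose-adjust", "accepted"),
    ("renal-dose-adjust", "dismissed"),
    ("duplicate-therapy", "dismissed"),
    ("duplicate-therapy", "dismissed"),
    ("duplicate-therapy", "accepted"),
    ("duplicate-therapy", "dismissed"),
]

def acceptance_rates(log):
    """Return {alert_id: fraction of responses that were 'accepted'}."""
    shown = Counter(aid for aid, _ in log)
    accepted = Counter(aid for aid, resp in log if resp == "accepted")
    return {aid: accepted[aid] / shown[aid] for aid in shown}

def flag_low_quality(log, threshold=0.60):
    """Flag alerts whose acceptance rate falls below the threshold."""
    return sorted(aid for aid, rate in acceptance_rates(log).items()
                  if rate < threshold)

print(flag_low_quality(ALERT_LOG))  # ['duplicate-therapy'] (1 of 4 accepted)
```

Alerts flagged this way are candidates for the end-user feedback and redesign loop the authors describe.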

10. Manage and Maintain Your Knowledge-based Systems.

Regular monitoring of CDS reminders is imperative to providing optimal care. Action may need to be taken if an unusual spike in alerts appears. Such a spike may reflect new medical knowledge that hasn't yet made its way into the CDS program, or support tools that have become outdated.
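
One simple way to surface the "unusual spike in alerts" mentioned above is to compare each day's firing count for a rule against its recent baseline. The window size and z-score cutoff below are illustrative choices, not parameters from the paper.

```python
import statistics

def spike_days(daily_counts, window=7, cutoff=3.0):
    """Return indices of days whose count exceeds baseline mean + cutoff * stdev."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        # Guard against a perfectly flat baseline (stdev of zero).
        stdev = statistics.pstdev(baseline) or 1.0
        if (daily_counts[i] - mean) / stdev > cutoff:
            flagged.append(i)
    return flagged

# Hypothetical daily firing counts for one CDS rule.
counts = [20, 22, 19, 21, 20, 23, 21, 95, 22, 20]
print(spike_days(counts))  # [7] -- day 7 stands far above the prior week
```

A flagged day would then prompt a human review of the rule's knowledge base, as the maintenance commandment recommends.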

Summary

Bates et al. provide concise and easy-to-follow recommendations for those interested in implementing CDS in their practice. The problems of suboptimal care in medicine are well documented, and CDS may be a tool by which they can be mitigated. Ultimately, the "Ten Commandments" presented are good guidelines, but the research by the authors supporting much of the recommendations was carried out within a single healthcare system. Also, the authors point out that the majority of the users were residents in training. These limitations suggest a need for better-designed trials and systematic reviews of the topic.

Comparison to Systematic Reviews on CDS Success

When compared with a 2005 systematic review by Kawamoto et al.,[2] the Ten Commandments paper had overlapping "rules." The systematic review covered 15 CDS features and found integration into the clinician's workflow to be the feature most strongly associated with success; this corresponds to rule #3 of the "Ten Commandments." Other significant features in the review were providing recommendations rather than just warnings (similar to rule #5), delivery at the right time (similar to rule #2), and having the support be computer based (inherent in the Bates et al. paper). When all of these features were implemented in a CDS rule, the review found that the tool achieved its goal 94% of the time.

Another 2005 systematic review, by Garg et al.,[3] likewise found it important to deliver decision support at the right time and within the clinician's workflow. Interestingly, the authors of this systematic review also noted improved CDS success when a study's author was also the tool's developer; they suggest author bias, better integration into workflow, and better access to technical help as potential explanations. This applies to the Ten Commandments authors, who also developed their own CDS tools, so it is important to recognize potential biases.

A 2006 systematic review by Niès et al.[4] found similar outcomes, pointing to the following indicators of CDS success: 1) alerts initiated automatically by the system, 2) computer-aided assistance, 3) automatic retrieval of information, and 4) additional actions provided in CPOE. This review supports, directly and indirectly, "Ten Commandments" rules #2 and #8.

A more recent 2013 systematic review[5] of 162 randomized controlled trials showed some differences from the "Ten Commandments." This review found, more broadly, that CDS embedded into order entry or charting was most likely to fail; given how broad this finding is, it doesn't necessarily contradict any of the "Ten Commandments." Like the Garg et al. review, it found improved outcomes when the tool was evaluated by those who developed it. New indicators of success not previously addressed included: 1) tools that provide support to clinicians as well as patients and 2) requiring clinicians to provide a reason for overriding an alert.

Predictors of Clinical Decision Support Success

  • Integration into user workflow
  • Deliver at an appropriate time
  • Provide recommendations rather than just warnings
  • Development of tools in-house
  • Limiting user input of information
  • Tools that provide instruction for both provider and patient
  • Require providers to input reason for overriding alerts
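
Two of the predictors above, offering a recommendation rather than a bare warning and requiring a stated reason to override, can be sketched as a minimal alert object. The class, field names, and clinical content below are all invented for illustration; no real CDS system's API is implied.

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    """Minimal sketch of an overridable CDS alert (hypothetical design)."""
    message: str
    recommendation: str  # offer an alternative action, not just a warning
    override_reasons: list = field(default_factory=list)

    def override(self, reason):
        """Record the clinician's stated reason; reject empty overrides."""
        if not reason or not reason.strip():
            raise ValueError("An override reason is required.")
        self.override_reasons.append(reason.strip())
        return True

alert = Alert(
    message="Dose exceeds renal-adjusted maximum.",
    recommendation="Reduce to 500 mg every 24 h for CrCl < 30 mL/min.",
)
alert.override("Dose discussed with nephrology; continue current plan.")
print(len(alert.override_reasons))  # 1
```

Recorded override reasons also feed the monitoring-and-feedback loop described under commandment #9.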

Further Discussion

In 2007, Dr. David Bates, along with Dr. Peter Gross, published more on CDS in computerized provider order entry (CPOE) systems: A Pragmatic Approach to Implementing Best Practices for Clinical Decision Support Systems in Computerized Provider Order Entry Systems.[6]

References

  1. Bates DW, Kuperman GJ, Wang S, et al. Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-based Medicine a Reality. Journal of the American Medical Informatics Association : JAMIA. 2003;10(6):523-530. doi:10.1197/jamia.M1370. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC264429/
  2. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765. http://www.ncbi.nlm.nih.gov/pubmed/15767266
  3. Garg AX, Adhikari NJ, McDonald H, et al. Effects of Computerized Clinical Decision Support Systems on Practitioner Performance and Patient Outcomes: A Systematic Review. JAMA. 2005;293(10):1223-1238. doi:10.1001/jama.293.10.1223. http://www.ncbi.nlm.nih.gov/pubmed/15755945
  4. Niès J, Colombet I, Degoulet P, Durieux P. Determinants of Success for Computerized Clinical Decision Support Systems Integrated into CPOE Systems: a Systematic Review. AMIA Annual Symposium Proceedings. 2006;2006:594-598. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1839370/
  5. Roshanov PS, Fernandes N, Wilczynski JM, Hemens BJ, You JJ, Handler SM, et al. Features of effective computerised clinical decision support systems: meta-regression of 162 randomised trials. BMJ. 2013;346:f657. http://www.bmj.com/content/346/bmj.f657
  6. Gross PA, Bates DW. A Pragmatic Approach to Implementing Best Practices for Clinical Decision Support Systems in Computerized Provider Order Entry Systems. Journal of the American Medical Informatics Association : JAMIA. 2007;14(1):25-28. doi:10.1197/jamia.M2173. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2215068/

Submitted by Marc Tobias