Examining clinical decision support integrity: is clinician self-reported data entry accurate?


This is a review of the article titled "Examining clinical decision support integrity: is clinician self-reported data entry accurate?" published in the Journal of the American Medical Informatics Association in 2014.[1]


Background

As meaningful use compliance continues to spread across health care organizations, clinical decision support (CDS) systems are increasingly likely to be mandated by the federal government. CDS systems are considered an important tool for improving evidence-based practice and patient care, so their integrity and relevance are always in question. This is because they depend on the clinical data entered by clinicians. For instance, inaccurate CPOE (computerized provider order entry) data can lead to erroneous CDS recommendations and testing. According to the article, the accuracy of CPOE data entered into a CDS can be assessed using a strong, robust, evidence-based CDS system.[1]

Objective

The objective of the study was to determine the accuracy and effects of clinician data entry into a data entry-dependent CDS designed to guide evidence-based use of CT angiography (CTA) for emergency department (ED) patients with suspected pulmonary embolism (PE).[1]

Methods

The study was conducted in the ED of a 793-bed academic medical center with a level 1 trauma center. All ED patients for whom clinicians ordered a CTA for suspected PE during a one-year period were included in the study. ED clinicians used the institution's CPOE system to place the CTA order. For the study, clinicians had to enter not only the order but also specific data that automatically calculated a Wells score, along with a descriptor of the patient's serum D-dimer laboratory value. If a patient was at high risk of PE based on the Wells score, the CDS made no recommendation and the order was approved and sent through the system. However, if the patient was at low risk of PE based on the Wells score, the system referenced the D-dimer value. If the D-dimer was elevated, the system provided no advice and the order was placed. If the D-dimer was normal in a low-risk patient, the system advised against placing the CTA order.
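
Below is a minimal Python sketch of the decision logic described above. It is illustrative only: the function name, the Wells score cut-off, and the D-dimer descriptor values are assumptions made for this example, not the institution's actual CDS implementation.

  def cta_recommendation(wells_score, d_dimer_descriptor):
      """Return the CDS advice for a CTA order, per the logic in the Methods above."""
      HIGH_RISK_CUTOFF = 4.0  # assumed cut-off for illustration; the study applies the actual Wells criteria

      if wells_score > HIGH_RISK_CUTOFF:
          # High pretest probability of PE: no recommendation, the order goes through.
          return "no recommendation - order approved"

      # Low pretest probability: the CDS references the clinician-entered D-dimer descriptor.
      if d_dimer_descriptor == "elevated":
          return "no recommendation - order approved"

      # Low risk with a normal D-dimer: the CDS advises against the CTA.
      return "advise against CTA order"

  # Example: a low-risk patient with a normal D-dimer triggers the advisory.
  print(cta_recommendation(2.0, "normal"))

In the workflow described, the advice depends entirely on the clinician-entered Wells data and D-dimer descriptor, which is why the accuracy of that data entry is the focus of the study.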

Results

Of the 59,519 ED patients, a total of 1296 had CTAs ordered for suspected PE. Clinicians accurately entered the D-dimer descriptor for 1175 of these patients (90.7%). The remaining 121 of the 1296 orders (9.3%) involved data entry errors. Of these 121 inaccurate imaging requests, 12 involved patients who, per evidence-based guidelines, should have had a CTA based on their elevated D-dimer values but did not because of the inaccurate data entered into the CDS.
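
As a quick check on the reported proportions (a simple calculation using the figures above, with rounding to one decimal place assumed):

  total_ctas = 1296
  accurate = 1175
  inaccurate = total_ctas - accurate  # 121 orders with data entry errors

  print(round(100 * accurate / total_ctas, 1))    # 90.7
  print(round(100 * inaccurate / total_ctas, 1))  # 9.3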

Discussion

The results showed that in the majority (>90%) of cases, the clinical data entered by clinicians into the CDS when requesting CTA for evaluation of PE in the ED were accurate. Although the margin of error was small, the authors state that tighter integration of the institution's EMR with the CDS, along with quality improvement strategies, could help minimize these types of errors.

Comments

The article is a good example of how clinicians help improve clinical decision support systems for healthcare organizations. Involving them in the decision process by having them enter clinical data helps them view the system as a tool for delivering better quality care to their patients. It also empowers them and gives them a better perspective on how they deliver care than simply entering orders on the fly. The CDS system itself also benefits from this type of data entry: its knowledge base will continue to grow and become more integrated with the EMR, producing better quality-initiative reports.

References

  1. Gupta, A., Raja, A. S., & Khorasani, R. (2014). Examining clinical decision support integrity: is clinician self-reported data entry accurate? Journal of the American Medical Informatics Association, 21(1), 23–26. Retrieved from http://jamia.oxfordjournals.org/content/21/1/23.long