Developing analytical inspection criteria for health IT personnel with minimum training in cognitive ergonomics: A practical solution to EHR improving EHR usability


The following is a review of the article “Developing Analytical Inspection Criteria for Health IT Personnel with Minimum Training in Cognitive Ergonomics: A Practical Solution to EHR Improving EHR Usability” by Zhang et al. (1)


Introduction

As the consequences of EHR implementation become apparent, system usability has been increasingly recognized as an essential area for EHR improvement. In this initial study, Zhang et al. address one challenge in improving the usability of clinical information systems: developing practical tools that allow HIT personnel with little background in usability methods to identify EHR usability issues.

Usability has long been recognized as an important consideration in design-oriented disciplines, including medical device design. The article first highlights the differences between medical device and EHR design that make producing usable EHRs challenging. First, in contrast to the EHR industry, human factors design standards for medical devices have been established. EHRs also show significant variation in use and users across organizations, and sweeping socio-technical and institutional changes in healthcare add further challenges. EHRs undergo substantial local configuration, whereas medical devices are ready to use when they come to market. Most importantly for this study, healthcare organizations rarely have IT personnel with experience applying usability methods to clinical EHR needs. It is on this last point that the authors focus, presenting a potential usability inspection tool, grounded in cognitive ergonomics, for EHR analysts to use in improving system usability.

Background

Zhang et al. begin by highlighting one usability principle, learnability, and discuss the need for EHRs that clinicians can “figure out” without advanced training. Analytical inspection methods for usability analysis do not require observation of users and can therefore save effort in data collection; they are, however, limited in how well they can identify usability problems. In discussing the theoretical foundations of the proposed inspection tool, the authors identify a human-computer interaction model called the resource model, which describes the relationship between a resource category available on a user interface and a user’s interaction strategy. Clinical work environments place a high demand on the clinician’s working memory, so EHR user interfaces should require only low cognitive input and should not require information retrieval from memory. The authors describe the interaction strategy of goal-matching as one that does not require advanced cognitive planning. One resource category of the resource model is action-effect mapping, and the authors propose that analytical inspection criteria for EHR user interfaces should determine whether action-effect mapping is well represented on the interface.
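To make the action-effect mapping idea concrete, the following is a minimal sketch in Python. It is not from the paper; the control names, fields, and the crude substring check are invented purely to illustrate the distinction between a control whose visible label externalizes its effect and one that relies on the user’s memory.

# Hypothetical sketch (not from Zhang et al.): modeling action-effect mapping
# for interface controls. A control whose visible label names the action it
# performs externalizes the mapping; one that does not forces the user to
# recall the effect from memory.

from dataclasses import dataclass

@dataclass
class InterfaceControl:
    visible_label: str   # text shown on the interface, e.g. "Edit prescription"
    actual_effect: str   # what clicking the control actually does

    def externalizes_effect(self) -> bool:
        # Crude stand-in for the judgment an analyst would make:
        # does the visible label describe the effect of clicking?
        return self.actual_effect.lower() in self.visible_label.lower()

edit_button = InterfaceControl("Edit prescription", "edit")
drug_link = InterfaceControl("Lisinopril 10 mg", "open order details")

print(edit_button.externalizes_effect())  # True  -> mapping externally represented
print(drug_link.externalizes_effect())    # False -> user must recall what clicking does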

Methods

The authors develop a metric to assess the cognitive transparency of the user interface. The metric aims to measure action-effect mapping: in essence, whether the user can tell what will happen when they click a particular interface control. Specifically, the metric assesses whether “clickable” options (operations) on the interface clearly indicate their purpose, for example, a clickable control that says “edit.” When the meaning of a clickable interface feature is externally represented on the interface, it is considered cognitively transparent; when the meaning is internal (e.g., a drug name that is clickable but carries no other information), the control is considered not cognitively transparent. The authors then validate this metric by measuring its sensitivity and specificity. Three participants were asked to anticipate what they thought would happen if they clicked on selected interface controls from an e-prescribing use case, and a control was deemed cognitively transparent only if all three participants anticipated its actual effect. Across three EHR interfaces, cognitive transparency predicted by the metric was compared with the participants’ anticipation of control function. Based on this metric, the authors propose inspection criteria to identify action-effect mapping on EHR interfaces.
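The validation logic can be sketched as a comparison between the metric’s prediction and the participants’ consensus. The sketch below is a simplified illustration under stated assumptions: the control labels, effects, and participant responses are invented, and the substring check stands in for the analyst’s judgment; none of it is the study’s actual procedure or data.

# Hypothetical sketch of the validation comparison described above.

def metric_predicts_transparent(control_label: str, effect: str) -> bool:
    # Metric: the control is cognitively transparent if its clickable label
    # externally represents the effect of clicking it.
    return effect.lower() in control_label.lower()

def participants_anticipate(effect: str, responses: list[str]) -> bool:
    # Study design: transparent only if ALL participants anticipate the
    # actual effect.
    return all(r.lower() == effect.lower() for r in responses)

controls = [
    # (visible label, actual effect, three participants' anticipated effects)
    ("Edit", "edit", ["edit", "edit", "edit"]),
    ("Warfarin 5 mg", "open order details", ["refill", "open order details", "edit"]),
]

for label, effect, responses in controls:
    predicted = metric_predicts_transparent(label, effect)
    observed = participants_anticipate(effect, responses)
    print(f"{label!r}: metric={predicted}, participants={observed}, match={predicted == observed}")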

Results

For test EHRs 1, 2, and 3, metric-identified and participant-identified cognitive transparency matched for 60%, 70%, and 50% of interface elements, respectively. Interface elements identified as not transparent by both the metric and the participants matched for 31%, 12%, and 39%, respectively, suggesting high sensitivity and “acceptable” specificity for the cognitive transparency metric.
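For readers less familiar with these terms, sensitivity and specificity here treat the participants’ consensus judgment as ground truth and the metric’s prediction as the test. The arithmetic below uses invented counts purely to show how the two figures are computed; they are not the study’s data.

# Illustrative arithmetic only: the counts are hypothetical.
true_positive  = 6   # metric says transparent, participants anticipated the effect
false_negative = 1   # metric says not transparent, participants anticipated the effect
true_negative  = 2   # metric says not transparent, participants did not anticipate it
false_positive = 1   # metric says transparent, participants did not anticipate it

sensitivity = true_positive / (true_positive + false_negative)   # 6/7 ≈ 0.86
specificity = true_negative / (true_negative + false_positive)   # 2/3 ≈ 0.67

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")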

Discussion

Health IT personnel need additional tools to assure and improve EHR quality. Analytical inspection methods may be useful for identifying usability problems within the EHR; however, these methods rely on theory and need to be validated. The authors present an initial validation of their metric and point to the importance of healthcare organizations employing validated inspection tools to improve EHR usability. They conclude by noting considerations for future studies, including a more robust research design, inclusion of subjective measures to understand user perceptions, experimental control among different EHR interfaces, a larger sample size, and testing the reliability of the proposed inspection tools.

Comments

Zhang et al. address an important aspect of EHR usability improvement: using tested and validated improvement tools. With increasing recognition of EHR usability challenges, validated methods for system improvement will be essential. In the absence of interface standards, and with high variation among vendor interfaces, tools to improve EHR usability are an important asset for health IT. The study is preliminary and small-scale, which may seem unfortunate given the widespread need for EHR usability improvements; in other words, we may still have a long way to go to achieve EHR usability. Still, this study underscores the need for systematic tools in this effort and provides a starting point for developing them.


References

1) Zhang Z, Franklin A, Walji M, Zhang J, Gong Y. Developing Analytical Inspection Criteria for Health IT Personnel with Minimum Training in Cognitive Ergonomics: A Practical Solution to EHR Improving EHR Usability. AMIA Annu Symp Proc. 2014;2014:1277–85.

Submitted by Laura Hickerson