From Clinfowiki

Usability is a quality metric that has been identified as a key factor in health care professionals' satisfaction with health information technology [1].

There are multiple definitions of usability. The most commonly used is the one from the International Organization for Standardization (ISO): usability is "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" [2]. Jef Raskin, a human-computer interface expert, described usability as achieved through “A humane interface [that] is responsive to human needs and considerate of human frailties” [3].

Usability expert Jakob Nielsen defines usability as having the following 5 components [4]:

  1. Learnability—how easy is it for first-time users to use the product?
  2. Efficiency—how quickly can experienced users perform tasks using the product?
  3. Memorability—how well can users remember how to use the product after returning to it later?
  4. Errors—how many, how severe and how recoverable are the errors that users make with the product?
  5. Satisfaction—how well do users like using the product?

Within healthcare information technology, usability is defined similarly. The National Center for Cognitive Informatics and Decision Making in Healthcare uses 3 measures of usability: 1) Useful, 2) Usable and 3) Satisfying [5].

General Usability Principles

There are many published principles for achieving usability of software. In 1988, usability expert Don Norman established four classic principles of usability in “The Psychopathology of Everyday Things”[6]:

  1. Affordance: visual cues indicate how to operate an object or an interface
  2. Visibility: all operations are easily seen or apparent
  3. Mapping: the relationship between each potential action and the system or object response is clear
  4. Feedback: the system or object responds appropriately to an action

Jakob Nielsen established 10 usability heuristics in 1995 that have become a standard for evaluating user interfaces [7]:

  1. Visibility of system status
  2. Match between system and the real world
  3. User control and freedom
  4. Consistency and standards
  5. Error prevention
  6. Recognition rather than recall
  7. Flexibility and efficiency of use
  8. Aesthetic and minimalist design
  9. Help users recognize, diagnose, and recover from errors
  10. Help and documentation

There are many other published general usability principles [8], [9], [10] as well as ones that are defined for specific contexts [11], [12].

EHR Usability Principles

Because usability is a concern for EHRs, several organizations have developed usability principles specific to EHRs. The Healthcare Information and Management Systems Society (HIMSS) established the following principles [13]:

  1. Simplicity
  2. Naturalness
  3. Consistency
  4. Minimizing cognitive load
  5. Efficient interactions
  6. Forgiveness and feedback
  7. Effective use of language
  8. Effective information presentation (appropriate density, meaningful use of color, readability)
  9. Preservation of context

The National Center for Cognitive Informatics and Decision Making in Healthcare proposed 14 usability principles based on evidence review[5]:

  1. Consistency in design and standards
  2. Visibility of system state
  3. Match between system and real world
  4. Minimalism
  5. Memory load minimization
  6. Informative feedback
  7. Flexible and customizable system
  8. Useful error messages
  9. Error prevention
  10. Clear closure
  11. Reversible actions
  12. User language
  13. User control
  14. Help and documentation

Usability in EHRs

It has been demonstrated that poor usability in electronic medical records (EMRs) has contributed to their low adoption levels in the healthcare market and has introduced new categories of medical error into the delivery of healthcare [1], [13], [14], [15], [16]. While EHR adoption has increased due to government incentives such as the Meaningful Use program, user satisfaction with EHRs has been steadily decreasing [17]. EHR usability affects clinic productivity, error rates and user fatigue [18]. A recent survey of EHR users indicates that most users report no productivity improvement with EHR use and only a minority are satisfied with their EHR [19]. The Institute of Medicine issued a report on patient safety that concluded usability is a key driver of safety [20]. An analysis of EHR-related errors within the Veterans Health Administration categorized these concerns as: 1) unmet data display needs, 2) software modifications, 3) system-system interfaces and 4) hidden dependencies [16].

EHR Usability Improvement Efforts

Usability has become such a concern for EHRs that several national organizations have studied the issue and published recommendations.

AHRQ Efforts

The Agency for Healthcare Research and Quality (AHRQ) studied EHR vendor practices for usability and concluded that while vendors are concerned with the usability of their products, they do not follow standards for usability testing and practices, and there is no cross-vendor collaboration for promoting usability [21].

AHRQ has published other usability-related reports as well, including a toolkit for clinicians to assess the usability of an EHR [22] and the following recommendations for improving usability [23]:

  1. Funding research on EHR usability in areas such as creating standardized use cases, evaluating clinician use, developing innovative approaches to information display and determining best practices for EHR design.
  2. Developing policies for EHR certification and a national EHR usability laboratory.

AMA Recommendations

The American Medical Association (AMA) has identified EHR usability improvement as an “important goal for our nation’s healthcare system” [24]. The AMA identified 8 priorities for EHR usability[24]:

  1. Enhance physicians’ ability to provide high-quality care
  2. Support team-based care
  3. Promote care coordination
  4. Offer product modularity and configurability
  5. Reduce cognitive workload
  6. Promote data liquidity
  7. Facilitate digital and mobile patient engagement
  8. Expedite user input into product design and post-implementation feedback

AMIA Recommendations

The American Medical Informatics Association (AMIA) published the following recommendations in four key areas[25]:

  1. Usability and human factors research agenda in health IT:
    • Prioritize standardized use cases for patient-safety sensitive EHR functionalities
    • Develop a core set of measures for adverse events related to health IT use
    • Research and promote best practices for safe implementation of EHR
  2. Policy recommendations:
    • Standardization and interoperability across EHR systems should take account of usability concerns
    • Establish an adverse event reporting system for health IT and voluntary health IT event reporting
    • Develop and disseminate an educational campaign on the safe and effective use of EHR
  3. Industry recommendations:
    • Develop a common user interface style guide for select EHR functionalities
    • Perform formal usability assessments on patient-safety sensitive EHR functionalities
  4. Clinical end-user recommendations:
    • Adopt best practices for EHR system implementation and ongoing management
    • Monitor how IT systems are used and report IT-related adverse events

ONC Efforts

The Office of the National Coordinator for Health IT (ONC) has included new usability requirements in the 2014 EHR Standards and Certification Criteria[26]. First, in section 170.314(g)(3) of the Certification Final Rule, EHR vendors must apply user-centered design principles to the following 8 EHR features/functions[26]:

  1. Computerized physician order entry
  2. Drug-drug and drug-allergy interaction checks
  3. Medication list
  4. Medication allergy list
  5. Clinical decision support
  6. Electronic medication administration record
  7. Electronic prescribing
  8. Clinical information reconciliation

Second, in section 170.314(g)(4), EHR vendors must use a quality management system (QMS) for the development, testing, implementation and maintenance for all certified capabilities[26].

The ONC also sponsored the Strategic Health IT Advanced Research Projects (SHARP) program to address problems that impede the adoption of health IT [27]. Two SHARP projects have focused on usability: a design guide for EHRs [28] and a usability toolkit [29].

Usability Methods

To ensure usability and acceptance among end users, researchers and engineers use several key research methods to model user behavior, ranging from qualitative observational studies to quantitative user testing and survey methods. Qualitative analyses can be used to better understand physician behavior after a system has been designed. Kushniruk and colleagues demonstrated changes in diagnostic reasoning as a result of program usage: clinicians structured their interviews to match the program's workflow [30]. Qualitative video and "think aloud" research paradigms can be used to compare or validate questionnaire responses. This methodology entails recording video either in a controlled laboratory environment or in more naturalistic settings, even at a practice site. In the "think aloud" procedure, subjects vocalize their thoughts as they interact with an application. Researchers found that while satisfaction with an information system can be rated highly, the think-aloud transcripts show that clinicians note several shortcomings during use. This example underlines the difference between the online use experience and its recall, a possible weakness of survey data [31], [30].

Usability is achieved through user-centered design, in which the user is involved in all stages of the development process: project planning, requirements gathering, design, implementation, testing, release and post-release support and maintenance. During the initial phases (planning and requirements), usability methods include observational studies, interviews, focus groups and user surveys. During the design and implementation phases, they include participatory design, prototyping, desirability studies and card sorting activities. During the release and post-release phases, they include usability benchmarking, online assessments, surveys and A/B testing [32]. Similarly, Hollan and his colleagues describe a framework of cyclical design that encompasses distributed cognition, usability testing and design [33]. Indeed, usability assessments are most valuable in the middle of the development cycle, when developers can still refine the software (also known as "formative research") [31].
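
The benchmarking methods mentioned above ultimately summarize the ISO triad of effectiveness, efficiency and satisfaction. As a minimal sketch only (the session data and field names below are hypothetical, not drawn from any cited study), a benchmark script might reduce usability-test sessions to those three measures:

```python
# Illustrative only: summarizing usability-test sessions into the ISO 9241-11
# triad (effectiveness, efficiency, satisfaction). The data and field names
# are hypothetical, not taken from any cited study.
from statistics import mean

# Each session records whether the user completed the task, how long it took
# (in seconds), and a 1-5 satisfaction rating.
sessions = [
    {"completed": True,  "seconds": 142, "rating": 4},
    {"completed": True,  "seconds": 210, "rating": 3},
    {"completed": False, "seconds": 300, "rating": 2},
    {"completed": True,  "seconds": 98,  "rating": 5},
]

def benchmark(sessions):
    """Return (effectiveness, efficiency, satisfaction) summary measures."""
    # Effectiveness: fraction of sessions in which the task was completed.
    effectiveness = sum(s["completed"] for s in sessions) / len(sessions)
    # Efficiency: mean time-on-task among successful attempts only.
    efficiency = mean(s["seconds"] for s in sessions if s["completed"])
    # Satisfaction: mean subjective rating across all sessions.
    satisfaction = mean(s["rating"] for s in sessions)
    return effectiveness, efficiency, satisfaction

eff, time_on_task, sat = benchmark(sessions)
print(f"task success {eff:.0%}, mean time {time_on_task:.0f}s, rating {sat:.1f}/5")
```

In a real benchmarking study these summaries would be compared against a pre-determined performance target, as described in the method list below; this sketch only shows the shape of the computation.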

List of Usability Methods

There are many different methods for achieving usability; a usability method is essentially any activity that involves users (expert, representative or actual) during different stages of the design and development process. The Nielsen Norman Group (expert consultants in usability evaluation) provides the following list of usability methods [32]:

  • Usability-lab studies: users test software in a lab with a researcher focusing on specific tasks or scenarios.
  • Ethnographic field studies: researchers observe users in their natural environment focusing on a particular task or software use.
  • Participatory design: users are asked to create designs from particular design elements.
  • Focus groups: small groups of users are led through a discussion about a given topic relating to a product or proposed product.
  • Interviews: one-on-one meetings with researcher and users to discuss a particular topic relating to a product or proposed product.
  • Eyetracking: users’ eye movements are captured while they use software or perform a task, using specially configured eyetracking devices.
  • Usability Benchmarking: scripted usability studies with users performing specific tasks which are measured using a pre-determined measure of performance.
  • Moderated Remote Usability Studies: usability studies performed remotely using screen-sharing.
  • Unmoderated Remote Panel Studies: usability studies using trained users done remotely using recording and screen capture software.
  • Concept testing: researcher presents users with an approximation or prototype of a new feature or product to determine whether it meets users’ needs. Can be done in person or online.
  • Diary/Camera Studies: users record aspects of their lives relevant to the product. Typically longitudinal.
  • Customer feedback: information provided by the user, usually done online through a link, form or email.
  • Desirability studies: users are presented with design alternatives and asked to associate each with a set of attributes (provided from a given list).
  • Card sorting: users are asked to organize items into groups and categorize them. Helps assess users’ mental models of a system.
  • Clickstream analysis: user’s clicks and navigation are recorded and analyzed.
  • A/B testing: a formal method of testing different designs by randomly assigning users to evaluate a particular design and measuring effects on user behavior/performance.
  • Unmoderated UX studies: automated method that uses a specialized software tool to capture user behaviors and attitudes. Usually done with specific scenarios using a prototype.
  • True-intent studies: users are asked what their goal or intention is when they visit a particular website or use a particular software application.
  • Intercept surveys: user surveys that are triggered by visiting a website or using a software application.
  • Email surveys: user surveys that are solicited by email.
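
The A/B testing method listed above randomly assigns users to competing designs and measures the effect on behavior or performance. As a minimal illustration (the completion counts are hypothetical, and a two-proportion z-test is one common analysis, not the only valid one), the comparison can be written with only the Python standard library:

```python
# Illustrative sketch of analyzing an A/B test of two interface designs by
# comparing task-completion proportions with a two-proportion z-test.
# The counts below are hypothetical.
from math import erf, sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p-value) for H0: the two completion rates are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Design A: 78 of 100 users completed the task; Design B: 62 of 100.
z, p = two_proportion_z(78, 100, 62, 100)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A real study would also fix the sample size in advance (statistical power) and correct for multiple comparisons if several measures are tested at once.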

Additional Methods

  • Heuristic evaluation: usability or domain experts evaluate a software interface based on a provided list of heuristics. [34]
  • Discount usability testing: using a small number of users (3-5) for usability-lab studies or heuristic evaluation. Since results are qualitative and used to improve interface design, the focus is on finding usability problems rather than conclusively measuring usability. [35]
  • Workflow Analysis: observing and measuring the work processes of users. Ideally, it is done as part of the project planning phase as well as the post-release phase to ensure that the project matches the users’ workflow.
  • Risk Assessment: determination of the amount of risk (quantitative or qualitative) a particular feature of a product raises [13].
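
The 3-5 user guideline for discount usability testing can be illustrated with the Nielsen-Landauer model, which estimates the share of usability problems found by n test users as 1 - (1 - λ)^n, where λ is the average per-user detection rate, commonly quoted as about 31%. A short sketch:

```python
# Sketch of the Nielsen-Landauer estimate of the proportion of usability
# problems uncovered by n test users: found(n) = 1 - (1 - lam)**n.
# lam = 0.31 is the commonly quoted average per-user detection rate.

def problems_found(n, lam=0.31):
    """Expected fraction of usability problems found by n test users."""
    return 1 - (1 - lam) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With five users the model predicts roughly 84% of problems found, which is why discount testing favors several small, iterative rounds over a single large study.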

References

  1. Murff HJ, Kannry J. "Physician Satisfaction with Two Order Entry Systems". J Am Med Inform Assoc. 2001; 8: 499-509.
  2. ISO/IEC 9241-11: Ergonomic requirements for office work with visual display terminals (VDTs) - Part 11: Guidance on usability. 1998.
  3. Raskin, Jef. The Humane Interface: New Directions for Designing Interactive Systems. Addison-Wesley, 2000.
  4. Nielsen, Jakob. "Usability 101: Introduction to Usability". Nielsen Norman Group, 1/4/2012.
  5. Zhang, Jiajie and Walji, Muhammed F. "TURF: Toward a unified framework of EHR usability". Journal of Biomedical Informatics, Vol. 44, 2011, pp. 1056-1067.
  6. Norman, Donald. "The Psychopathology of Everyday Things". The Design of Everyday Things. New York, NY: Basic Books, 1988, Chap. 1.
  7. Nielsen, Jakob. "10 Usability Heuristics for User Interface Design". Nielsen Norman Group, 1/1/1995.
  8. Research-Based Web Design and Usability Guidelines. US Department of Health and Human Services.
  9. Lidwell et al. Universal Principles of Design. Gloucester, Massachusetts: Rockport Publishers, 2003.
  10. Tognazzini, Bruce. "First Principles of Interaction Design (Revised and Expanded)". AskTog, 3/5/2014.
  13. HIMSS EHR Usability Task Force. Defining and Testing EMR Usability: Principles and Proposed Methods of EMR Usability Evaluation and Rating. HIMSS, June 2009.
  14. Ash et al. "Some Unintended Consequences of Information Technology in Health Care: The Nature of Patient Care Information System-related Errors". Journal of the American Medical Informatics Association, Vol. 11, No. 2, March/April 2004.
  15. Koppel et al. "Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors". JAMA, Vol. 293, No. 10, March 2005, pp. 1197-1203.
  16. Meeks DW, et al. "An analysis of electronic health record-related patient safety concerns". Journal of the American Medical Informatics Association, Vol. 21, 2014.
  17. American College of Physicians. "Survey of Clinicians: User satisfaction with electronic health records has decreased since 2010". ACP, 3/5/2013.
  18. Pfister, Helen R. and Ingargiola, Susan R. "ONC: Staying Focused on EHR Usability". iHealthBeat, 2/20/2014.
  19. Edsall, Robert L. and Adler, Kenneth G. "The 2012 EHR User Satisfaction Survey: Responses From 3,088 Family Physicians". Fam Pract Manag. 2012 Nov-Dec; 19(11): 23-30.
  20. Institute of Medicine. "Health IT and Patient Safety: Building Safer Systems for Patient Care". National Academies Press, Washington, D.C., 2011.
  21. McDonnell C, Werner K, Wendel L. "Electronic Health Record Usability: Vendor Practices and Perspectives". AHRQ Publication No. 09(10)-0091-3-EF. Rockville, MD: Agency for Healthcare Research and Quality, May 2010.
  22. Johnson, C. et al. "EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records". AHRQ Publication No. 11-0084-EF, August 2011.
  23. Armijo, D. et al. "Electronic Health Record Usability: Interface Design Considerations". AHRQ Publication No. 09(10)-0091-2-EF, October 2009.
  24. American Medical Association. Improving Care: Priorities to Improve Electronic Health Record Usability, 2014.
  25. Middleton B, et al. "Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA". Journal of the American Medical Informatics Association, 2013; 20: e2-e8.
  26. Health and Human Services Department. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 Edition; Revisions to the Permanent Certification Program for Health Information Technology. Federal Register, 9/4/2012.
  27. Strategic Health IT Advanced Research Projects (SHARP), 2013.
  28. Belden, Jeff et al. Inspired EHRs: Designing for Clinicians. University of Missouri, 2014.
  29. TURF-EHR Usability Toolkit. UT Health School of Biomedical Informatics, 2014.
  30. Kushniruk AW, Patel VL. "Cognitive and usability engineering methods for the evaluation of clinical information systems". J Biomed Inform. 2004; 37: 56-76.
  31. Kushniruk AW, Patel VL, Cimino JJ. "Usability testing in medical informatics: Cognitive approaches to evaluation of information systems and user interfaces". Proc AMIA Annu Fall Symp. 1997: 218-222.
  32. Rohrer, Christian. "When to use which user-experience research methods". Nielsen Norman Group, 10/12/2014.
  33. Hollan J, Hutchins E, Kirsh D. "Distributed cognition: Toward a new foundation for human-computer interaction research". ACM Trans Comput Hum Interact. 2000; 7: 174-196.
  34. Nielsen, J. and Molich, R. "Heuristic evaluation of user interfaces". Proc. ACM CHI'90 Conf. (Seattle, WA, 1-5 April 1990), 249-256.
  35. Nielsen, Jakob. "Guerrilla HCI: Using Discount Usability Engineering to Penetrate the Intimidation Barrier". Nielsen Norman Group, 1/1/1994.

Submitted by Michelle Hribar