Usability

Usability is a quality metric that has been identified as a key factor in user satisfaction with health information technology among health care professionals [1].

Definition

There are multiple definitions of usability. The most commonly used comes from the International Organization for Standardization (ISO): usability is "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" [2]. Jef Raskin, a human-computer interface expert, described usability as achieved through “A humane interface [that] is responsive to human needs and considerate of human frailties” [3].

Usability expert Jakob Nielsen defines usability as having the following 5 components[4]:

  1. Learnability—how easy is it for first time users to use the product?
  2. Efficiency—how quickly can experienced users perform tasks using the product?
  3. Memorability—how well can users remember how to use the product after returning to it later?
  4. Errors—how many, how severe and how recoverable are the errors that users make with the product?
  5. Satisfaction—how well do users like using the product?

Within healthcare information technology, usability is defined similarly. The National Center for Cognitive Informatics and Decision Making in Healthcare, through its TURF framework for EHR usability, uses 3 measures of usability: 1) Useful, 2) Usable and 3) Satisfying [5].
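
These measures are typically quantified from usability-test data: effectiveness as the share of task attempts completed successfully, efficiency as time on task, and satisfaction as an average post-task rating. The following is a minimal Python sketch of that calculation; the session records, field names and the 1-5 rating scale are illustrative assumptions, not part of the ISO standard or the TURF framework.

from statistics import mean

# Hypothetical usability-test sessions: each record is one user attempting one task.
sessions = [
    {"task": "order medication", "completed": True,  "seconds": 95,  "rating": 4},
    {"task": "order medication", "completed": False, "seconds": 240, "rating": 2},
    {"task": "reconcile allergies", "completed": True, "seconds": 130, "rating": 5},
    {"task": "reconcile allergies", "completed": True, "seconds": 160, "rating": 3},
]

# Effectiveness: share of task attempts completed successfully.
effectiveness = sum(s["completed"] for s in sessions) / len(sessions)

# Efficiency: mean time on task, counting only successful attempts.
efficiency = mean(s["seconds"] for s in sessions if s["completed"])

# Satisfaction: mean post-task rating on an assumed 1-5 scale.
satisfaction = mean(s["rating"] for s in sessions)

print(f"Effectiveness: {effectiveness:.0%}")       # 75% of task attempts completed
print(f"Efficiency: {efficiency:.0f} s per task")  # ~128 s mean time on successful tasks
print(f"Satisfaction: {satisfaction:.1f} / 5")     # 3.5 mean rating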

General Usability Principles

There are many published principles for achieving usability of software. In 1988, usability expert Don Norman established four classic principles of usability in “The Psychopathology of Everyday Things”[6]:

  1. Affordance: visual cues indicate how to operate an object or an interface
  2. Visibility: all operations are easily seen or apparent
  3. Mapping: the relationship between each potential action and the system's or object's response is clear
  4. Feedback: the system or object responds appropriately to an action

Another usability expert, Jakob Nielsen, established 10 usability heuristics in 1995, which have become a standard for evaluating user interfaces [7]:

  1. Visibility of system status
  2. Match between system and the real world
  3. User control and freedom
  4. Consistency and standards
  5. Error prevention
  6. Recognition rather than recall
  7. Flexibility and efficiency of use
  8. Aesthetic and minimalist design
  9. Help users recognize, diagnose, and recover from errors
  10. Help and documentation

There are many other published general usability principles [8], [9], [10] as well as ones that are defined for specific contexts [11], [12].
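
Heuristics such as Nielsen's ten are often operationalized as a checklist during a heuristic evaluation (listed under Additional Methods below): evaluators log each interface problem against the heuristic it violates and assign a severity. The sketch below illustrates one way such findings might be recorded and summarized; the 0-4 severity scale and the example EHR findings are assumptions for illustration only, not findings reported in this article.

from collections import Counter

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# Hypothetical findings from one evaluator: (heuristic violated, severity 0-4, note).
findings = [
    ("Error prevention", 3, "No confirmation before discontinuing all medications"),
    ("Visibility of system status", 2, "No indicator while the allergy list is loading"),
    ("Consistency and standards", 1, "Save button placement differs between modules"),
]

# Summarize how many problems were logged against each heuristic.
problems_per_heuristic = Counter(heuristic for heuristic, _, _ in findings)
for heuristic in NIELSEN_HEURISTICS:
    count = problems_per_heuristic.get(heuristic, 0)
    print(f"{heuristic}: {count} problem(s)")

# List the most severe findings first for the design team.
for heuristic, severity, note in sorted(findings, key=lambda f: f[1], reverse=True):
    print(f"severity {severity}: {note} [{heuristic}]")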

EHR Usability Principles

Because usability is a concern for EHRs, several organizations have developed usability principles specific to EHRs. The Healthcare Information and Management Systems Society (HIMSS) established the following principles[13]:

  1. Simplicity
  2. Naturalness
  3. Consistency
  4. Minimizing cognitive load
  5. Efficient interactions
  6. Forgiveness and feedback
  7. Effective use of language
  8. Effective information presentation (appropriate density, meaningful use of color, readability)
  9. Preservation of context

The National Center for Cognitive Informatics and Decision Making in Healthcare proposed 14 usability principles based on an evidence review[5]:

  1. Consistency in design and standards
  2. Visibility of system state
  3. Match between system and real world
  4. Minimalism
  5. Memory load minimization
  6. Informative feedback
  7. Flexible and customizable system
  8. Useful error messages
  9. Error prevention
  10. Clear closure
  11. Reversible actions
  12. User language
  13. User control
  14. Help and documentation

Usability in EHRs

Poor usability in electronic medical records (EMRs) has been shown to contribute to their poor adoption levels in the healthcare market as well as to the introduction of new categories of medical error in the delivery of healthcare [1], [13], [14], [15], [16]. While EHR adoption has increased due to government incentives, such as the Meaningful Use program, user satisfaction with EHRs has been steadily decreasing [17]. EHR usability affects clinic productivity, error rates and user fatigue [18]. A recent survey of EHR users indicates that a majority of users’ productivity has not improved with EHR use and only a minority are satisfied with their EHR [19]. The Institute of Medicine issued a report on patient safety and concluded that usability is a key driver of safety [20]. An analysis of EHR-related errors within the Veterans Health Administration categorized these concerns as: 1) unmet data display needs, 2) software modifications, 3) system-system interfaces and 4) hidden dependencies [16].

See also: mHealth consumer apps.

EHR Usability Improvement Efforts

Usability has become such a concern for EHRs that several national organizations have studied the issue and published recommendations.

AHRQ Efforts

The Agency for Healthcare Research and Quality (AHRQ) studied EHR vendor practices for usability and concluded that while vendors are concerned with the usability of their products, they have not followed standards for usability testing and practices nor is there cross-vendor collaboration for promoting usability [21].

AHRQ has published other reports relating to usability as well, including a toolkit for clinicians to assess the usability of an EHR [22]. It has also published the following recommendations for improving usability [23]:

  1. Funding research on EHR usability in areas such as creating standardized use cases, evaluating clinician use, developing innovative ways to display information and determining best practices for EHR design.
  2. Developing policies for EHR certification and a national EHR usability laboratory.

AMA Recommendations

The American Medical Association (AMA) has identified EHR usability improvement as an “important goal for our nation’s healthcare system” [24]. The AMA identified 8 priorities for EHR usability[24]:

  1. Enhance physicians’ ability to provide high-quality care
  2. Support team-based care
  3. Promote care coordination
  4. Offer product modularity and configurability
  5. Reduce cognitive workload
  6. Promote data liquidity
  7. Facilitate digital and mobile patient engagement
  8. Expedite user input into product design and post-implementation feedback

AMIA Recommendations

The American Medical Informatics Association (AMIA) published the following recommendations in four key areas[25]:

  1. Usability and human factors research agenda in health IT:
    • Prioritize standardized use cases for patient-safety sensitive EHR functionalities
    • Develop a core set of measures for adverse events related to health IT use
    • Research and promote best practices for safe implementation of EHR
  2. Policy recommendations:
    • Standardization and interoperability across EHR systems should take account of usability concerns
    • Establish an adverse event reporting system for health IT and voluntary health IT event reporting
    • Develop and disseminate an educational campaign on the safe and effective use of EHR
  3. Industry recommendations:
    • Develop a common user interface style guide for select EHR functionalities
    • Perform formal usability assessments on patient-safety sensitive EHR functionalities
  4. Clinical end-user recommendations:
    • Adopt best practices for EHR system implementation and ongoing management
    • Monitor how IT systems are used and report IT-related adverse events

ONC Efforts

The Office of the National Coordinator for Health IT (ONC) has included new usability requirements in the 2014 EHR Standards and Certification Criteria[26]. First, in section 170.314(g)(3) of the Certification Final Rule, EHR vendors must apply user-centered design principles to the following 8 EHR features/functions[26]:

  1. Computerized physician order entry
  2. Drug-drug and drug-allergy interaction checks
  3. Medication list
  4. Medication allergy list
  5. Clinical decision support
  6. Electronic medication administration record
  7. Electronic prescribing
  8. Clinical information reconciliation.

Second, in section 170.314(g)(4), EHR vendors must use a quality management system (QMS) for the development, testing, implementation and maintenance for all certified capabilities[26].

The ONC also sponsored the Strategic Health IT Advanced Research Projects (SHARP) program to address problems that impede the adoption of health IT [27]. Two SHARP projects have focused on usability: a design guide for EHRs [28] and a usability toolkit [29].

Usability Methods

To ensure usability and acceptance among end users, researchers and engineers use several key research methods to model user behavior. These methods range from qualitative observational studies to quantitative user testing and survey methods. Qualitative analyses can be used to better understand physician behavior after a system has been designed. Kushniruk and colleagues demonstrated changes in diagnostic reasoning as a result of program usage; clinicians structured their interviews to match the program's workflow [30]. Qualitative video and "think aloud" research paradigms can be used to compare or validate questionnaire responses. This methodology entails recording video either in a controlled laboratory environment or in more naturalistic settings, even at a practice site. In the "think aloud" procedure, subjects attempt to vocalize their thoughts as they interact with an application. Researchers found that while satisfaction with an information system can be rated highly, the think-aloud transcripts show that clinicians note several shortcomings during use. This example underlines the difference between the actual use experience and users' recall of it, a possible weakness of survey data [31], [30].

Usability is achieved through user-centered design; the user is involved in all stages of the development process: project planning, requirements gathering, design, implementation, testing, release and post-release support and maintenance. During the initial phases (planning and requirements), usability methods include observational studies, interviews, focus groups and user surveys. During the design and implementation phases, usability methods include participatory design, prototyping, desirability studies and card sorting activities. During the release and post-release phases, usability methods include usability benchmarking, online assessments, surveys and A/B testing [32]. Similarly, Hollan and his colleagues describe a framework of cyclical design that encompasses distributed cognition, usability testing, and design [33]. Indeed, usability assessments find their strength in the middle of the development cycle, so that developers can make refinements to the software (also known as "formative research") [31].
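
As a quick reference, the phase-to-method pairing described in the paragraph above can be restated as a simple lookup structure. This is only a restatement of the text, not a formal taxonomy; the function name and phase labels are illustrative.

# Usability methods by development phase, as described above.
METHODS_BY_PHASE = {
    "planning and requirements": [
        "observational studies", "interviews", "focus groups", "user surveys",
    ],
    "design and implementation": [
        "participatory design", "prototyping", "desirability studies", "card sorting",
    ],
    "release and post-release": [
        "usability benchmarking", "online assessments", "surveys", "A/B testing",
    ],
}

def methods_for(phase: str) -> list[str]:
    """Return the usability methods suggested in the text for a development phase."""
    return METHODS_BY_PHASE.get(phase.lower(), [])

print(methods_for("Design and implementation"))
# ['participatory design', 'prototyping', 'desirability studies', 'card sorting']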

List of Usability Methods

There are many different methods for achieving usability; a usability method is essentially any activity that involves users (expert, representative or actual) during different stages of the design and development process. Multiple studies have used qualitative analysis to optimize usability. The Nielsen Norman Group (expert consultants in usability evaluation) provides the following list of usability methods[32]:

  • Usability-lab studies: users test software in a lab with a researcher focusing on specific tasks or scenarios.
  • Ethnographic field studies: researchers observe users in their natural environment focusing on a particular task or software use.
  • Participatory design: users are asked to create designs from particular design elements.
  • Focus groups: small groups of users are led through a discussion about a given topic relating to a product or proposed product.
  • Interviews: one-on-one meetings with researcher and users to discuss a particular topic relating to a product or proposed product.
  • Eyetracking: users’ eye movements are captured while using software or performing tasks, using specially configured eye-tracking devices.
  • Usability Benchmarking: scripted usability studies in which users perform specific tasks that are scored against predetermined performance measures.
  • Moderated Remote Usability Studies: usability studies performed remotely using screen-sharing.
  • Unmoderated Remote Panel Studies: usability studies using trained users done remotely using recording and screen capture software.
  • Concept testing: a researcher presents users with an approximation or prototype of a new feature or product to determine if it meets the users’ needs. Can be done in person or online.
  • Diary/Camera Studies: users record aspects of their lives relevant to the product. Typically longitudinal.
  • Customer feedback: information provided by the user, usually done online through a link, form or email.
  • Desirability studies: users are presented with design alternatives and asked to associate each with a set of attributes (provided from a given list).
  • Card sorting: users are asked to organize items into groups and categorize them. Helps assess users’ mental models of a system.
  • Clickstream analysis: user’s clicks and navigation are recorded and analyzed.
  • A/B testing: a formal method of testing different designs by randomly assigning users to evaluate a particular design and measuring effects on user behavior/performance (a minimal comparison sketch follows this list).
  • Unmoderated UX studies: automated method that uses a specialized software tool to capture user behaviors and attitudes. Usually done with specific scenarios using a prototype.
  • True-intent studies: users are asked what their goal or intention is when they visit a particular website or use a particular software application.
  • Intercept surveys: user surveys that are triggered by visiting a website or using a software application.
  • Email surveys: user surveys that are solicited by email.
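
Several of the methods above, notably usability benchmarking and A/B testing, ultimately compare a performance measure across users assigned to different designs. The sketch below shows one minimal way to make such a comparison, using a permutation test on task-completion times; the timing data and the choice of a permutation test are illustrative assumptions, not methods prescribed by the Nielsen Norman Group list.

import random
from statistics import mean

random.seed(0)

# Hypothetical task-completion times (seconds) for users randomly assigned to two designs.
design_a = [118, 95, 140, 102, 127, 110, 133, 98]
design_b = [92, 88, 105, 79, 96, 101, 84, 90]

observed_diff = mean(design_a) - mean(design_b)

# Permutation test: how often does random relabeling of users produce a difference this large?
pooled = design_a + design_b
n_a = len(design_a)
more_extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[:n_a]) - mean(pooled[n_a:])
    if abs(diff) >= abs(observed_diff):
        more_extreme += 1

p_value = more_extreme / trials
print(f"Design A mean: {mean(design_a):.1f} s, Design B mean: {mean(design_b):.1f} s")
print(f"Observed difference: {observed_diff:.1f} s, permutation p-value: {p_value:.3f}")

A small p-value suggests the difference in completion times between the two designs is unlikely to be due to chance alone.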

Additional Methods

  • Heuristic evaluation: usability or domain experts evaluate a software interface based on a provided list of heuristics. [34]
  • Discount usability testing: using a small number of users (3-5) for usability lab studies or heuristic evaluation. Since results are qualitative and used for improving interface design, the focus is on finding usability errors rather than conclusively determining usability [35] (a worked sketch of the underlying problem-discovery model follows this list).
  • Workflow Analysis: observing and measuring the work processes of users. Ideally, it is done as part of the project planning phase as well as the post-release phase to ensure that the project matches the users’ workflow.
  • Risk Assessment: determination of the amount of risk (quantitative or qualitative) a particular feature of a product raises [13].
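
The case for discount usability testing with only 3-5 users is often made with Nielsen and Landauer's problem-discovery model, in which the proportion of usability problems found by n users is 1 - (1 - L)^n, where L is the average probability that a single user exposes a given problem (Nielsen reports roughly 31% across projects). The short sketch below works through that curve; the model and the 31% figure come from Nielsen's published work rather than from this article, and actual discovery rates vary by product, task and user group.

# Problem-discovery model cited to justify small-sample ("discount") usability tests:
# proportion of problems found by n users = 1 - (1 - L)**n,
# where L is the average per-user discovery rate (Nielsen reports roughly 0.31).
DISCOVERY_RATE = 0.31

for n_users in range(1, 9):
    found = 1 - (1 - DISCOVERY_RATE) ** n_users
    print(f"{n_users} user(s): ~{found:.0%} of problems found")

Under these assumptions, five users already surface roughly 85% of the problems, which is why small iterative tests are favored over a single large study.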

Related papers

  • Using qualitative studies to improve the usability of an EMR
  • Toward successful migration to computerized physician order entry for chemotherapy
  • Are three methods better than one? A comparative assessment of usability evaluation methods in an EHR
  • Integrating computerized clinical decision support systems into clinical work: A meta-synthesis of qualitative research
  • Detection and characterization of usability problems in structured data entry interfaces in dentistry
  • The Cognitive Complexity of a Provider Order Entry Interface

References

  1. Murff HJ, Kannry J. "Physician Satisfaction with Two Order Entry Systems". J Am Med Inform Assoc. 2001; 8: 499-509. http://www.ncbi.nlm.nih.gov/pubmed/11522770.
  2. ISO/IEC, 9241-11 Ergonomic requirements for office work with visual display terminals (VDT)s - Part 11 Guidance on usability. 1998: ISO/IEC 9241-11: 1998 (E).
  3. Raskin, Jef. The Humane Interface: New Directions for Designing Interactive Systems. Addison-Wesley, 2000.
  4. Nielsen, Jakob. Usability 101: Introduction to Usability. Nielsen Norman Group. http://www.nngroup.com/articles/usability-101-introduction-to-usability/, 1/4/2012.
  5. Zhang, Jiajie and Walji, Muhammed F. “TURF: Toward a unified framework of EHR usability”. Journal of Biomedical Informatics Vol. 44, 2011. pp. 1056-1067.
  6. Norman, Donald. “The Psychopathology of Everyday Things”. The Design of Everyday Things. New York, NY: Basic Books, 1988, Chap. 1.
  7. Nielsen, Jakob. “10 Usability Heuristics for User Interface Design”. Nielsen Norman Group. 1/1/1995. http://www.nngroup.com/articles/ten-usability-heuristics/.
  8. Research-Based Web Design and Usability Guidelines. US Department of Health and Human Services. http://www.usability.gov/pdfs/guidelines.html.
  9. Lidwell et al. Universal Principles of Design. Gloucester, Massachusetts: Rockport Publishers, 2003.
  10. Tognazzini, Bruce. “First Principles of Interaction Design (Revised and Expanded)”. AskTog, 3/5/2014. http://asktog.com/atc/principles-of-interaction-design/
  11. http://www.google.com/design/
  12. https://developer.apple.com/design/
  13. HIMSS EHR Usability Task Force. Defining and Testing EMR Usability: Principles and Proposed Methods of EMR Usability Evaluation and Rating. HIMSS. June, 2009.
  14. Ash, et al. "Some Unintended Consequences of Information Technology in Health Care: The Nature of Patient Care Information System-related Errors". Journal of the American Medical Informatics Association, Vol 11, No 2, March/April 2004.
  15. Koppel et al. “Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors”, JAMA, Vol. 293, No. 10, March 2005, pp. 1197-1203.
  16. Meeks DW, et al. "An analysis of electronic health record-related patient safety concerns". Journal of the American Medical Informatics Association, Vol 21, 2014.
  17. American College of Physicians. "Survey of Clinicians: User satisfaction with electronic health records has decreased since 2010". ACP. 3/5/2013. http://www.acponline.org/pressroom/ehrs_survey.htm.
  18. Pfister, Helen R. and Ingargiola, Susan R. "ONC: Staying Focused on EHR Usability". iHealthBeat. 2/20/14. http://www.ihealthbeat.org/insight/2014/onc-staying-focused-on-ehr-usability
  19. Robert L. Edsall and Kenneth G. Adler, MD, MMM. "The 2012 EHR User Satisfaction Survey: Responses From 3,088 Family Physicians". Fam Pract Manag. 2012 Nov-Dec;19(11):23-30.
  20. Institute of Medicine. “Health IT and Patient Safety: Building Safer Systems for Patient Care”. National Academies Press, Washington D.C., 2011.
  21. McDonnell C, Werner K, Wendel L. “Electronic Health Record Usability: Vendor Practices and Perspectives”. AHRQ Publication No. 09(10)-0091-3-EF. Rockville, MD: Agency for Healthcare Research and Quality. May 2010.
  22. Johnson, C. et al. "EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records". AHRQ Publication No. 11-0084-EF, August, 2011. https://www.rti.org/pubs/ehr_usability_toolkit_background_report.pdf
  23. Armijo, D. et al. "Electronic Health Record Usability: Interface Design Considerations". AHRQ Publication No. 09(10)-0091-2-EF, October 2009. http://healthit.ahrq.gov/sites/default/files/docs/citation/09-10-0091-2-EF.pdf
  24. American Medical Association. Improving Care: Priorities to Improve Electronic Health Record Usability, 2014.
  25. Middleton B, et al. "Enhancing patient safety and quality of care by improving the usability of electronic health record systems: recommendations from AMIA", Journal of the American Medical Informatics Association 2013; 20 e2-e8.
  26. Health and Human Services Department. Health Information Technology: Standards, Implementation Specifications, and Certification Criteria for Electronic Health Record Technology, 2014 edition; Revisions to the Permanent Certification Program for Health Information Technology. Federal Register, 9/4/2012. https://www.federalregister.gov/articles/2012/09/04/2012-20982/health-information-technology-standards-implementation-specifications-and-certification-criteria-for#h-50.
  27. Strategic Health IT Advanced Research Projects (SHARP). HealthIT.gov, 2013. http://www.healthit.gov/policy-researchers-implementers/strategic-health-it-advanced-research-projects-sharp.
  28. Belden, Jeff et al. Inspired EHRs: Designing for Clinicians. University of Missouri, 2014. http://inspiredehrs.org/.
  29. Turf-EHR Usability Toolkit. UT Health School of Biomedical Informatics, 2014. https://sbmi.uth.edu/nccd/turf/.
  30. Kushniruk AW, Patel VL. "Cognitive and usability engineering methods for the evaluation of clinical information systems". J Biomed Inform. 2004; 37:56-76.
  31. Kushniruk AW, Patel VL, Cimino JJ. "Usability testing in medical informatics: Cognitive approaches to evaluation of information systems and user interfaces". Proc AMIA Annu Fall Symp. 1997: 218–222. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2233486/.
  32. Rohrer, Christian. “When to use which user-experience research methods”. Nielsen Norman Group, 10/12/2014. http://www.nngroup.com/articles/which-ux-research-methods/
  33. Hollan J, Hutchins E, Kirsh D. "Distributed cognition: Toward a new foundation for human-computer interaction research". ACM Trans Comput Hum Interact. 2000; 7:174-96. http://dl.acm.org/citation.cfm?id=353487
  34. Nielsen, J., and Molich, R. (1990). "Heuristic evaluation of user interfaces", Proc. ACM CHI'90 Conf. (Seattle, WA, 1-5 April), 249-256.
  35. Nielsen, Jakob. “Guerrilla HCI: Using Discount Usability Engineering to Penetrate the Intimidation Barrier”. Nielsen Norman Group, 1/1/1994. http://www.nngroup.com/articles/guerrilla-hci/.

Submitted by Michelle Hribar