Complementary methods of system usability evaluation: surveys and observations during software design and development cycles

This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.


Background

Studies estimate that up to 40% of information systems are abandoned or fail to meet business requirements, and the usability of such systems has a significant impact on the adoption of Electronic Health Records (EHRs).

Clinicians resist being forced to change established workflows and object to long training times and to excessive time spent on data entry rather than with the patient.

Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue, and satisfaction, all of which can affect user adoption.[1]

The objective of this study was to compare data from four usability evaluation methods and to assess how useful each was in the software development process for the SmartForms function of an EHR.

Methods

Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners HealthCare practices to collect two types of data: comments from clinicians and findings derived from formal evaluation by usability experts.

  1. Email via embedded link - 18 clinicians using SmartForms, which is part of the outpatient clinical records system, had the option of sending email messages by clicking embedded links in the application to open a free-text window where they could type their comments. The comments were collected, stamped with date, time, and author, and logged (a minimal sketch of such logging follows this list).
  2. Online Survey - a link to an online survey was sent via email to 15 participants; it contained multiple-choice questions about their satisfaction, frequency of use, and problems, along with two open-ended questions.
  3. Think-aloud study and observations - evaluations were conducted with 6 clinicians who were asked to complete a series of tasks while verbalizing their thoughts. Each subject's 30-45 minute session was video- and audio-recorded with Morae usability software.[2]
  4. Walkthroughs, expert evaluations, and interviews - a team of health informatics professionals conducted usability assessments, walkthroughs, and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.
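
As a rough illustration of the comment-capture mechanism in item 1, the sketch below shows how an application might stamp and log free-text feedback with date, time, and author. This is a minimal sketch in Python, not the actual SmartForms implementation; the file name, function name, and field names are assumptions made for illustration.

  import json
  from datetime import datetime, timezone

  LOG_PATH = "usability_comments.jsonl"  # hypothetical log file

  def log_comment(author, comment, log_path=LOG_PATH):
      """Append a clinician's free-text comment to a log,
      stamped with date, time, and author."""
      entry = {
          "timestamp": datetime.now(timezone.utc).isoformat(),
          "author": author,
          "comment": comment,
      }
      with open(log_path, "a", encoding="utf-8") as log:
          log.write(json.dumps(entry) + "\n")
      return entry

  # Example: a clinician clicks the embedded link and submits a comment.
  log_comment("clinician_01", "The form truncates long medication names.")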

Results

Analysis of the data was conducted separately for comments by clinicians and for findings by usability experts.

Usability Assessment

The 155 statements about usability problems were collected, coded, and grouped into 12 heuristic categories:

  • Consistency
  • Transparency
  • Control
  • Context
  • Terminology
  • Biomedical
  • Safety
  • Customization
  • Fault
  • Speed
  • Workflow

All the results of the various methods and studies were presented in a number of tables and graphs.

Comments

The results indicated that emails were the most popular form of communication, but their varied, fragmented, and unstructured nature made them hard to interpret. Emails in the three heuristic categories of Terminology, Fault, and Biomedical made up 80% of the total.

Findings

The results showed 47 findings, which were classified into the three categories of Cognition, Customization, and Workflow.

Conclusion

Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies often do not capture.

Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.

The results overall suggest that no single method comprehensively covers all usability issues. Each method is optimally suited to evaluation at a different point in the design and deployment process of an EHR system.

Comments

Overall, however, each of the studies in this investigation had a very small sample size. While the description of the methods could have been presented more clearly, the tables and particularly the graphs in this article are very informative.

References

  1. Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123.
  2. Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009.