Video analysis

From Clinfowiki

Video analysis is a method for evaluating system performance that involves videotaping the interactions of a specified user population with a health information system (HIS) to determine whether the HIS's functions and components promote or inhibit more efficient or effective patient care.

Description

Video analysis can be used in multiple ways: users may describe what they are doing and why as they work through a series of simulated or real tasks, or they may simply be taped as they go about their daily work. Consultants then analyze transcripts (generated manually or by computer) and the actions observed on tape to identify the functional and problematic components of the HIS. Recording logistics may be customized for each facility based on the health unit's work-area configuration, user preferences, and financial resources. With this method, consultants not only hear about HIS issues but also see firsthand what does not work. And because consultants see problems directly, they need not ask users (who typically lack a software engineering background) how the system should be redesigned for better interactivity; they can draw on their own education and experience to envision solutions.

History

Video cameras have been used in the medical informatics field for more than a decade, and in other industries since the development of video technology. Kushniruk and Patel (1) in 1995 described the use of videotaping to gather data on medical system usability, and Kushniruk continues this work to date (2,3,4). Since then, the method has been used to analyze user interaction with a medication order entry system (5), for usability testing of medical devices by people with disabilities (6,7), and in other medical applications.

Principal Use

Video analysis is used primarily, though not exclusively, during design, beta testing, or early implementation of an HIS. It permits designers to determine how users' activity differs from designers' and programmers' vision and to modify aspects of the HIS to better fit users' preferred workflow. It can also be used after an HIS has been in use for some time to identify opportunities for product refinement in future versions.

Advantages

Video analysis permits HIS designers to see how the product is used in the field rather than relying on potentially incomplete or misleading verbal descriptions, and offers visual clarification of issues mentioned by users. It allows users to demonstrate use patterns and problems with the HIS instead of struggling to describe challenges to designers who may not be familiar with their language. Video analysis also allows designers to address HIS issues without taking users away from the health unit or creating an additional burden on users' time.

Shortcomings

Some individuals are uncomfortable being videotaped and may avoid interacting with the HIS during periods when they know their activities will be recorded. If only one camera is used, it may be difficult to see everything the user does and the computer monitor simultaneously. Video analysis also yields large volumes of data that may consume many hours of analysis; to reduce this burden, Saadawi and associates used a video screen capture system in lieu of video recording when medical students tested an automated tutoring system (8).
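The screen-capture approach above points toward automated event capture: rather than reviewing hours of tape, analysts can filter a timestamped log of interface events for suspicious patterns. The sketch below is illustrative only (it is not the system used by Saadawi and associates); the widget names and the "repeated clicks suggest confusion" heuristic are assumptions for the example.

```python
from dataclasses import dataclass, field
import time

@dataclass
class UIEvent:
    timestamp: float  # seconds since session start (or epoch time)
    widget: str       # hypothetical widget identifier, e.g. "save_button"
    action: str       # e.g. "click", "keypress"

@dataclass
class EventLog:
    events: list = field(default_factory=list)

    def record(self, widget, action, timestamp=None):
        """Append an event; default to wall-clock time if none is given."""
        ts = timestamp if timestamp is not None else time.time()
        self.events.append(UIEvent(ts, widget, action))

    def repeated_actions(self, window=5.0, threshold=3):
        """Flag widgets receiving >= threshold actions within `window` seconds,
        a crude proxy for user confusion or an unresponsive control."""
        flagged = set()
        for i, ev in enumerate(self.events):
            count = sum(1 for other in self.events[i:]
                        if other.widget == ev.widget
                        and 0 <= other.timestamp - ev.timestamp <= window)
            if count >= threshold:
                flagged.add(ev.widget)
        return flagged
```

A log like this can be reviewed in minutes and then cross-referenced against the corresponding stretches of video, so that only the flagged intervals need frame-by-frame attention.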

Examples in Informatics

Kuwata and colleagues (5) used video analysis to predict workflow changes and potential problems before release of a bar-code medication order entry system (MOES). They created a set of scenarios of MOES use in a large hospital that required physicians and nurses to give IV injections while using the system, and interviewed the participants after they completed the scenarios. The investigators used a software coding system to interrelate and correlate video sequences with audio recordings, and two researchers coded the data to identify potential usability issues and characterize workflow. The researchers found noteworthy changes in process flow (e.g., steps that had been performed in parallel without the MOES had to be completed sequentially when it was used) and MOES inefficiencies to be reworked prior to system release.
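When two researchers independently code the same video segments, as in the study above, it is common to check their agreement before trusting the codes; the source does not say how Kuwata and colleagues did this, but Cohen's kappa is a standard chance-corrected statistic for the purpose. A minimal sketch, assuming each rater assigns one categorical code per segment:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters' categorical codes over the same segments.

    observed: fraction of segments where the raters agree.
    expected: agreement expected by chance, from each rater's code frequencies.
    kappa = (observed - expected) / (1 - expected); 1.0 = perfect agreement,
    0.0 = no better than chance.
    """
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Segments on which the raters disagree are typically discussed and recoded to consensus, which is one reason video coding is labor-intensive.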

Lemke and associates (6,7) used video analysis to study medical devices' usability and accessibility for people with a wide range of disabilities and functional impairments. The Mobile Usability Lab tool integrates audio, video, and proprietary problem identification and analysis software to characterize challenges faced by people with disabilities who use medical instrumentation. An example of the video collected during user testing is available at http://www.rerc-ami.org/ami/projects/r/2/.

References

  1. Kushniruk AW, Patel VL. Cognitive computer-based video analysis: its application in assessing the usability of medical systems. Medinfo. 1995;8 Pt 2:1566-9.
  2. Kushniruk A, Patel V, Cimino JJ, Barrows RA. Cognitive evaluation of the user interface and vocabulary of an outpatient information system. Proceedings of the AMIA Fall Symposium. 1996:22-6.
  3. Kushniruk AW, Patel VL, Cimino JJ. Usability testing in medical informatics: cognitive approaches to evaluation of information systems and user interfaces. Proceedings of the AMIA Fall Symposium. 1997:218-22.
  4. Borycki E, Kushniruk A. Identifying and preventing technology-induced error using simulations: application of usability engineering techniques. Healthcare Quarterly. 2005;8 Spec No:99-105.
  5. Kuwata S, Kushniruk A, Borycki E, Watanabe H. Using simulation methods to analyze and predict changes in workflow and potential problems in the use of a bar-coding medication order entry system. AMIA 2006 Annual Proceedings:994.
  6. Lemke M, Winters J, Danturthi S, Campbell S, Follette Story M, Barr A, Rempel D. A mobile tool for accessibility and usability testing of medical instrumentation. Conference Proceedings of the IEEE, Engineering in Medicine and Biology Society. 2004;7:4920-3.
  7. Rehabilitation Engineering Research Center on Accessible Medical Instrumentation Web site. Accessed February 23, 2007 at http://www.rerc-ami.org/ami/projects/d/1/.
  8. Saadawi GM, Legowski E, Medvedeva O, Chavan G, Crowley RS. A method for automated detection of usability problems from client user interface events. AMIA 2005 Symposium Proceedings:654-8.