Clinfowiki - User contributions [en] (MediaWiki 1.22.4), feed retrieved 2024-03-29T15:02:17Z
3M Health Information Systems (revision 2015-12-22T04:02:51Z by RoniMV)
<hr />
<div><br />
3M Health Information Systems provides intelligent tools that help compile and use health information for better clinical and financial performance. Best known for its market-leading coding system and ICD-10 expertise, 3M Health Information Systems also delivers software and consulting services for clinical documentation improvement, computer-assisted coding, case mix and quality outcomes reporting, mobile physician solutions, and a robust healthcare data dictionary and terminology services to support the EHR.<br />
<br />
The web-based 3M Health System Performance Suite is built on 3M’s industry-leading risk stratification methodologies, including the 3M APR DRG Classification System and 3M Potentially Preventable Events (3M PPEs) software, which identifies hospital readmissions, complications, admissions, and other events that may be avoidable. The first modules of the new system are now available, offering easy-to-navigate interactive dashboards and powerful internal, state and federal data reporting tools.<br />
<br />
<br />
3M Health Information Systems was established in 1983. It now offers applications across many areas of health informatics, including the following:<br />
<br />
<br />
*Coding and reimbursement<br />
*ICD-10 solutions and services<br />
*Medical records abstracting<br />
*Patient care planning<br />
*Classification and grouping<br />
*Clinical documentation improvement<br />
*Coding and billing compliance<br />
*Consulting services<br />
*Health data interoperability<br />
*Medical dictation and transcription<br />
*Online medical record<br />
*Pay for performance <br />
*Healthcare revenue cycle management<br />
<br />
Additional products offered include:<br />
<br />
Population Health Management (http://solutions.3m.com/wps/portal/3M/en_US/Health-Information-Systems/HIS/Products-and-Services/Population-Health-Management/)<br />
<br />
Solutions for small hospitals (http://solutions.3m.com/wps/portal/3M/en_US/Health-Information-Systems/HIS/Products-and-Services/Solutions-for-Small-Hospitals/)<br />
<br />
Value based health care (http://solutions.3m.com/wps/portal/3M/en_US/Health-Information-Systems/HIS/Products-and-Services/Value-based-Healthcare/)<br />
<br />
<br />
== Reference ==<br />
http://solutions.3m.com/wps/portal/3M/en_US/Health-Information-Systems/HIS/</div>
Lorenzo patient record systems (revision 2015-12-22T02:41:25Z by RoniMV)
<hr />
<div>'''Lorenzo patient record systems''' was a health record system developed in the United Kingdom and launched in June 2010. It has since been dismantled and is widely regarded as one of the most expensive IT failures in healthcare.<br />
<br />
== Introduction ==<br />
<br />
Lorenzo officially went live on June 1, 2010 at its first trust, Morecambe Bay.<br />
<br />
Lorenzo is a bespoke Patient Health Record system developed by CSC for the National Health Service (NHS) in the UK. <br />
<br />
It has been tested against and meets the required healthcare IT standards (HL7, SNOMED Clinical Terms, the Patient Demographic Service, and the Dictionary of Medicines and Devices), supporting interoperability with other healthcare systems.<br />
Its open architecture enables access to data for managing and optimizing business processes, so that clinical information can be shared and care management achieved. Further, the medical standards, terminology, and codes that help clinicians make accurate, informed care decisions are based on real-time information.<br />
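The interoperability standards named above are concrete wire formats. HL7 v2 messages, for instance, are pipe-delimited segments with caret-delimited components. Below is a minimal sketch of reading a PID (patient identification) segment; the field positions follow the HL7 v2 PID layout, but the identifiers are invented and a production system would use a full parsing library rather than hand-rolled splitting:

```python
# Toy parser for an HL7 v2 PID (patient identification) segment.
# Fields are separated by "|", components within a field by "^".
# Sample identifiers below are made up for illustration.

def parse_pid(segment: str) -> dict:
    fields = segment.split("|")
    if fields[0] != "PID":
        raise ValueError("not a PID segment")
    name = fields[5].split("^")                 # PID-5: family^given
    return {
        "patient_id": fields[3].split("^")[0],  # PID-3: identifier list
        "family_name": name[0],
        "given_name": name[1] if len(name) > 1 else "",
        "birth_date": fields[7],                # PID-7: YYYYMMDD
        "sex": fields[8],                       # PID-8
    }

pid = parse_pid("PID|1||12345^^^Hosp^MR||Smith^John||19700101|M")
```

This only illustrates the delimiter structure the standard defines; real HL7 interfaces also handle escape sequences, repetitions, and message-level framing.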
<br />
==Developments==<br />
<br />
'''January 2015''': CSC received a follow-on contract to continue supporting the National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health. Valued at up to $90 million, the award is a 10-year Indefinite Delivery/Indefinite Quantity contract to support and conduct Phase 1 clinical trials of infectious disease therapeutics.<br />
<br />
== References ==<br />
<br />
# Health Service Journal. Lorenzo goes live at Morecambe Bay [Online]. 2010 [cited 7 Sep 2015]. Available from: http://www.hsj.co.uk/news/technology/lorenzo-goes-live-at-morecambe-bay/5015377.article#.UqR9_OImRPY <br />
<br />
[[Category: EHR]]</div>
Complementary methods of system usability evaluation: surveys and observations during software design and development cycles (revision 2015-11-19T15:53:40Z by RoniMV)
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
'''Complementary methods of system usability evaluation: surveys and observations during software design and development cycles.'''<br />
<br />
Horsky J, McColgan K, Pang JE, Melnikas AJ, Linder JA, Schnipper JL, Middleton B.<br />
<br />
''J Biomed Inform.'' 2010 Oct;43(5):782-90. doi: 10.1016/j.jbi.2010.05.010.<br />
<br />
http://www.ncbi.nlm.nih.gov/pubmed/?term=Complementary+methods+of+system+usability+evaluation%3A+surveys+and+observations+during+software+design+and+development+cycles.<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate that up to 40% of information systems are either abandoned or fail to meet business requirements, and the usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, and object to long training times and to excessive time spent completing data entry rather than attending to the patient.<br />
<br />
[[Usability|Usability]] of the system often has a direct relationship with error rates, clinical productivity, user fatigue, and satisfaction, all of which can affect user adoption.<ref name="Sittig"> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
#Email via embedded link - 18 clinicians using SmartForms, which is part of the outpatient clinical records system, had the option of sending email messages by clicking embedded links in the application to open a free-text window where they could type their comments. The comments were collected, stamped with date, time, and author, and logged.<br />
#Online survey - a link to an online survey was sent via email to 15 participants, with multiple-choice questions about their satisfaction, frequency of use, and problems, plus two open-ended questions.<br />
#Think-aloud study and observations - evaluations were conducted with 6 clinicians who were asked to complete a series of tasks while verbalizing their thoughts. Each subject's 30–45 minute session was recorded for video and audio with Morae usability software.<ref name="Morae"> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
#Walkthroughs, expert evaluations, and interviews - a team of health informatics professionals conducted usability assessments, walkthroughs, and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
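The first method above is essentially a small logging pipeline: each free-text comment is stamped with date, time, and author, then appended to a log. A minimal sketch of that record-keeping (the names and structure here are hypothetical, not the study's actual implementation):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class UsabilityComment:
    """One free-text comment, stamped on creation (hypothetical schema)."""
    author: str
    text: str
    stamped_at: datetime = field(default_factory=datetime.now)

log: list[UsabilityComment] = []

def submit_comment(author: str, text: str) -> None:
    # Stamp with date/time and author, then append to the log,
    # mirroring how the study's embedded-link emails were collected.
    log.append(UsabilityComment(author, text))

submit_comment("dr_a", "Problem list label is ambiguous")
```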
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were coded and formulated into 12 heuristic categories, including:<br />
* Consistency<br />
* Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
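Once statements have been assigned to heuristic categories, summarizing them reduces to a frequency count per category. A sketch using `collections.Counter` (the category labels come from the list above; the sample coded statements are invented):

```python
from collections import Counter

# Each collected statement has already been assigned one heuristic
# category by the evaluators; this sample data is invented.
coded = ["Terminology", "Fault", "Terminology", "Workflow", "Safety"]

by_category = Counter(coded)              # tally per category
most_common = by_category.most_common(1)[0]  # ("Terminology", 2)
```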
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that email was the most popular form of communication, but the varied, fragmented, and unstructured nature of the messages makes them hard to interpret. Comments in the three heuristic categories of Terminology, Fault, and Biomedical made up 80% of the emails. <br />
<br />
=== Findings ===<br />
The results showed 47 findings, which were classified into the three categories of Cognition, Customization, and Workflow.<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.<br />
<br />
The results overall suggest that no single method would comprehensively suit evaluation of all usability issues. Each method is optimally suited to evaluation at a different point in the design and deployment process of an EHR system.<br />
<br />
== Comments == <br />
<br />
Overall, however, each of the studies in this investigation had a very small sample size. <br />
While the description of the methods could have been presented more clearly, the tables and particularly the graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Are three methods better than one]]<br />
*[[Clinical Information Systems from Software Development Perspective]]<br />
<br />
== References == <br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category:Evaluation]]<br />
[[Category: EHR]]<br />
[[Category: HI5313-2015-FALL]]</div>
A Review of Emerging Technologies for the Management of Diabetes Mellitus (revision 2015-11-19T03:38:30Z by RoniMV)
<hr />
<div>The following is a review of the Zarkogianni et al. (2015) article on emerging technologies used for the management of Diabetes Mellitus: <ref name = "Zarkogianni et al. 2015"> Zarkogianni, K., Litsa, E., Mitsis, K., Wu, P., Kaddi, C., Cheng, C., ... & Nikita, K. (2015). A Review of Emerging Technologies for the Management of Diabetes Mellitus. http://www.ncbi.nlm.nih.gov/pubmed/26292334 </ref><br />
<br />
==Review 1==<br />
=== Introduction/Background ===<br />
Due to the rising cost of health care delivery across the United States for patients suffering from chronic diseases, there has been an increasing trend toward the prevention of such diseases. Zarkogianni et al. (2015) explore the use of the latest sensing technologies and [[CDS]] as a means to facilitate self-management by patients and to support decision making by physicians.<br />
<br />
=== Methods ===<br />
The review evaluated the following technologies:<br />
<br />
* Sensors for Glucose and lifestyle monitoring<br />
* [[Clinical Decision Support]] Systems (CDSS) for diabetes management <br />
* [[Predictive analytics|Predictive modeling]] using molecular data to assess the onset or progression of DM (Diabetes Mellitus)<br />
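As a toy illustration of the rule-based side of a diabetes CDSS, the sketch below classifies a fasting plasma glucose value against the standard ADA cutoffs (100 and 126 mg/dL). This is an illustrative sketch only, not clinical software; a real CDSS combines many inputs (HbA1c, history, medications) under clinical review:

```python
def classify_fasting_glucose(mg_dl: float) -> str:
    # ADA fasting plasma glucose cutoffs: >= 126 mg/dL diabetes range,
    # 100-125 mg/dL prediabetes range. Illustrative only.
    if mg_dl >= 126:
        return "diabetes range"
    if mg_dl >= 100:
        return "prediabetes range"
    return "normal range"

flag = classify_fasting_glucose(118)   # falls in the prediabetes range
```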
<br />
=== Results ===<br />
* Sensing technology is evolving from traditional invasive procedures toward non-invasive ones. While this shift is taking place, tests conducted with non-invasive methods are not yet as accurate as traditional invasive procedures, so they cannot yet be fully relied upon; their continued evolution, however, suggests such technologies are not far off. <br />
<br />
* CDSS is the giant setting the path for the rest of these technologies. Backed by evidence-based medicine and supported by a portion of the medical community, this tool can increase the rate at which physicians identify undiagnosed patients at risk of developing DM. It is therefore essential to maintain and optimize this technology so that its reliability and acceptance are not lost, since its adoption is still under way and it has not yet been established as a permanent tool in the clinical setting. <br />
<br />
* The use of molecular data is a new technology with solid scientific grounding, used to unveil correlations and patterns observed in the development of DM. <br />
<br />
=== Conclusion ===<br />
Zarkogianni et al. (2015) recognize the accelerating evolution of these technologies and note that their integration with other clinical systems, such as an [[ EMR| EHR]], can further optimize the information they gather to improve prevention not only of DM but of other chronic illnesses. However, they also note that, although adoption is ongoing, it has not been achieved in every health care setting; several clinical settings have not adopted these systems despite the US focus on achieving meaningful use. Moreover, they acknowledge that none of the explored technologies is close to perfect, and ongoing updates and maintenance are required to meet the needs of the ill.<br />
<br />
=== Comments === <br />
I would criticize the lack of any evaluation of m-Health, a fairly new and growing class of systems also being implemented to support tracking and self-management in patients suffering from chronic illnesses. Also, although a portion of the clinical population still refrains from using technologies such as those discussed, there is more than enough evidence of the financial benefits for physicians and the quality benefits for the service delivered to patients. For these reasons, full or partial adoption of these technologies will likely take place over the next decade, as the evidence illustrates better quality of health care across the different scopes of its delivery. <br />
<br />
==Review 2==<br />
===Introduction===<br />
High prevalence of Diabetes Mellitus (DM) along with the poor health outcomes and the escalated costs of treatment and care poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring along with Clinical Decision Support Systems (CDSS) facilitating self-disease management and supporting healthcare professionals in decision making.<br />
<br />
<br />
A critical literature review analysis is conducted focusing on advances in: <br />
*sensors for physiological and lifestyle monitoring <br />
*models and molecular biomarkers for predicting the onset and assessing the progress of DM <br />
*modeling and control methods for regulating glucose levels.<br />
<br />
===Results===<br />
Glucose and lifestyle sensing technologies are continuously evolving, with current research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering, and control approaches have been deployed for the development of CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks that take into account information from the behavioral down to the molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM.<br />
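As a toy example of the modeling approaches mentioned, the sketch below fits a least-squares line to a few recent continuous glucose monitor readings and extrapolates one sampling interval ahead. Real predictive models are far more sophisticated; the data and sampling interval here are invented:

```python
# Toy short-horizon glucose forecast: fit a least-squares line to the
# last few CGM readings (assumed 5-minute intervals) and extrapolate
# one step ahead. Illustrative only; not a clinical model.

def predict_next(readings: list[float]) -> float:
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * n      # value one interval ahead

pred = predict_next([100.0, 104.0, 108.0, 112.0])   # rising trend
```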
<br />
===Conclusion===<br />
Integration of data originating from sensor based systems and Electronic Health Records (EHR) combined with smart data analytical methods and powerful user centered approaches enable the shift toward preventive, predictive, personalized and participatory diabetes care.<br />
<br />
===Comments===<br />
The potential of sensing and predictive modeling approaches toward improving diabetes management is highlighted and the related challenges were identified. Helping patients with self-management is definitely needed with a widespread (and lifestyle dependent) disease such as diabetes.<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category:CDS]]<br />
[[Category:CDSS]]<br />
[[Category: HI5313-2015-FALL]]</div>
<hr />
<div>The following is a review of Zarkogianni et al. 2015 review regarding the emerging technologies used for the management of Diabetes Mellitus: <ref name = "Zarkogianni et al. 2015"> Zarkogianni, K., Litsa, E., Mitsis, K., Wu, P., Kaddi, C., Cheng, C., ... & Nikita, K. (2015). A Review of Emerging Technologies for the Management of Diabetes Mellitus. http://www.ncbi.nlm.nih.gov/pubmed/26292334 </ref><br />
<br />
==Review 1==<br />
=== Introduction/Background ===<br />
Due to the rise in cost of health care delivery across the United States in patients suffering from chronic diseases, there has been an increasing trend towards the prevention of such diseases since each day the number of patients which can afford a treatment for their disease constantly increases. Zarkogianni et al. 2015, explore the utilization of these new technologies as means to prevent the pitfalls of treatments in the population by using the latest sensoring technologies and [[CDS]] in order to facilitate self-managing in patients and support decision making in physicians.<br />
<br />
=== Methods ===<br />
The review evaluated the following technologies:<br />
<br />
* Sensors for Glucose and lifestyle monitoring<br />
* [[Clinical Decision Support]] Systems (CDSS) for diabetes management <br />
* [[Predictive models|Predictive modeling]] using molecular data to assess the onset or progression of DM (Diabetes Mellitus)<br />
<br />
=== Results ===<br />
* Sensoring technology is evolving from a traditional invasive procedure towards a non-invasive procedure. Although, this shift is taking place the reliability of such test conducted under non-invasive methods aren't as accurate as those in traditional invasive procedures thus total support can't be given to them, however their evolution is a fact and we are not far from experiencing such technologies. <br />
<br />
* CDSS turns out to be the giant whose setting the path for the rest of this technologies. Backed-up with evidence-based medicine and support from a portion of the medical community, this tool can indeed increase the rate of un-diagnosed patients at risk of developing DM by physicians thus it is a must to maintain and optimize this technology so that its reliability and acceptance isn't lost by the medical community since its adoption is currently undergoing and hasn't ended to establish it as a permanent tool around the clinical setting. <br />
<br />
* In regards to the use of molecular data, it is an new technology with solid scientific information used for the unveiling of correlations and patterns observed in the development of DM. <br />
<br />
=== Conclusion ===<br />
Zerkogianni et al. 2015, recognize the increase in the rate of the evolution of technologies and that their integration with other clinical systems in the health setting such as an [[ EMR| EHR]] can optimize even more the information gathered through them to provide a higher quality of prevention rates within the cluster of not only DM, but other chronic illnesses. However, they also established that although there is an ongoing current adoption this process hasn't been fully achieved across every single health care setting. There are several clinical settings in which this systems haven't been adopted despite the focus on achieving meaningful use around the US. Moreover, they also acknowledge that none of the explored technologies are close to be perfect and an ongoing update and maintenance is required to fulfill the needs of those ill.<br />
<br />
=== Comments === <br />
I would criticize that there wasn't any evaluation in regards to m-Health an ongoing and fairly new system been implemented as a medium to also support the tracking and self-managing in patients suffering from chronic illnesses. Also, although we still face a portion of the clinical population who restrains from using technologies such as the ones discussed there is more than enough evidence of the benefits financially for the physicians and in quality for the service delivered to the patients. Due to these reasons the shift and full or partial adoption of this technologies will eventually take place around the next decade being that evidence illustrates a better quality of health care in the different scopes of its delivery. <br />
<br />
==Review 2==<br />
===Introduction===<br />
High prevalence of Diabetes Mellitus (DM) along with the poor health outcomes and the escalated costs of treatment and care poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring along with Clinical Decision Support Systems (CDSS) facilitating self-disease management and supporting healthcare professionals in decision making.<br />
<br />
<br />
A critical literature review analysis is conducted focusing on advances in: <br />
*sensors for physiological and lifestyle monitoring <br />
*models and molecular biomarkers for predicting the onset and assessing the progress of DM <br />
*modeling and control methods for regulating glucose levels.<br />
<br />
===Results===<br />
Glucose and lifestyle sensing technologies are continuously evolving withcurrent research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering and control approaches have been deployed for the development of CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks taking into account information from behavioral down to molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM.<br />
<br />
===Conclusion===<br />
Integration of data originating from sensor based systems and Electronic Health Records (EHR) combined with smart data analytical methods and powerful user centered approaches enable the shift toward preventive, predictive, personalized and participatory diabetes care.<br />
<br />
===Comments===<br />
The potential of sensing and predictive modeling approaches toward improving diabetes management is highlighted and the related challenges were identified. Helping patients with self-management is definitely needed with a widespread (and lifestyle dependent) disease such as diabetes.<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category:CDS]]<br />
[[Category:CDSS]]<br />
[[Category: HI5313-2015-FALL]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/A_Review_of_Emerging_Technologies_for_the_Management_of_Diabetes_MellitusA Review of Emerging Technologies for the Management of Diabetes Mellitus2015-11-19T03:37:18Z<p>RoniMV: /* Methods */</p>
<hr />
<div>The following is a review of Zarkogianni et al. 2015 review regarding the emerging technologies used for the management of Diabetes Mellitus: <ref name = "Zarkogianni et al. 2015"> Zarkogianni, K., Litsa, E., Mitsis, K., Wu, P., Kaddi, C., Cheng, C., ... & Nikita, K. (2015). A Review of Emerging Technologies for the Management of Diabetes Mellitus. http://www.ncbi.nlm.nih.gov/pubmed/26292334 </ref><br />
<br />
==Review 1==<br />
=== Introduction/Background ===<br />
Due to the rise in cost of health care delivery across the United States in patients suffering from chronic diseases, there has been an increasing trend towards the prevention of such diseases since each day the number of patients which can afford a treatment for their disease constantly increases. Zarkogianni et al. 2015, explore the utilization of these new technologies as means to prevent the pitfalls of treatments in the population by using the latest sensoring technologies and [[CDS]] in order to facilitate self-managing in patients and support decision making in physicians.<br />
<br />
=== Methods ===<br />
The review evaluated the following technologies:<br />
<br />
* Sensors for Glucose and lifestyle monitoring<br />
* [[Clinical Decision Support]] Systems (CDSS) for diabetes management <br />
* [[ Predictive models|Predictive modeling]] using molecular data to assess the onset or progression of DM (Diabetes Mellitus)<br />
<br />
=== Results ===<br />
* Sensoring technology is evolving from a traditional invasive procedure towards a non-invasive procedure. Although, this shift is taking place the reliability of such test conducted under non-invasive methods aren't as accurate as those in traditional invasive procedures thus total support can't be given to them, however their evolution is a fact and we are not far from experiencing such technologies. <br />
<br />
* CDSS turns out to be the giant whose setting the path for the rest of this technologies. Backed-up with evidence-based medicine and support from a portion of the medical community, this tool can indeed increase the rate of un-diagnosed patients at risk of developing DM by physicians thus it is a must to maintain and optimize this technology so that its reliability and acceptance isn't lost by the medical community since its adoption is currently undergoing and hasn't ended to establish it as a permanent tool around the clinical setting. <br />
<br />
* In regards to the use of molecular data, it is an new technology with solid scientific information used for the unveiling of correlations and patterns observed in the development of DM. <br />
<br />
=== Conclusion ===<br />
Zerkogianni et al. 2015, recognize the increase in the rate of the evolution of technologies and that their integration with other clinical systems in the health setting such as an [[ EMR| EHR]] can optimize even more the information gathered through them to provide a higher quality of prevention rates within the cluster of not only DM, but other chronic illnesses. However, they also established that although there is an ongoing current adoption this process hasn't been fully achieved across every single health care setting. There are several clinical settings in which this systems haven't been adopted despite the focus on achieving meaningful use around the US. Moreover, they also acknowledge that none of the explored technologies are close to be perfect and an ongoing update and maintenance is required to fulfill the needs of those ill.<br />
<br />
=== Comments === <br />
I would criticize that there wasn't any evaluation in regards to m-Health an ongoing and fairly new system been implemented as a medium to also support the tracking and self-managing in patients suffering from chronic illnesses. Also, although we still face a portion of the clinical population who restrains from using technologies such as the ones discussed there is more than enough evidence of the benefits financially for the physicians and in quality for the service delivered to the patients. Due to these reasons the shift and full or partial adoption of this technologies will eventually take place around the next decade being that evidence illustrates a better quality of health care in the different scopes of its delivery. <br />
<br />
==Review 2==<br />
===Introduction===<br />
High prevalence of Diabetes Mellitus (DM) along with the poor health outcomes and the escalated costs of treatment and care poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring along with Clinical Decision Support Systems (CDSS) facilitating self-disease management and supporting healthcare professionals in decision making.<br />
<br />
<br />
A critical literature review analysis is conducted focusing on advances in: <br />
*sensors for physiological and lifestyle monitoring <br />
*models and molecular biomarkers for predicting the onset and assessing the progress of DM <br />
*modeling and control methods for regulating glucose levels.<br />
<br />
===Results===<br />
Glucose and lifestyle sensing technologies are continuously evolving withcurrent research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering and control approaches have been deployed for the development of CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks taking into account information from behavioral down to molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM.<br />
<br />
===Conclusion===<br />
Integration of data originating from sensor based systems and Electronic Health Records (EHR) combined with smart data analytical methods and powerful user centered approaches enable the shift toward preventive, predictive, personalized and participatory diabetes care.<br />
<br />
===Comments===<br />
The potential of sensing and predictive modeling approaches toward improving diabetes management is highlighted and the related challenges were identified. Helping patients with self-management is definitely needed with a widespread (and lifestyle dependent) disease such as diabetes.<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category:CDS]]<br />
[[Category:CDSS]]<br />
[[Category: HI5313-2015-FALL]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/A_Review_of_Emerging_Technologies_for_the_Management_of_Diabetes_MellitusA Review of Emerging Technologies for the Management of Diabetes Mellitus2015-11-19T03:33:14Z<p>RoniMV: /* Conclusion */</p>
<hr />
<div>The following is a review of Zarkogianni et al. 2015 review regarding the emerging technologies used for the management of Diabetes Mellitus: <ref name = "Zarkogianni et al. 2015"> Zarkogianni, K., Litsa, E., Mitsis, K., Wu, P., Kaddi, C., Cheng, C., ... & Nikita, K. (2015). A Review of Emerging Technologies for the Management of Diabetes Mellitus. http://www.ncbi.nlm.nih.gov/pubmed/26292334 </ref><br />
<br />
==Review 1==<br />
=== Introduction/Background ===<br />
Due to the rise in cost of health care delivery across the United States in patients suffering from chronic diseases, there has been an increasing trend towards the prevention of such diseases since each day the number of patients which can afford a treatment for their disease constantly increases. Zarkogianni et al. 2015, explore the utilization of these new technologies as means to prevent the pitfalls of treatments in the population by using the latest sensoring technologies and [[CDS]] in order to facilitate self-managing in patients and support decision making in physicians.<br />
<br />
=== Methods ===<br />
The review evaluated the following technologies:<br />
<br />
* Sensors for glucose and lifestyle monitoring<br />
* [[Clinical Decision Support]] Systems (CDSS) for diabetes management <br />
* Predictive modeling using molecular data to assess the onset or progression of DM (Diabetes Mellitus)<br />
<br />
=== Results ===<br />
* Sensing technology is evolving from traditional invasive procedures toward non-invasive ones. Although this shift is taking place, tests conducted with non-invasive methods are not yet as accurate as traditional invasive procedures, so they cannot yet be fully relied upon; their continued evolution, however, suggests that such technologies are not far off. <br />
<br />
* CDSS turns out to be the giant that is setting the path for the rest of these technologies. Backed by evidence-based medicine and support from a portion of the medical community, this tool can increase the rate at which physicians identify undiagnosed patients at risk of developing DM. It is therefore essential to maintain and optimize this technology so that its reliability and acceptance are not lost, since its adoption by the medical community is still under way and it has not yet been established as a permanent tool in the clinical setting. <br />
<br />
* In regards to the use of molecular data, it is a new technology, grounded in solid scientific information, used to unveil correlations and patterns observed in the development of DM. <br />
<br />
=== Conclusion ===<br />
Zarkogianni et al. (2015) recognize the accelerating evolution of these technologies and note that their integration with other clinical systems in the health setting, such as an [[ EMR| EHR]], can further optimize the information gathered through them and improve prevention not only of DM but of other chronic illnesses as well. However, they also note that, although adoption is under way, it has not been achieved across every health care setting: several clinical settings have yet to adopt these systems despite the focus on achieving meaningful use around the US. Moreover, they acknowledge that none of the explored technologies is close to perfect, and ongoing updates and maintenance are required to fulfill the needs of patients.<br />
<br />
=== Comments === <br />
I would note that there was no evaluation of m-Health, a fairly new and growing approach that is also being implemented to support tracking and self-management in patients with chronic illnesses. Also, although a portion of the clinical population still refrains from using technologies such as those discussed, there is more than enough evidence of their financial benefits for physicians and of the improved quality of service delivered to patients. For these reasons, full or partial adoption of these technologies will likely take place over the next decade, as the evidence illustrates better quality of health care across the different scopes of its delivery. <br />
<br />
==Review 2==<br />
===Introduction===<br />
The high prevalence of Diabetes Mellitus (DM), along with poor health outcomes and the escalating costs of treatment and care, poses the need to focus on prevention, early detection and improved management of the disease. The aim of this paper is to present and discuss the latest accomplishments in sensors for glucose and lifestyle monitoring, along with Clinical Decision Support Systems (CDSS) facilitating self-management of the disease and supporting healthcare professionals in decision making.<br />
<br />
<br />
A critical literature review is conducted, focusing on advances in: <br />
*sensors for physiological and lifestyle monitoring <br />
*models and molecular biomarkers for predicting the onset and assessing the progress of DM <br />
*modeling and control methods for regulating glucose levels.<br />
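The predictive-modeling strand above can be illustrated with a toy example. The sketch below fits a logistic-regression onset-risk model by gradient descent; the feature vectors (standardized glucose and BMI values) and labels are invented for illustration, and this is a minimal instance of the class of models reviewed, not the paper's method.<br />

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit weights/bias for P(onset) = sigmoid(w.x + b) by batch gradient descent."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted onset probability
            err = p - y                      # gradient of log-loss w.r.t. z
            for i, xi in enumerate(x):
                gw[i] += err * xi
            gb += err
        w = [wi - lr * gi / len(xs) for wi, gi in zip(w, gw)]
        b -= lr * gb / len(xs)
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Invented training rows: (standardized glucose, standardized BMI) -> onset label
xs = [(-1.0, -0.5), (-0.8, -1.2), (0.2, 0.1), (1.1, 0.9), (1.5, 1.8), (0.9, 1.1)]
ys = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(xs, ys)
print(predict(w, b, (1.2, 1.0)) > 0.5)   # high-risk profile classified positive
```

Real models of this kind would be trained on clinical and molecular features and validated prospectively; the point here is only the mechanics of a risk classifier.<br />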
<br />
===Results===<br />
Glucose and lifestyle sensing technologies are continuously evolving, with current research focusing on the development of noninvasive sensors for accurate glucose monitoring. A wide range of modeling, classification, clustering and control approaches have been deployed for the development of CDSS for diabetes management. Sophisticated multiscale, multilevel modeling frameworks taking into account information from the behavioral down to the molecular level are necessary to reveal correlations and patterns indicating the onset and evolution of DM.<br />
<br />
===Conclusion===<br />
Integration of data originating from sensor-based systems and Electronic Health Records (EHRs), combined with smart data analytical methods and powerful user-centered approaches, enables the shift toward preventive, predictive, personalized and participatory diabetes care.<br />
<br />
===Comments===<br />
The potential of sensing and predictive modeling approaches to improve diabetes management is highlighted, and the related challenges are identified. Helping patients with self-management is clearly needed for a widespread (and lifestyle-dependent) disease such as diabetes.<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category:CDS]]<br />
[[Category:CDSS]]<br />
[[Category: HI5313-2015-FALL]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Evaluating_the_Impact_of_Information_Technology_Tools_to_Support_the_Asthma_Medical_HomeEvaluating the Impact of Information Technology Tools to Support the Asthma Medical Home2015-11-19T03:26:30Z<p>RoniMV: /* Results */</p>
<hr />
<div>==Introduction==<br />
A [[Patient Centered Medical Home|Patient-Centered Medical Home (PCMH)]] is a model in which health care providers, family members, and social entities such as schools and places of worship work together to achieve the best possible health care outcomes for a patient. The aim of this study was to create a PCMH for pediatric asthma patients using information technology tools, and analyze the outcomes that resulted from this intervention. <br />
==Methods==<br />
• The study took place at four pediatric practices of the Ambulatory Care Network of New York-Presbyterian Hospital/Columbia University Medical Center.<br />
<br />
• The [[EMR]]s of the practices were modified to include a section which focused on asthma care. The section included the Asthma Control Test (ACT) and other fields for data such as severity of asthma.<br />
<br />
• The patients were followed from July 2009 to June 2013 to assess usage of emergency departments and inpatient services.<br />
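The ACT added to the EMR is a five-item questionnaire, each item scored 1–5 and summed to a 5–25 total. A minimal scoring helper might look like the following; the ≤19 follow-up cutoff is a commonly cited convention, not a detail reported in this study.<br />

```python
def act_score(responses):
    """Sum the five Asthma Control Test items (each scored 1-5; total 5-25)."""
    if len(responses) != 5 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("ACT expects exactly five responses, each scored 1-5")
    return sum(responses)

def needs_followup(responses, cutoff=19):
    """Flag scores at or below the cutoff (19 is a commonly cited threshold)."""
    return act_score(responses) <= cutoff

print(act_score([4, 5, 4, 4, 4]))       # 21
print(needs_followup([3, 3, 4, 3, 3]))  # True (score 16)
```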
<br />
==Results==<br />
Overall, the implementation resulted in a reduction of [[Emergency Department Setting|Emergency Department (ED)]] and inpatient admissions due to asthma. By the end of the study, ED visits were reduced by 17% and inpatient visits were reduced by 47%.<br />
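Percentages like these reduce to simple arithmetic on visit counts; the counts below are hypothetical, chosen only to reproduce the two reported figures.<br />

```python
def pct_reduction(before, after):
    """Percent reduction from a baseline count to a post-intervention count."""
    return 100 * (before - after) / before

# Hypothetical per-period visit counts chosen to reproduce the reported figures
print(round(pct_reduction(100, 83)))  # 17  (ED visits)
print(round(pct_reduction(100, 53)))  # 47  (inpatient visits)
```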
<br />
==Discussion==<br />
The authors state that the inclusion of IT tools in asthma management greatly improved workflow and patient care. One limitation of the study was the absence of a control group; therefore, it is difficult to assess whether the reductions in ED and inpatient visits were due to factors other than the intervention. <br />
==My comments==<br />
This study is a great example of how the implementation of IT tools can result in improved health outcomes. I was impressed with the level of collaboration between clinicians, families, and schools to better manage asthma cases.<ref name="Matiz et al. 2015"> Matiz, L.A., Robbins-Milne, L., Krause, M.C., Peretz, P.J., Rausch, J.C.(2015). Evaluating the Impact of Information Technology Tools to Support the Asthma Medical Home . Clinical Pediatrics. http://cpj.sagepub.com.ezproxyhost.library.tmc.edu/content/early/2015/07/17/0009922815596070.long. doi: 10.1177/0009922815596070</ref><br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category: Public Health]]<br />
[[Category: Technologies]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Identifying_Previously_Undetected_Harm:_Piloting_the_Institute_for_Healthcare_Improvement%27s_Global_Trigger_Tool_in_the_Veterans_Health_AdministrationIdentifying Previously Undetected Harm: Piloting the Institute for Healthcare Improvement's Global Trigger Tool in the Veterans Health Administration2015-11-19T03:08:39Z<p>RoniMV: /* Related Articles */</p>
<hr />
<div>This is a review of the 2015 article "Identifying Previously Undetected Harm: Piloting the Institute for Healthcare Improvement's Global Trigger Tool in the Veterans Health Administration" by Mull et al.<ref name="GTT"> http://www.ncbi.nlm.nih.gov/pubmed/?term=Identifying+Previously+Undetected+Harm%3A+Piloting+the+Institute+for+Healthcare+Improvement%E2%80%99s+Global Mull HJ, Brennan CW, Folkes T, Hermos J, Chan J, Rosen AK, Simon SR.Identifying Previously Undetected Harm: Piloting the Institute for Healthcare Improvement's Global Trigger Tool in the Veterans Health Administration. Qual Manag Health Care. 2015 Jul-Sep;24(3):140-6. doi: 10.1097/QMH.0000000000000060.</ref><br />
<br />
== Introduction ==<br />
Adverse Event (AE) detection is an essential component of organizational patient safety programs, but at the moment, most organizations use resource-intensive and unreliable methods to detect these errors, such as random chart reviews or voluntary incident reporting. The Institute for Healthcare Improvement (IHI) developed a set of algorithms for different AEs and has a specific protocol for confirmatory chart review (using the [[EHR]]) that is much less time- and resource-intensive than traditional random chart review protocols. So far, this process has been found to be better at determining "true positive" adverse events than many other methods. <br />
<br />
This study proposes to assess the effectiveness of the Institute for Healthcare Improvement’s [[Global trigger tool|Global Trigger Tool (GTT)]] in a VA facility by examining the overlap of AE detection between GTT and existing surveillance measures.<br />
<br />
== Methods ==<br />
<br />
The VA facility adapted the existing GTT methodology slightly, but roughly, "the IHI GTT process imposes a 20-minute time limit in which a trained reviewer, typically a nurse, records whether any of 52 triggers are evident. Trigger-flagged cases then undergo a second round of review by a physician to confirm the event and assign a rating of degree of harm using a validated harm scale. The second review may take as little as 2–5 minutes."<ref name="GTT"></ref> For this study, the population included medical/surgical hospitalizations at one large VA facility with a discharge between July 1 and October 27, 2012. <br />
<br />
<br />
Possible adverse event trigger examples include: blood or blood product; device or medical-surgical supply, including health information technology; fall; hospital-acquired infection (HAI); medication or other substance; pressure ulcer; surgery or anesthesia; and venous thromboembolism. <br />
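A first-stage trigger screen of this kind can be sketched as a keyword scan that flags records for the second-stage physician review. The trigger terms and records below are invented for illustration; the real IHI GTT defines 52 specific triggers applied by trained nurse reviewers.<br />

```python
# Hypothetical first-stage screen: flag records mentioning any trigger term
# for second-stage physician review. Illustrative only -- the real IHI GTT
# defines 52 specific triggers applied by trained nurse reviewers.
TRIGGERS = {"transfusion", "fall", "naloxone", "pressure ulcer", "readmission"}

def flag_for_review(record_text):
    """Return the sorted trigger terms found in a record's free text."""
    text = record_text.lower()
    return sorted(t for t in TRIGGERS if t in text)

records = {
    "A001": "Pt received transfusion after post-op bleed.",
    "A002": "Routine recovery, discharged day 2.",
    "A003": "Unwitnessed fall on ward; naloxone given overnight.",
}
flagged = {rid: hits for rid, txt in records.items() if (hits := flag_for_review(txt))}
print(flagged)  # {'A001': ['transfusion'], 'A003': ['fall', 'naloxone']}
```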
<br />
== Results ==<br />
109 AEs were identified using the GTT methodology.<br />
<br />
88% of the identified AEs were not detected by existing surveillance measures such as the VA Surgical Quality Improvement Program (VASQIP) or the Patient Safety Indicators (PSIs).<br />
<br />
60% of the identified AEs resulted in minor harm. <br />
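Overlap results of this kind come down to set arithmetic over event identifiers. The IDs below are made up, sized only so that the "new detections" share works out to 88% as in the study.<br />

```python
# Made-up adverse event IDs sized so the "new detections" share comes out
# to 88%, mirroring the structure (not the data) of the study's comparison.
gtt_events = {f"AE{i:03d}" for i in range(1, 26)}          # 25 GTT-detected AEs
other_surveillance = {"AE003", "AE010", "AE017", "X1", "X2"}

overlap = gtt_events & other_surveillance      # found by both systems
gtt_only = gtt_events - other_surveillance     # previously undetected harm
pct_new = 100 * len(gtt_only) / len(gtt_events)
print(len(overlap), len(gtt_only), round(pct_new))  # 3 22 88
```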
<br />
== Discussion ==<br />
<br />
This study corroborates studies done in the private sector showing that the GTT is helpful in identifying AEs that are often not detected by other methods (or detects them with greater efficiency and at lower cost than those other methods). The authors of this study hope that its success will influence the VA to take steps to implement the GTT widely, across all its institutions.<br />
<br />
== Comments ==<br />
This study is another example of the importance of using EHRs and the data they collect to further the improvement of patient safety in organizations. It is important to use HIT to its full potential and to do so in the most efficient and cost-effective ways.<br />
<br />
== Related Articles ==<br />
<br />
[[Electronic health record-based triggers to detect potential delays in cancer diagnosis]]<br />
<br />
[[Electronic health record-based surveillance of diagnostic errors in primary care]]<br />
<br />
[[Department of Veterans Affairs Initiatives]]<br />
<br />
[[Global trigger tool]]<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category:Reviews]]<br />
[[Category:EHR]]<br />
[[Category: HI5313-2015-FALL]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Using_a_medical_simulation_center_as_an_electronic_health_record_usability_laboratoryUsing a medical simulation center as an electronic health record usability laboratory2015-11-18T18:21:32Z<p>RoniMV: /* Conclusion */</p>
<hr />
<div>This is a review of the article titled "Using a medical simulation center as an electronic health record usability laboratory" by Landman et al.<br />
<br />
'''Using a medical simulation center as an electronic health record usability laboratory.'''<br />
<br />
Landman AB, Redden L, Neri P, Poole S, Horsky J, Raja AS, Pozner CN, Schiff G, Poon EG.<br />
<br />
''J Am Med Inform Assoc.'' 2014 May-Jun;21(3):558-63. doi: 10.1136/amiajnl-2013-002233. Epub 2013 Nov 18.<br />
<br />
http://www.ncbi.nlm.nih.gov/pubmed/24249778<br />
<br />
== Background ==<br />
<br />
Electronic Health Records [[EMR|(EHRs)]] are becoming commonplace in hospitals and health care facilities, and while for the most part they have brought improvements to healthcare delivery, there have also been some unintended consequences. Some healthcare providers have expressed concerns that these systems are not easy to use and often add to their workload. Usability evaluations conducted before, during, and after implementation of EHRs can improve the efficiency, safety and user satisfaction of these systems.<br />
<br />
Medical simulation centers have long been used for training medical staff, and medical device vendors already use simulation centers as part of their product development process. Standardizing clinical scenarios, simulating specific situations, and collecting usability metrics are difficult to do in a live clinical environment. Indeed, some research groups have already created HIT usability laboratories for this purpose, but these dedicated facilities require a large investment. <ref name="Haugen">Haugen H. Advantages of simulation training.How to improve EMR adoption. http://www.ncbi.nlm.nih.gov/pubmed/24249778</ref><br />
<br />
The authors wished to investigate whether it would be possible to use a Medical simulation center as a HIT usability laboratory by integrating the EHR software.<br />
<br />
== Methods ==<br />
<br />
As part of this investigation, the EHR software was set up in a medical simulation center and configured to reflect a hospital Emergency Department (ED), in order to understand how ED doctors would use electronic documentation.<br />
<br />
=== Planning===<br />
A multidisciplinary team with experience in medical informatics, medical simulation, medical education, emergency medicine and usability testing was assembled. A typical clinical scenario was created to study how ED clinicians would interact with the EHR system. A physician actor played the patient and the research staff helped run the scenario with one analyst facilitating the scenario, one taking notes, another playing the role of nurse interrupter and one setting up and monitoring the audio visual recording.<br />
<br />
===Physical space ===<br />
The physical space of the medical simulation center was configured to represent a typical ED examination room. Workstations on wheels ('''WoWs''') were used to enable participants to use and move computers at their discretion.<br />
<br />
=== EHRs, simulation scenario ===<br />
A custom-developed web-based ED EHR was integrated into the simulation center with collaboration of the IT department of the hospital to ensure the version being tested was the same as the current production system.<br />
<br />
All participants were resident physicians with substantial ED EHR experience, to ensure that their use of the system was being analyzed rather than their clinical knowledge or technology usage. Participants were given a short scripted scenario that required 20 minutes to perform.<br />
<br />
=== Data Collection ===<br />
Morae Recorder (TechSmith US) was installed on the WoWs to capture screen actions and interactions in real time. The Morae Observer software was installed on a workstation in the examination room to record key interactions with the EHR.<br />
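One metric a recording setup like this yields is time-on-task, computed from timestamped interaction events. The event log below is hypothetical and unrelated to the study's data; it only illustrates the derivation.<br />

```python
from datetime import datetime

# Hypothetical timestamped event log of the kind a screen/interaction
# recorder produces; used here to derive a simple time-on-task metric.
log = [
    ("2014-01-10T09:00:05", "open_chart"),
    ("2014-01-10T09:02:40", "start_note"),
    ("2014-01-10T09:08:10", "sign_note"),
]

def seconds_between(start, end):
    """Elapsed seconds between two ISO-like timestamps."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()

# Documentation time: from starting the note to signing it
print(seconds_between(log[1][0], log[2][0]))  # 330.0
```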
<br />
== Results ==<br />
All the residents successfully completed the simulation scenario and each session produced multifaceted qualitative and quantitative data on the participant’s workflow.<br />
<br />
Study team debriefings were held after each participant session. The data collected was compiled into cumulative lessons learned.<br />
<br />
== Conclusion == <br />
The authors successfully integrated the ED EHR system with the medical simulation center, thereby creating an EHR usability testing laboratory.<br />
<br />
They achieved this through detailed planning and strong partnerships with clinical educators, medical simulation experts and the information systems department. <br />
<br />
They demonstrated that an organization could, by working with the IT department and the simulation center, create an EHR simulation environment.<br />
<br />
== Comments ==<br />
<br />
This was a very interesting article that explored using existing medical simulation center infrastructure, of the kind found in many teaching hospitals, and integrating it with EHR software through collaboration with technical and clinical experts to create a HIT system usability testing laboratory. <br />
<br />
There are many usability testing techniques and methods, but a simulation laboratory might provide insights or a richness of observation in near-real settings not possible by other means. The findings of this study indicate the possibility of using existing infrastructure to build a usability laboratory, thus reducing the cost of investing in the development of a new laboratory environment.<br />
<br />
== Related Articles==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Workstation Assessment for EHR Implementation at a tertiary care center]]<br />
<br />
== References ==<br />
<references /><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>
<hr />
<div>This is a review of the article "Using a medical simulation center as an electronic health record usability laboratory" by Landman et al.<br />
<br />
'''Using a medical simulation center as an electronic health record usability laboratory.'''<br />
<br />
Landman AB, Redden L, Neri P, Poole S, Horsky J, Raja AS, Pozner CN, Schiff G, Poon EG.<br />
<br />
''J Am Med Inform Assoc.'' 2014 May-Jun;21(3):558-63. doi: 10.1136/amiajnl-2013-002233. Epub 2013 Nov 18.<br />
http://www.ncbi.nlm.nih.gov/pubmed/24249778<br />
<br />
== Background ==<br />
<br />
Electronic Health Records [[EMR|(EHRs)]] are becoming commonplace in hospitals and health care facilities, and while they have largely improved healthcare delivery, there have also been some unintended consequences. Some healthcare providers have expressed concerns that these systems are not easy to use and often add to their workload. Usability evaluations conducted before, during, and after implementation of EHRs can improve the efficiency, safety, and user satisfaction of these systems.<br />
<br />
Medical simulation centers have long been used for training medical staff, and medical device vendors already use simulation centers as part of their product development process. Standardizing clinical scenarios, simulating specific situations, and collecting usability metrics are difficult to do in a clinical environment. Indeed, some research groups have already created HIT usability laboratories for this purpose, but these dedicated facilities require significant investment. <ref name="Haugen">Haugen H. Advantages of simulation training: how to improve EMR adoption. http://www.ncbi.nlm.nih.gov/pubmed/24249778</ref><br />
<br />
The authors wished to investigate whether it would be possible to use a medical simulation center as a HIT usability laboratory by integrating the EHR software.<br />
<br />
== Methods ==<br />
<br />
As part of this investigation, the EHR software was set up in a medical simulation center configured to reflect a hospital Emergency Department (ED), to understand how ED doctors would use electronic documentation.<br />
<br />
=== Planning===<br />
A multidisciplinary team with experience in medical informatics, medical simulation, medical education, emergency medicine and usability testing was assembled. A typical clinical scenario was created to study how ED clinicians would interact with the EHR system. A physician actor playing the patient and the research staff helped run the scenario, with one analyst facilitating the scenario, one taking notes, another playing the role of a nurse interrupter, and one setting up and monitoring the audiovisual recording.<br />
<br />
===Physical space ===<br />
The physical space of the medical simulation center was configured to represent a typical ED examination room. Workstations on wheels (WoWs) were used to enable participants to use and move computers at their discretion. <br />
<br />
=== EHRs, simulation scenario ===<br />
A custom-developed, web-based ED EHR was integrated into the simulation center in collaboration with the hospital's IT department, to ensure the version being tested was the same as the current production system.<br />
<br />
All participants were resident physicians with substantial ED EHR experience, to ensure that their use of the system, rather than their clinical knowledge or general technology usage, was being analysed. Participants were given a short scripted scenario that required 20 minutes to perform.<br />
<br />
=== Data Collection ===<br />
Morae Recorder (TechSmith, USA) was installed on the WoWs to capture screen actions and interactions in real time. The Morae Observer software was installed on a workstation in the examination room to record key interactions with the EHR.<br />
<br />
== Results ==<br />
All the residents successfully completed the simulation scenario, and each session produced multifaceted qualitative and quantitative data on the participant’s workflow.<br />
<br />
Study team debriefings were held after each participant session. The data collected was compiled into cumulative lessons learned.<br />
<br />
== Conclusion == <br />
The authors successfully integrated the ED EHR system with the medical simulation center, thereby creating an EHR usability testing laboratory.<br />
<br />
They achieved this through meticulous planning and strong partnerships with clinical educators, medical simulation experts and the information systems department. <br />
<br />
They demonstrated that an organization, by working with its IT department and simulation center, can create an EHR simulation environment.<br />
<br />
== Comments ==<br />
<br />
This was a very interesting article that explored using existing medical simulation center infrastructure, of the kind found in many teaching hospitals, and integrating it with EHR software, through collaboration with technical and clinical experts, to create a HIT usability testing laboratory. <br />
<br />
There are many usability testing techniques and methods, but a simulation laboratory might provide insights, and a richness of observation in near-real settings, not possible by other means. The findings of this study indicate that existing infrastructure can be used to build a usability laboratory, reducing the cost of developing a new laboratory environment.<br />
<br />
== Related Articles==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Workstation Assessment for EHR Implementation at a tertiary care center]]<br />
<br />
== References ==<br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Workstation_Assessment_for_EHR_Implementation_at_a_tertiary_care_centerWorkstation Assessment for EHR Implementation at a tertiary care center2015-11-18T07:44:07Z<p>RoniMV: </p>
<hr />
<div>This decade has witnessed increased adoption of [[EMR|Clinical Information Systems]]. Adoption has been seen in large hospitals and small practices alike, with the aim of improving patient care, transitions of care, quality at the point of care and medical outcomes. Constant evaluation of these systems, especially the hardware and software used, is critical for improvement. <br />
<br />
The most prominent hardware systems used for [[CPOE]] implementation are desktop computers, carts on wheels (COWs) and wall-mounted units (WMUs). COWs have been proposed as particularly helpful for nurses entering data and for clinicians entering notes and reviewing clinical data at the patient bedside. We conducted a study of physicians, nurses and other healthcare personnel at a tertiary care center: we first assessed the utilization of the available desktops, WMUs and COWs every 15 minutes to gauge the popularity of each, and then surveyed physicians, nurses and other healthcare professionals to assess the reasons for any differences. <br />
<br />
The study showed that desktops were the most popular and most utilized of the three hardware devices, and COWs the least. When utilization was checked every 15 minutes for 3 days in a unit, desktop occupancy was 66.2%, versus 25.4% for wall-mounted units and 5.6% for carts on wheels. COWs were noted to be cumbersome, and their batteries were often discharged, preventing use. Interestingly, most healthcare personnel did not want to enter patient data at the bedside. Physicians and nurses preferred to enter their notes and orders while sitting, especially after long and stressful days, making desktop computers popular. Wireless carts also do not provide access to chairs or phones, which were identified as important requirements: easy access to phones was important for communicating with colleagues, referring physicians, consultants, labs and families, and the phones' proximity to the desktop units made those units the most popular. The WMUs were also less popular, mainly due to lack of access to seating and phones. <br />
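The spot-check sampling described above can be illustrated with a short calculation. The observation counts below are hypothetical, chosen only so that the resulting shares match the percentages reported in the study; the article does not give raw counts:<br />
<br />
```python
# Hypothetical spot-check tallies: every 15 minutes for 3 days, record which
# device type is in use. Counts are invented for illustration; only the
# resulting percentages mirror the figures reported in the study.
observations = {
    "desktop": 331,
    "wall_mounted": 127,
    "cart_on_wheels": 28,
    "none": 14,  # sampled intervals with no device in use
}

def occupancy_rates(obs):
    """Return each device's share of all sampled intervals, as a percentage."""
    total = sum(obs.values())
    return {device: round(100 * count / total, 1) for device, count in obs.items()}

for device, pct in sorted(occupancy_rates(observations).items(),
                          key=lambda kv: -kv[1]):
    print(f"{device}: {pct}%")
```
<br />
With these counts the sketch reproduces the reported occupancy of 66.2% for desktops, 25.4% for wall-mounted units and 5.6% for carts on wheels, with the remainder being intervals where no device was in use.<br />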
<br />
To make the wireless carts more useful and usable, it was found that attaching a small chair and a telephone, and allocating charging responsibility efficiently, could help. Desktop computers at the nursing stations were the most used of the three hardware devices, but physicians also felt that their availability in physician and discharge lounges would be of great benefit. <br />
<br />
EHRs have the potential to transform healthcare but their adoption needs to be adequately supported to get maximum value for an organization. Some of the key issues that may impede the successful implementation of inpatient EHRs should be identified and the proposed alternatives that can help overcome the challenges should be readily embraced. <br />
<br />
==Related Articles ==<br />
<br />
*[[Using a medical simulation center as an electronic health record usability laboratory]]<br />
<br />
[[Category: EHR]]</div>
<hr />
<div>Malware is software that gets installed on your computer by being bundled with other downloadable programs, through emails or file sharing, or by exploiting security holes in the system <ref>Runciman, B. (2011). Malware Response. ITNOW, 53(6), 34-36. http://itnow.oxfordjournals.org/content/53/6/34.short </ref>. The severity of a malware infection varies from simple advertisement pop-ups to the theft of important information, such as data and passwords, from your computer <ref>https://ist.mit.edu/security/malware </ref>.<br />
<br />
Types of malware: Spyware, Trojan horses, Viruses, Worms<br />
<br />
==References==<br />
<references/><br />
<br />
<br />
= Malware cont'd =<br />
<br />
Malware is a word derived from ''malicious'' and ''software''. <br />
It refers to any type of programming intended to cause harm. Malware exists in many forms; the most common are viruses, worms, spyware, and Trojan horses.<br />
<br />
The effects of a malware infection range from corrupted files, altered or deleted data, and disclosure of confidential data to disabled hardware, denial of access to legitimate users, and even hard drive crashes.<br />
The consequences of a malware infection can be devastating for an individual or organization, resulting in compromised systems, lost or stolen data, slowed systems, wasted resources, and loss of user and client confidence.<br />
Malware is often designed to send itself from the user’s email account to all contacts in their address book.<br />
<br />
== Types of Malware ==<br />
The main types of malware are:<br />
<br />
* '''Viruses''' are programs that self-replicate within computers and across networks, altering files or data. They usually require the user to open an executable file (in an e-mail attachment, for example), although some can execute as programming embedded in the e-mail message itself. <br />
<br />
* '''Worms''' are a virus variant that can infect a computer without any user interaction. A worm doesn't alter files; it resides in active memory and duplicates itself, slowing the system down. Worms use parts of an operating system that are automatic and usually invisible to the user.<br />
<br />
* '''Trojans''' are malicious code hidden within innocuous programming or data in such a way that it can take control and do its chosen form of damage, such as ruining the file allocation table on your hard disk. A Trojan horse may be widely redistributed along with a virus. <br />
<br />
* '''Spyware''' is programming that installs itself on your computer and secretly gathers information to relay to advertisers or other interested parties. Spyware can get into a computer as a software virus or as the result of installing a new program. Although often not malicious in intent, spyware is frequently installed without consent and even without the user's knowledge, sometimes as a result of clicking in a deceptive pop-up window. <br />
<br />
* '''Browser hijackers''' are programs that alter the computer's browser settings so that it redirects to Web sites the user had no intention of visiting. Most browser hijackers point default home pages and search pages at sites belonging to their customers, who pay for the traffic this generates. Poorly coded browser hijackers may also slow the computer down or cause browser crashes.<br />
<br />
=== Blended Threats === <br />
Blended threats combine characteristics of more than one type of malware to maximize the damage they cause and the speed of contagion. Although each type of malware has defining characteristics, the distinctions between them are becoming blurred because blended threats are becoming increasingly common. <br />
<br />
* '''Hybrid virus''' - a virus that combines characteristics of more than one virus type to infect both program files and system sectors. It may attack at either level and proceed to infect the other once it has established itself.<br />
<br />
* '''Hybrid virus/worm''' - malicious code that combines characteristics of both types of malware, typically pairing the virus's ability to alter program code with the worm's ability to reside in live memory and to propagate without any action on the part of the user.<br />
<br />
== How bad is the malware problem? ==<br />
<br />
2003 was the worst year to date for malware attacks, and indications are that the number and severity of attacks will only increase.<br />
Some statistics:<br />
* Code Red infected every vulnerable computer on the Internet within 14 hours; Slammer did the same in 20 minutes. An IM exploit could spread to half a million computers in just 30 seconds (Symantec Security Response)<br />
* In 2001, one in 300 e-mails contained a virus; for 2004, that number is predicted to be one in 100 (MessageLabs)<br />
* Attacks increased tenfold in the past ten years, from 1,334 reported attacks in 1993 to 137,529 in 2003 (CERT Coordination Center)<br />
* 20-40 new or variant virus threats were reported daily to TrendMicro in 2003<br />
* The number of attacks between January and June, 2003 exceeded 70,000 -- double those of the previous year (Reuters)<br />
* Ninety-two out of 300 randomly selected companies suffered a major (more than 25 computers affected) virus attack in 2003 (Computer Virus Prevalence Report)<br />
* Companies in the above survey reported that 11% of their computers were infected in any given month (Computer Virus Prevalence Report)<br />
* Spyware is responsible for about a third of all Windows application crashes (Scott Culp, Microsoft)<br />
* Viruses cost businesses around the world $55 billion in 2003, up from $13 billion in 2001 (TrendMicro)<br />
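The spread-rate contrast above (14 hours for Code Red versus 20 minutes for Slammer) is what a simple exponential-growth model of worm propagation predicts: shrinking the doubling time collapses the time to saturation. A toy sketch; the host count and doubling times below are illustrative assumptions, not figures from the sources cited above:

```python
import math

def time_to_infect(total_hosts: int, doubling_minutes: float, initial: int = 1) -> float:
    """Minutes for an idealized worm that doubles its infected population
    every `doubling_minutes` to reach `total_hosts` machines, starting
    from `initial` infections (pure exponential-growth toy model)."""
    return doubling_minutes * math.log2(total_hosts / initial)

# With an assumed population of ~360,000 vulnerable hosts, a worm doubling
# every ~45 minutes needs roughly 14 hours to reach them all, while one
# doubling every minute needs well under half an hour.
hours_slow = time_to_infect(360_000, 45.0) / 60
minutes_fast = time_to_infect(360_000, 1.0)
```

The model ignores network congestion and defensive responses, which is why real outbreaks eventually slow down; it only illustrates why a faster-replicating worm saturates so much sooner.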
<br />
== What are the future trends for malware distribution? ==<br />
Although most widely distributed malware of recent years has arrived via e-mail attachment, infected Web sites and program downloads are having an increasing impact. <br />
There are concerns that almost every Web site has serious vulnerabilities that give a hacker easy access. Because security isn't built into Web applications, an attacker can often break into a site by viewing a Web page's source, grabbing information from the commented code, and entering it into the address bar.<ref name="Brian"> Marshall Brain ‘How Computer Viruses Work’ http://computer.howstuffworks.com/virus.htm </ref><br />
<br />
== References ==<br />
<br />
<references/></div>
<hr />
<div>Software that gets installed on your computer by bundling with other downloadable programs, emails, files sharing or exploiting security holes in the system <ref>Runciman, B. (2011). Malware Response. ITNOW, 53(6), 34-36. http://itnow.oxfordjournals.org/content/53/6/34.short </ref>. The severity of malware infection varies from simple advertisement pop-ups to stealing of important information from your computer like data and passwords <ref>https://ist.mit.edu/security/malware </ref>.<br />
<br />
Types of malware: Spyware, Trojan horses, Viruses, Worms<br />
<br />
==References==<br />
<references/><br />
<br />
<br />
= Malware cont'd =<br />
<br />
Malware is a word derived from two words :malicious and software. <br />
It refers to any type of programming intended to cause harm. Malware can exist in many forms and the most common are viruses, worms, spyware and Trojan horses.<br />
<br />
The effects of a malware infection can range from corrupt files, altered or deleted data, disclosure of confidential data, disabling hardware, denial of legitimate user access and even hard drive crashes.<br />
The consequences of a malware infection can be devastating for the individual or organization and can result in compromised systems, lost or stolen data, slow down of systems, wasted resources and loss of users and client confidence.<br />
Often malware is designed to send itself from the user’s email account to all contacts in their address book.<br />
<br />
== Types of Malware ==<br />
The main types of malware are:<br />
<br />
• '''Viruses''' are programs that self -replicate within computers and across networks and alter files or data. While they usually require the user to action the executable file in an e-mail attachment for example, some can execute as embedded programming in the e-mail message itself. <br />
<br />
• '''Worms''' are a virus variant that can infect a computer without any user interaction. A worm doesn't alter files, but resides in active memory and duplicates itself thereby slowing the system down. Worms use parts of an operating system that are automatic and usually invisible to the user.<br />
<br />
• '''Trojans''' are malicious coding hidden in within innocuous programming or data in such a way that it can get control and do its chosen form of damage, such as ruining the file allocation table on your hard disk. A Trojan horse may be widely redistributed along with a virus. <br />
<br />
• '''Spyware''' is programming that installs onto your computer and secretly gathers information to relay to advertisers or other interested parties. Spyware can get in a computer as a software virus or as the result of installing a new program. Although not malicious in intent, spyware is often installed without consent and even without the user's knowledge sometimes as a result of clicking in a deceptive pop-up window. <br />
<br />
• '''Browser hijackers''' are programs that alter the computer's browser settings so that it redirects to Web sites the user had no intention of visiting. Most browser hijackers alter default home pages and search pages to those of their customers, who pay for that service because of the traffic it generates. Poorly coded browser hijackers may also slow down the computer or cause browser crashes.<br />
<br />
=== Blended Threats === <br />
Blended threats combine characteristics of more than one type of malware to maximize the damage they cause and the speed of contagion. Although each type of malware has defining characteristics, the distinctions between them are becoming blurred because blended threats are becoming increasingly common. <br />
<br />
* '''Hybrid virus''' - one that combines characteristics of more than one type of virus to infect both program files and system sectors. The virus may attack at either level and proceed to infect the other once it has established itself.<br />
<br />
* '''Hybrid virus/worm''' - malicious code that combines characteristics of both types of malware, typically pairing the virus's ability to alter program code with the worm's ability to reside in live memory and to propagate without any action on the part of the user.<br />
<br />
== How bad is the malware problem? ==<br />
<br />
2003 was the worst year to date for malware attacks, and indications are that the number and severity of attacks will only increase.<br />
Some statistics:<br />
* Code Red infected every vulnerable computer on the Internet within 14 hours; Slammer did the same in 20 minutes. An IM exploit could spread to half a million computers in just 30 seconds (Symantec Security Response)<br />
* In 2001, one in 300 e-mails contained a virus; for 2004, that number is predicted to be one in 100 (MessageLabs)<br />
* Reported attacks increased roughly a hundredfold in ten years, from 1,334 in 1993 to 137,529 in 2003 (CERT Coordination Center)<br />
* 20-40 new or variant virus threats were reported daily to TrendMicro in 2003<br />
* The number of attacks between January and June, 2003 exceeded 70,000 -- double those of the previous year (Reuters)<br />
* Ninety-two out of 300 randomly selected companies suffered a major (more than 25 computers affected) virus attack in 2003 (Computer Virus Prevalence Report)<br />
* Companies in the above survey reported that 11% of their computers were infected in any given month (Computer Virus Prevalence Report)<br />
* Spyware is responsible for about a third of all Windows application crashes (Scott Culp, Microsoft)<br />
* Viruses cost businesses around the world $55 billion in 2003, up from $13 billion in 2001 (TrendMicro)<br />
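As a quick check on the CERT figures above, the ratio of the 2003 count to the 1993 count can be computed directly; it works out to roughly a hundredfold increase:

```python
# CERT Coordination Center counts quoted above
attacks_1993 = 1334
attacks_2003 = 137529

growth = attacks_2003 / attacks_1993  # about 103
print(f"Reported attacks grew by a factor of {growth:.0f} between 1993 and 2003")
```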
<br />
== What are the future trends for malware distribution? ==<br />
Although most of the widely distributed malware of recent years has arrived via e-mail attachment, infected Web sites and program downloads are having an increasing impact. <br />
There are concerns that almost every Web site has serious vulnerabilities that allow a hacker easy access. Because security is often not built into Web applications, an attacker can sometimes compromise a site simply by viewing a Web page's source, grabbing information left in commented-out code, and entering it into the address bar.<ref name="Brian">Marshall Brain, 'How Computer Viruses Work', http://computer.howstuffworks.com/virus.htm</ref><br />
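To illustrate the kind of leakage described above, the sketch below scans a page's HTML source for comments, where developers sometimes leave credentials, internal URLs, or debug notes. It is a minimal defensive example; the page content is invented for illustration, not taken from the source:

```python
from html.parser import HTMLParser

class CommentScanner(HTMLParser):
    """Collect the text of every <!-- comment --> in an HTML document."""
    def __init__(self):
        super().__init__()
        self.comments = []

    def handle_comment(self, data):
        # Called once per HTML comment; keep the trimmed text for review.
        self.comments.append(data.strip())

# Hypothetical page source; in practice it would be fetched with urllib.request.
page = """<html><body>
<!-- TODO: remove before launch, admin console at /debug/admin -->
<p>Welcome</p>
<!-- db_user=app -->
</body></html>"""

scanner = CommentScanner()
scanner.feed(page)
for comment in scanner.comments:
    print(comment)
```

Running a scan like this against your own site is an easy way to catch commented-out secrets before an attacker does.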
<br />
== References ==<br />
<br />
<references/></div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/MalwareMalware2015-11-18T00:08:27Z<p>RoniMV: /* What are the future trends for malware distribution? */</p>
<hr />
<div>Malware is software that gets installed on your computer by bundling with other downloadable programs, e-mail, or file sharing, or by exploiting security holes in the system <ref>Runciman, B. (2011). Malware Response. ITNOW, 53(6), 34-36. http://itnow.oxfordjournals.org/content/53/6/34.short </ref>. The severity of a malware infection varies from simple advertisement pop-ups to the theft of important information from your computer, such as data and passwords <ref>https://ist.mit.edu/security/malware </ref>.<br />
<br />
Types of malware: Spyware, Trojan horses, Viruses, Worms<br />
<br />
==References==<br />
<references/><br />
<br />
<br />
= Malware cont'd =<br />
<br />
Malware is a word derived from two words: malicious and software. <br />
It refers to any type of programming intended to cause harm. Malware can exist in many forms; the most common are viruses, worms, spyware, and Trojan horses.<br />
<br />
The effects of a malware infection can range from corrupted files, altered or deleted data, and disclosure of confidential data to disabled hardware, denial of legitimate user access, and even hard drive crashes.<br />
The consequences of a malware infection can be devastating for an individual or organization: compromised systems, lost or stolen data, system slowdowns, wasted resources, and loss of user and client confidence.<br />
Often malware is designed to send itself from the user’s email account to all contacts in their address book.<br />
<br />
== Types of Malware ==<br />
<br />
* '''Viruses''' are programs that self-replicate within computers and across networks, altering files or data. While they usually require the user to run an executable file, for example an e-mail attachment, some can execute as programming embedded in the e-mail message itself. <br />
<br />
* '''Worms''' are a virus variant that can infect a computer without any user interaction. A worm doesn't alter files but resides in active memory and duplicates itself, thereby slowing the system down. Worms use parts of an operating system that are automatic and usually invisible to the user.<br />
<br />
* '''Trojans''' are malicious code hidden within innocuous programming or data in such a way that it can take control and do its chosen form of damage, such as ruining the file allocation table on your hard disk. A Trojan horse may be widely redistributed along with a virus. <br />
<br />
* '''Spyware''' is programming that installs itself onto your computer and secretly gathers information to relay to advertisers or other interested parties. Spyware can get into a computer as a software virus or as the result of installing a new program. Although not always malicious in intent, spyware is often installed without consent and even without the user's knowledge, sometimes as a result of clicking in a deceptive pop-up window. <br />
<br />
* '''Browser hijackers''' are programs that alter the computer's browser settings so that it redirects to Web sites the user had no intention of visiting. Most browser hijackers alter default home pages and search pages to those of their customers, who pay for that service because of the traffic it generates. Poorly coded browser hijackers may also slow down the computer or cause browser crashes.<br />
<br />
=== Blended Threats === <br />
Blended threats combine characteristics of more than one type of malware to maximize the damage they cause and the speed of contagion. Although each type of malware has defining characteristics, the distinctions between them are becoming blurred because blended threats are becoming increasingly common. <br />
<br />
* '''Hybrid virus''' - one that combines characteristics of more than one type of virus to infect both program files and system sectors. The virus may attack at either level and proceed to infect the other once it has established itself.<br />
<br />
* '''Hybrid virus/worm''' - malicious code that combines characteristics of both types of malware, typically pairing the virus's ability to alter program code with the worm's ability to reside in live memory and to propagate without any action on the part of the user.<br />
<br />
== How bad is the malware problem? ==<br />
<br />
2003 was the worst year to date for malware attacks, and indications are that the number and severity of attacks will only increase.<br />
Some statistics:<br />
* Code Red infected every vulnerable computer on the Internet within 14 hours; Slammer did the same in 20 minutes. An IM exploit could spread to half a million computers in just 30 seconds (Symantec Security Response)<br />
* In 2001, one in 300 e-mails contained a virus; for 2004, that number is predicted to be one in 100 (MessageLabs)<br />
* Reported attacks increased roughly a hundredfold in ten years, from 1,334 in 1993 to 137,529 in 2003 (CERT Coordination Center)<br />
* 20-40 new or variant virus threats were reported daily to TrendMicro in 2003<br />
* The number of attacks between January and June, 2003 exceeded 70,000 -- double those of the previous year (Reuters)<br />
* Ninety-two out of 300 randomly selected companies suffered a major (more than 25 computers affected) virus attack in 2003 (Computer Virus Prevalence Report)<br />
* Companies in the above survey reported that 11% of their computers were infected in any given month (Computer Virus Prevalence Report)<br />
* Spyware is responsible for about a third of all Windows application crashes (Scott Culp, Microsoft)<br />
* Viruses cost businesses around the world $55 billion in 2003, up from $13 billion in 2001 (TrendMicro)<br />
<br />
== What are the future trends for malware distribution? ==<br />
Although most of the widely distributed malware of recent years has arrived via e-mail attachment, infected Web sites and program downloads are having an increasing impact. <br />
There are concerns that almost every Web site has serious vulnerabilities that allow a hacker easy access. Because security is often not built into Web applications, an attacker can sometimes compromise a site simply by viewing a Web page's source, grabbing information left in commented-out code, and entering it into the address bar.<ref name="Brian">Marshall Brain, 'How Computer Viruses Work', http://computer.howstuffworks.com/virus.htm</ref><br />
<br />
== References ==<br />
<br />
<references/></div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:25:47Z<p>RoniMV: </p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
'''Complementary methods of system usability evaluation: surveys and observations during software design and development cycles.'''<br />
<br />
Horsky J, McColgan K, Pang JE, Melnikas AJ, Linder JA, Schnipper JL, Middleton B.<br />
<br />
''J Biomed Inform.'' 2010 Oct;43(5):782-90. doi: 10.1016/j.jbi.2010.05.010.<br />
<br />
http://www.ncbi.nlm.nih.gov/pubmed/?term=Complementary+methods+of+system+usability+evaluation%3A+surveys+and+observations+during+software+design+and+development+cycles.<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate that up to 40% of systems are either abandoned or fail to meet business requirements, and the usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, and they object to long training times and to excessive time spent completing data entry rather than with the patient.<br />
<br />
[[Usability|Usability]] of a system often has a direct relationship with error rates, clinical productivity, user fatigue, and satisfaction, all of which can affect user adoption.<ref name="Sittig">Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123.</ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
# Email via embedded link - 18 clinicians using SmartForms, which is part of the outpatient clinical records system, had the option of sending email messages by clicking embedded links in the application to open a free-text window where they could type their comments. The comments were collected, date-, time-, and author-stamped, and logged.<br />
# Online survey - a link to an online survey was sent via email to 15 participants, with multiple-choice questions about their satisfaction, frequency of use, and problems, plus two open-ended questions.<br />
# Think-aloud study and observations - evaluations were conducted with 6 clinicians, who were asked to complete a series of tasks while verbalizing their thoughts. The 30-45 minute sessions for each subject were recorded for video and audio with Morae usability software.<ref name="Morae">Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009.</ref><br />
# Walkthroughs, expert evaluations, and interviews - a team of health informatics professionals conducted usability assessments, walkthroughs, and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements about usability problems were collected, coded, and formulated into 12 heuristic categories, including:<br />
* Consistency<br />
* Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication, but their varied, fragmented, and unstructured nature makes them hard to interpret. Emails in the three heuristic categories of Terminology, Fault, and Biomedical made up 80% of the total. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization, and Workflow.<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.<br />
<br />
Overall, the results suggest that no single method would comprehensively cover all usability issues. Each method is optimally suited to evaluation at different points in the design and deployment process for an EHR system. <br />
== Comments == <br />
<br />
Overall, however, each of the studies in this investigation had very small sample sizes. <br />
While the description of the methods could have been presented more clearly, the tables and particularly the graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Are three methods better than one]]<br />
<br />
== References == <br />
<references/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Usability_Evaluation_of_a_Personal_Health_RecordUsability Evaluation of a Personal Health Record2015-11-12T04:20:06Z<p>RoniMV: </p>
<hr />
<div>This is a review of Segall et al 2011, Usability Evaluation of a Personal Health Record. <ref name="segall 2011">Segall, N, Saville, J, Engle, P, Carlson, B, Wright, M, Schulman, K, Tcheng, J. (2011). Usability Evaluation of a Personal Health Record. AMIA, 2011, 1233-1242. http://www-ncbi-nlm-nih-gov.ezproxyhost.library.tmc.edu/pmc/articles/PMC3243224/</ref><br />
<br />
----<br />
== Introduction ==<br />
----<br />
According to Segall et al, the main objective of this study is to evaluate the usability and functionality of HealthView, the [[PHR|electronic personal health record (PHR) system]] of Duke University Health System, using [http://en.wikipedia.org/wiki/User-centered_design/ ''human centered design''].<ref name="segall 2011"></ref> The article claims that use of a PHR would improve the quality of healthcare delivered to a patient; however, studies based on outcome measures of such systems have provided limited and mixed support for this claim. Therefore, the study uses HCD to evaluate the PHR system and to improve its usability and functionality based on the evaluation results.<br />
<br />
== Background ==<br />
----<br />
The article defines a PHR as an internet-based tool that contains an individual's lifelong health information. It is assumed that the system will provide cost-effective coordinated care, especially among chronically ill patients, by helping them to actively participate in their care decisions. Further, it explains that some of the main concerns in the use of PHRs are privacy, security, and poor interface usability. It also points out that the success of a PHR heavily depends on its usability, which encompasses learnability, ease of navigation, and intuitive use. As a consequence, the study used HCD to test those features. In addition, the study incorporated a "think aloud" protocol, in which participants articulate their experience while using the system under test.<br />
<br />
== Methods ==<br />
----<br />
According to the article, the study selected twenty participants with chronic cardiovascular disease and three additional subjects without chronic cardiovascular disease. Each participant performed nine tasks or scenarios in a random order and was asked to "think aloud," explaining their experience as they navigated through the system. Further, the participants were interviewed about usability problems they encountered, whether they would use the system in the future, or, if not, what features they would recommend for future versions of the system. Participants also completed a background survey, a usability survey eliciting their reaction to HealthView, and a survey gauging their interest in accessing different types of online health information.<br />
<br />
== Results ==<br />
----<br />
The article reported that participants who were previously unfamiliar with the system had a positive experience with it, and all subjects would consider opening a new account or recommending one to a friend. A majority of participants believed the system could improve their overall healthcare delivery. Further, the participants rated HealthView's usability 3.9 on a scale of 1 to 5 for consistency, clarity of messages, learnability, and information organization. On the other hand, participants reported difficulties with navigation, data entry, and medical terminology.<br />
<br />
== Discussion ==<br />
----<br />
Based upon the results of the study, the article made recommendations for improvement in areas such as navigation, consistency, efficiency, functionality, and error messages.<br />
<br />
== Conclusion ==<br />
----<br />
In conclusion, the article promoted the idea of using HCD methods to evaluate a system while it is still being developed. HCD methods make it possible to evaluate the system before the final work is done, a much better approach than evaluating a system at the end, when changes can be very costly. In addition, the article recommends involving users while developing a new system so that its usability improves.<br />
<br />
== Comments ==<br />
----<br />
Studies of the effect of PHRs on healthcare delivery are limited and mixed, and this study added needed information on the issue. In general, the article indicated that PHRs can improve healthcare delivery, although some improvements are needed in usability and functionality. Aside from that, unlike previous studies, which evaluate a system after the final product is done, this study used HCD methods to evaluate one that was still being developed. Additionally, it involved potential users in the evaluation process, which makes the evaluation efficient and cost-effective. <br />
<br />
== Related Articles ==<br />
<br />
*[[Complementary methods of system usability evaluation: surveys and observations during software design and development cycles]]<br />
<br />
== References ==<br />
<references/><br />
<br />
[[Category: Personal Health Record (PHR)]]<br />
[[Category: PHR]]<br />
[[Category: Reviews]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Are_three_methods_better_than_oneAre three methods better than one2015-11-12T04:16:17Z<p>RoniMV: /* Related Articles */</p>
<hr />
<div>This is a review of the article by Walji et al (2014) that studied the effectiveness of three methods to evaluate usability in EHRs.<br />
<br />
<br />
<br />
'''Are three methods better than one? A comparative assessment of usability evaluation methods in an EHR'''<br />
<br />
Muhammad F. Walji, Elsbeth Kalenderian, Mark Piotrowski, Duong Tran, Krishna K. Kookal, Oluwabunmi Tokede, Joel M. White, Ram Vaderhobli, Rachel Ramoni, Paul C. Stark, Nicole S. Kimmes, Maxim Lagerweij, Vimla L. Patel<br />
<br />
''International Journal of Medical Informatics''<br />
Volume 83, Issue 5, May 2014, Pages 361–367<br />
<br />
http://www.sciencedirect.com.ezproxyhost.library.tmc.edu/science/article/pii/S1386505614000239<br />
<br />
<br />
== Background ==<br />
Usability is often cited as a major barrier to the wider adoption of [[EMR|EHRs]]. Poor [[usability]] has been shown to reduce efficiency, decrease physician satisfaction, and potentially compromise patient safety. <ref name="Patel"> Patel et al 2008. Translational cognition for decision support in critical care environments: a review http://www.ncbi.nlm.nih.gov/pubmed/?term=translational+cognition+for+decision+support++v+l+patel</ref><br />
<br />
Selecting an appropriate method, or combination of methods, for effectively evaluating EHRs can be a challenge. The authors evaluated three different methods for their effectiveness in detecting usability issues in EHRs.<br />
<br />
== Methods ==<br />
The study was conducted with a mixture of third- and fourth-year students, residents, and faculty from two dental schools: Harvard School of Dental Medicine (HSDM) and the University of California San Francisco (UCSF). These schools were already using the dental EHR [[ Axium|axiUm]] and the dental terminology standard EZCodes. <ref name="Kalenderian"> Kalenderian et al 2011. The development of a dental diagnostic terminology http://www.ncbi.nlm.nih.gov/pubmed/21205730</ref> Participants from these schools conducted a usability evaluation of the EHR using three methods: user testing, interviews, and a survey.<br />
=== User Testing ===<br />
Over a three-day site visit, 32 end users from the dental schools performed a series of tasks, which had been developed in collaboration with dentists and researchers from the University of Texas Health Sciences Center (UTH). Participants were asked to think aloud while conducting the tasks, and Hierarchical Task Analysis (HTA) and the Keystroke-Level Model (KLM) were used to analyze the path and time of the tasks. <br />
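The Keystroke-Level Model mentioned above predicts expert task time by summing fixed operator times for each low-level action. A minimal sketch using the commonly cited operator times; the task sequence is hypothetical and not from the study:

```python
# Commonly cited KLM operator times (Card, Moran & Newell), in seconds.
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point to a target with the mouse
    "B": 0.10,  # mouse button press or release
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Sum the operator times for a sequence such as 'MPBHKKKK'."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical EHR sub-task: think, point to a field, click,
# move hands to the keyboard, type a four-character code.
task = "MPBH" + "K" * 4
print(round(klm_estimate(task), 2))  # 4.07 seconds
```

Such per-task estimates can then be compared with the observed completion times from the think-aloud sessions.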
<br />
=== Semi-structured Interview ===<br />
Researchers conducted 30-minute interviews with 36 participants to capture feedback about EHR use with regard to the EZCodes terminology, workflows, and interface. The data were analyzed using MS Excel.<br />
<br />
=== Survey Questionnaire ===<br />
After the site visit, 35 participants were asked to complete a questionnaire comprising 29 statements and four open-ended questions. The open-ended questions addressed the usability of the EHR's diagnosis and terminology functionality.<br />
<br />
=== Analysis ===<br />
The data for all the methods were analyzed and the problems categorized into three themes: EHR user interface, diagnostic terminology, and clinical work domain/workflow.<br />
<br />
The degree of overlap among the problem themes identified was also mapped across all three methods.<br />
<br />
== Results == <br />
A total of 187 problems were detected across the three methods: 54% by user testing, 28% by user interviews, and 18% by survey. <br />
<br />
Across the problem themes of user interface, terminology, and workflow, user testing identified 100%, 80%, and 67% of problems, respectively. For user interviews the corresponding figures were 80%, 60%, and 33%; for the survey questionnaire they were 30%, 40%, and 22%. <br />
<br />
The overlap analysis showed five themes common to all methods, with 12 overlapping themes between user testing and interviews. User testing and the survey had six overlapping themes, while the survey and interviews had five. <br />
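The overlap mapping described above amounts to set intersection over the themes each method identified. A minimal sketch with hypothetical theme labels, not the study's data:

```python
# Hypothetical problem-theme labels per method (placeholders only).
testing    = {"t1", "t2", "t3", "t4", "t5", "t6"}
interviews = {"t1", "t2", "t4", "t7"}
survey     = {"t1", "t3", "t5", "t8"}

# Pairwise overlap: themes found by both of two methods.
both_testing_interviews = testing & interviews

# Themes identified by all three methods.
all_three = testing & interviews & survey

# Themes unique to a single method.
only_survey = survey - testing - interviews

print(sorted(both_testing_interviews))  # ['t1', 't2', 't4']
print(sorted(all_three))                # ['t1']
print(sorted(only_survey))              # ['t8']
```

Counting the sizes of these intersections yields the kind of overlap figures reported in the study.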
<br />
== Conclusion ==<br />
The results showed that user testing with think aloud was the most effective method for identifying problems, followed by interviews and then the survey questionnaire. <br />
<br />
In addition, the overlap analysis showed that while all the methods were able to identify errors or problems when conducting usability studies of EHRs, a combination of methods would be better than any single method alone.<br />
<br />
== Comments == <br />
The study was an interesting look at the effectiveness of methods for studying EHRs and, given the generic nature of the methods used, its approach could be applied to other health information technologies beyond EHRs, such as Clinical Decision Support Systems (CDSS). <br />
The methods used were easy to understand, and the results were easy to interpret thanks to good tables, graphs, and diagrams.<br />
The sample group was not randomized, which the authors admit, but the groups were relatively evenly sized across the methods.<br />
A key element of a successful EHR system is the ability to evaluate it before, during, and after implementation to make sure users' concerns are addressed. It is important that end users are acknowledged to ensure usability.<br />
<br />
== Related Articles ==<br />
<br />
* [[ Using Human-Centered Design Theory for EHR's]]<br />
* [[ Usability]]<br />
* [[Dental informatics]]<br />
* [[Detection and characterization of usability problems in structured data entry interfaces in dentistry]]<br />
* [[Complementary methods of system usability evaluation: surveys and observations during software design and development cycles]]<br />
<br />
== References==<br />
<references/><br />
<br />
[[Category: Reviews]]<br />
[[Category:HI5313-2015-FALL]] <br />
[[Category: EHR]]<br />
[[Category: Usability]]<br />
[[Category: Dental]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:15:31Z<p>RoniMV: /* Background */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate that up to 40% of systems are either abandoned or fail to meet business requirements, and the usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, and they object to long training times and to excessive time spent on data entry rather than with the patient.<br />
<br />
[[Usability|Usability]] of the system often has a direct relationship with error rates, clinical productivity, user fatigue, and satisfaction, all of which can impact user adoption.<ref name="Sittig"> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
#Email via embedded link - 18 clinicians using SmartForms, which is part of the outpatient clinical records system, had the option of sending email messages by clicking embedded links in the application to open a free-text window where they could type their comments. The comments were collected, stamped with date, time, and author, and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants, with multiple-choice questions about their satisfaction, frequency of use, and problems, plus two open-ended questions.<br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians who were asked to complete a series of tasks while verbalizing their thoughts. The 30-45 minute sessions of each subject were recorded for video and audio with Morae usability software.<ref name="Morae"> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
# Walkthroughs, expert evaluations, and interviews - a team of health informatics professionals conducted usability assessments, walkthroughs, and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were coded and formulated into 12 heuristic categories, including:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
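The coding step described above, in which each collected statement is assigned to a heuristic category and the categories are tallied, can be sketched as follows; the statements and category assignments are hypothetical illustrations, not the study's data:

```python
from collections import Counter

# Hypothetical coded statements: (statement, assigned heuristic category).
coded_statements = [
    ("Button labels differ between screens", "Consistency"),
    ("Unclear what the diagnosis field expects", "Terminology"),
    ("Form resets after a timeout", "Fault"),
    ("Lab result units are not shown", "Biomedical"),
    ("Saving a note loses unsaved edits", "Fault"),
]

# Tally how many statements fall into each category.
counts = Counter(category for _, category in coded_statements)
for category, n in counts.most_common():
    print(category, n)
```

Tallies like these are what the article's tables and graphs summarize per method.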
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication, but their varied, fragmented, and unstructured nature makes them hard to interpret. Emails in the three heuristic categories of Terminology, Fault, and Biomedical made up 80% of the total. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into three categories: Cognition, Customization, and Workflow.<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.<br />
<br />
The results overall suggest that no single method would comprehensively suit the evaluation of all usability issues. Each method is optimally suited to evaluation at different points in the design and deployment process of an EHR system. <br />
== Comments == <br />
<br />
Overall, however, each of the studies in this investigation had very small sample sizes. <br />
While the description of the methods could have been presented more clearly, the tables and particularly the graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Are three methods better than one]]<br />
<br />
== References == <br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:14:04Z<p>RoniMV: </p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<ref name=’'Sittig''> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions.<br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software.<ref Name = ''Morae''> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
*[[Usability Evaluation of a Personal Health Record]]<br />
*[[Are three methods better than one]]<br />
<br />
== References == <br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:07:19Z<p>RoniMV: /* Methods */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<ref name=’'Sittig''> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions.<br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software.<ref Name = ''Morae''> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:06:01Z<p>RoniMV: /* Background */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of [[ EMR| Electronic Health Records (EHRs)]].<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<ref name=’'Sittig''> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions.<br />
<br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software.<ref Name = ''Morae''> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
<br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<References/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T04:03:21Z<p>RoniMV: </p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<ref name=’'Sittig''> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions.<br />
<br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software.<ref Name = ''Morae''> Morae. 3.1 ed., Okemos, MI: TechSmith Corporation; 2009. </ref><br />
<br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for the comments from clinicians and the findings from usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were coded and grouped into 12 heuristic categories:<br />
* Consistency<br />
* Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
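Coding statements into heuristic categories and tallying them for summary tables can be sketched like this; the statements and category assignments below are hypothetical examples, not data from the study:<br />

```python
from collections import Counter

# Hypothetical coded statements: (statement, assigned heuristic category)
coded_statements = [
    ("Button labels differ between screens", "Consistency"),
    ("Unclear what data the Save action stores", "Transparency"),
    ("Form discards entries on timeout", "Fault"),
    ("Preferred medication name is not in the list", "Terminology"),
    ("Form discards entries when switching tabs", "Fault"),
]

# Tally statements per category, as a results table would summarize them
tally = Counter(category for _, category in coded_statements)
ranked = tally.most_common()  # categories ordered by frequency
```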
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that email was the most popular form of communication, but the varied, fragmented and unstructured nature of the messages made them hard to interpret. Emails in the three heuristic categories of Terminology, Fault and Biomedical made up 80% of the total. <br />
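The 80% figure is a share of the total email volume; with hypothetical per-category counts (not the study's actual numbers), the calculation looks like:<br />

```python
# Hypothetical counts of email comments per heuristic category
email_counts = {
    "Terminology": 30,
    "Fault": 28,
    "Biomedical": 22,
    "Workflow": 12,
    "Speed": 8,
}

top_categories = ("Terminology", "Fault", "Biomedical")
# Fraction of all emails falling into the three dominant categories
share = sum(email_counts[c] for c in top_categories) / sum(email_counts.values())
print(f"{share:.0%}")  # with these illustrative counts: 80%
```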
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into three categories: Cognition, Customization and Workflow.<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.<br />
<br />
The results overall suggest that no single method can comprehensively evaluate all usability issues. Each method is optimally suited to a different point in the design and deployment process of an EHR system. <br />
== Comments == <br />
<br />
Overall, however, each of the studies in this investigation had a very small sample size. <br />
While the description of the methods could have been presented more clearly, the tables and particularly the graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<references/><br />
<br />
<br />
[[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<ref name=’'Sittig''> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. </ref><br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<br />
<References/><br />
<br />
[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T03:57:46Z<p>RoniMV: /* Background */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.< ref name =’Sittig’ > Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123. < /ref><br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<br />
<References/><br />
<br />
[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T03:56:21Z<p>RoniMV: /* Background */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<Ref name =’Sittig’> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123 < /Ref><br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<br />
<References/><br />
<br />
[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T03:55:36Z<p>RoniMV: </p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<Ref name =’Sittig’> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123 < /Ref><br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==<br />
<br />
== References == <br />
<br />
</References><br />
<br />
[Category:Reviews]]<br />
[[Category:Usability]]<br />
[[Category: EHR]]</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T03:53:40Z<p>RoniMV: /* Background */</p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<Ref name =’Sittig’> Sittig, D.F. and Stead, W.W. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994; 1: 108–123 < /Ref><br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed there were 47 findings, which were classified into the three categories of Cognition, Customization and Workflow<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design<br />
<br />
The results overall suggest that no one single method would comprehensibly suit evaluation of all usability issues. Each method is optimally suited to evaluation at different points for the design and deployment process for an EHR system <br />
== Comments == <br />
<br />
Overall however, each of the studies in this investigation work had very small sample sizes. <br />
While the description of the methods could have been presented in a clearer manner the tables and particularly graphs in this article are very informative. <br />
== Related Articles ==</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: surveys and observations during software design and development cycles2015-11-12T03:52:45Z<p>RoniMV: </p>
<hr />
<div>This is a review of an article titled Complementary methods of system usability evaluation: surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
<br />
== Background ==<br />
<br />
Studies estimate up to 40% of systems are either abandoned or fail to meet business requirements and usability of information systems has a significant impact on the adoption of Electronic Health Records (EHRs).<br />
<br />
Clinicians resist being forced to change established workflows, long training times and excessive time spent completing data entry rather than with the patient.<br />
<br />
Usability of the system often has a direct relationship with error rates, clinical productivity, user fatigue and satisfaction all of which can impact user adoption.<br />
<br />
The objective of this study was to compare data from four usability evaluation methods and assess how useful they were in the software development process of the SmartForms function of an EHR.<br />
<br />
<br />
== Methods ==<br />
<br />
Four different studies of usability and human-computer interaction were conducted with a total of 45 physicians from Partners Healthcare Practice to collect two types of data: '''comments''' from clinicians and '''findings''' derived from formal evaluation by usability experts.<br />
<br />
#Email via embedded link - 18 clinicians using the SmartForms, which is part of the outpatient clinical records system had the option of sending email messages by clicking embedded links in the application to open a free text window where they could type their comments. The comments were collected, date time and author stamped and logged.<br />
#Online Survey - a link to an online survey was sent via email to 15 participants with multiple-choice question about their satisfaction, frequency of use and problems with two open-ended questions. <br />
# Think aloud study and observations - evaluations were conducted with 6 clinicians where they were asked they were asked to complete a series of task while they verbalized their thoughts. The 30-45 minutes sessions of each subject were recorded for video and audio with Morae usability software. <br />
# Walkthroughs expert evaluations and interviews - a team of health informatics professional conducted usability assessments, walkthroughs and interviews with 6 primary care providers whose experience with the application ranged from novice to expert.<br />
<br />
== Results ==<br />
<br />
Analysis of the data was conducted separately for comments by clinicians and on findings by usability experts.<br />
<br />
=== Usability Assessment ===<br />
<br />
The 155 statements collected about usability problems were collected, codded and formulated into 12 heuristic categories:<br />
* Consistency<br />
*Transparency<br />
* Control<br />
* Context<br />
* Terminology<br />
* Biomedical<br />
* Safety<br />
* Customization<br />
* Fault<br />
* Speed<br />
* Workflow<br />
<br />
All the results of the various methods and studies were presented in a number of tables and graphs.<br />
<br />
=== Comments ===<br />
<br />
The results indicated that emails were the most popular form of communication but their variety, fragmented and unstructured nature makes them to hard to interpret. Emails for three heuristic categories of Terminology, Fault and Biomedical made up 80%. <br />
<br />
=== Findings ===<br />
The results showed 47 findings, which were classified into the three categories of Cognition, Customization, and Workflow.<br />
<br />
== Conclusion ==<br />
Comments from clinicians working with the software in real settings are more descriptive and provide information on technical and biomedical errors that observational studies do not often capture.<br />
<br />
Findings from expert evaluations focused on conceptual and interaction-related aspects of the application. The experts were also able to more readily capture positive and successful aspects of the design.<br />
<br />
The results overall suggest that no single method would comprehensively cover all usability issues. Each method is optimally suited to evaluation at a different point in the design and deployment process of an EHR system. <br />
== Comments == <br />
<br />
Overall, however, each of the studies in this investigation had a very small sample size. <br />
While the description of the methods could have been presented more clearly, the tables, and particularly the graphs, in this article are very informative. <br />
== Related Articles ==</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Complementary_methods_of_system_usability_evaluation:_Surveys_and_observations_during_software_design_and_development_cyclesComplementary methods of system usability evaluation: Surveys and observations during software design and development cycles2015-11-11T19:26:44Z<p>RoniMV: Created page with "This is an review of the article titled Complementary methods of system usability evaluation: Surveys and observations during software design and development cycles by Horsky ..."</p>
<hr />
<div>This is a review of the article titled Complementary methods of system usability evaluation: Surveys and observations during software design and development cycles by Horsky et al.<br />
<br />
<br />
'''Complementary methods of system usability evaluation: Surveys and observations during software design and development cycles.'''<br />
<br />
Jan Horsky, Kerry McColgan, Justine E. Pang, Andrea J. Melnikas, Jeffrey A. Linder, Jeffrey L. Schnipper, Blackford Middleton<br />
<br />
Journal of Biomedical Informatics Volume 43, Issue 5, October 2010, Pages 782–790</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Using_a_medical_simulation_center_as_an_electronic_health_record_usability_laboratoryUsing a medical simulation center as an electronic health record usability laboratory2015-11-11T19:21:06Z<p>RoniMV: Created page with "This is a review of the article titled Using a medical simulation center as an electronic health record usability laboratory by Landman et al '''Using a medical simulation ..."</p>
<hr />
<div>This is a review of the article titled Using a medical simulation center as an electronic health record usability laboratory by Landman et al<br />
<br />
<br />
<br />
'''Using a medical simulation center as an electronic health record usability laboratory.'''<br />
<br />
Landman AB, Redden L, Neri P, Poole S, Horsky J, Raja AS, Pozner CN, Schiff G, Poon EG.<br />
<br />
J Am Med Inform Assoc. 2014 May-Jun;21(3):558-63. doi: 10.1136/amiajnl-2013-002233. Epub 2013 Nov 18.</div>RoniMVhttp://www.clinfowiki.org/wiki/index.php/Comprehensive_Analysis_of_a_Medication_Dosing_Error_Related_to_CPOEComprehensive Analysis of a Medication Dosing Error Related to CPOE2015-11-10T18:00:37Z<p>RoniMV: </p>
<hr />
<div>This is a review of the article entitled “Comprehensive Analysis of a Medication Dosing Error Related to CPOE” by Jan Horsky <ref name="Horsky 2005"> Comprehensive Analysis of a Medication Dosing Error Related to CPOE. J Am Med Inform Assoc 2005;12:377–382. DOI 10.1197/jamia.M1740. http://dx.doi.org/10.1197/jamia.M1740 </ref>.<br />
<br />
==Introduction==<br />
New drugs that manage or relieve previously untreatable diseases have emerged from innovations in pharmacology research. These advances in drug therapy have led to an increased incidence of [[Adverse drug event]]s (ADEs) due to avoidable causes such as prescribing errors. <br />
[[Computerized physician order entry]] (CPOE) systems are known to substantially reduce the incidence of ADEs by ensuring the legibility of orders and by integrating clinical decision support ([[CDS]]), such as allergy checking.<br />
<br />
However, the positive effect of CPOE on prescribing safety can be compromised by the advent of new forms of errors. These errors are related to the intricacy of human-computer interaction and may be a consequence of poor user training or an inadequate understanding of how a CPOE application handles data. <br />
<br />
Understanding how users perceived the situation at crucial stages of an incident that occurred during the use of CPOE is extremely beneficial to the process of characterizing cognitively based errors.<br />
In this article, the case of a serious medication error that occurred at a large academic medical institution is described and a synopsis of how the error was analyzed is discussed. The authors hope that characterization of the entire process of the error will provide key insight and recommendations for improving CPOE systems and clinical ordering procedures.<br />
<br />
==Case Description==<br />
*An elderly man was admitted to a medical intensive care unit with septic shock and respiratory failure then transferred to a pulmonary service unit.<br />
*On a Saturday morning, Provider A diagnosed the patient as hypokalemic after observing a low serum potassium level in the setting of renal insufficiency.<br />
*Provider A decided to replete the patient’s potassium by ordering 40 mEq of KCl via an IV route over a period of 4 hours, as indicated by institutional guidelines.<br />
*After the order was entered, Provider A realized that the patient already had an IV fluid line and subsequently decided to provide the KCl as an additive to the currently running IV fluid.<br />
*Provider A then entered a new order for an infusion of 100 mEq of KCl in 1 liter of D5W solution at a rate of 75 mL/hr. <br />
*The order for 40 mEq of KCl by IV was supposed to be discontinued at this point, but Provider A mistakenly discontinued a similar order entered by another clinician two days earlier.<br />
*Provider A then received notification from the pharmacy department that the dose of 100 mEq of KCl in 1 liter of D5W was higher than the maximum allowed for the facility. <br />
*Provider A discontinued that order and wrote a new order for an 80 mEq/L KCl drip.<br />
*The new 80 mEq/L order was supposed to deliver 1 liter of fluid; however, it did not specify a stop time or a maximum volume of fluid to be delivered.<br />
*As a result, the fluid continued to be administered for 36 hours, and Provider A unintentionally caused the patient to receive a total of 256 mEq of KCl over that period.<br />
*On Sunday morning, there was a change in coverage, and Provider A asked Provider B to check the patient’s potassium level.<br />
*Provider B reviewed the patient’s most recent serum potassium, drawn on Saturday morning before the infusion of potassium began. The value was 3.1 mEq/L, which indicated that the patient was hypokalemic. Provider B did not realize that this result reflected the patient’s potassium status prior to the KCl repletion already under way.<br />
*Provider B then ordered 60 mEq of KCl by injection, to be given even while the previous potassium drip was still running.<br />
*Order entry logs revealed that another dose of 40 mEq of KCl by IV injection was also ordered by Provider B, but no clear evidence indicates that it was actually given.<br />
*The patient therefore received a total of 316 mEq of KCl over 42 hours.<br />
*On Monday, when the patient’s potassium level was checked, the patient was found to be dangerously hyperkalemic, with a serum potassium level of 7.8 mEq/L.<br />
*Once the errors were discovered, immediate measures were taken and the patient was treated.<br />
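The hazard of the open-ended drip order can be made concrete with a quick back-of-the-envelope calculation. The sketch below uses the figures from the 80 mEq/L order described above; it is illustrative only and does not attempt to reconstruct the exact cumulative totals reported in the case.<br />

```python
def kcl_delivered(concentration_meq_per_l, rate_ml_per_hr, hours):
    """Total KCl (mEq) delivered by a continuous drip over a given time."""
    liters_infused = rate_ml_per_hr * hours / 1000.0
    return concentration_meq_per_l * liters_infused

# The 80 mEq/L drip at 75 mL/hr was meant to stop after 1 liter (about 13 hours):
intended = kcl_delivered(80, 75, 1000 / 75)   # roughly 80 mEq, one liter's worth
# With no stop time or volume limit specified, it ran for 36 hours instead:
unchecked = kcl_delivered(80, 75, 36)         # roughly 216 mEq from this drip alone
```

Roughly 216 mEq from the unchecked drip versus about 80 mEq intended, which illustrates how an infusion order without a stop time or volume limit silently multiplies the delivered dose.<br />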
<br />
==Methods and Examples==<br />
The case was reviewed by the hospital Significant Event Committee and by experts in cognitive evaluation of information systems. The mission was to identify possible cognitive errors in the chain of actions that led to the [[medication error]] and to suggest changes to the system interface design and user training that would eliminate the chance of a similar event. Three methods were used to reconstruct the events that took place.<br />
===Analysis of Order Entry Logs===<br />
All medication orders for the patient over the three days of the incident were evaluated. This analysis showed that Provider A interacted with the order entry system on three occasions within a 2-hour period, and that Provider B interacted with the system on three separate occasions, manipulating four orders within the span of an hour. Inappropriate use of the CPOE application was also uncovered, such as the use of a free-text comment field to limit the total fluid volume to 1 liter.<br />
===Visual and Cognitive Evaluation of Ordering Screens===<br />
The data captured by the computer entry logs contained no information about which values were visible on the screen when the orders were entered. Six orders were identified as potentially erroneous, but the users’ motives for activating and discontinuing them were uncertain. Other inconsistencies in visual layout, screen control behavior, and ordering clarity were also examined.<br />
===Semi-structured Interviews with Clinicians===<br />
The purpose of these interviews was to integrate the collected data with personal observations and to discover how clinicians interpreted information available to them while using the order entry system. Also, any verbal exchanges with the patient and an explanation for the changes in the order were examined. <br />
<br />
==Results and Discussion==<br />
It was found that this medication error occurred as a result of several factors:<br />
*Misconceptions about the relation between intravenous volume and time duration<br />
*Sub-optimal display of IV bolus injection and medicated fluid drip orders<br />
*Confusion between the most recent and “dated” laboratory results<br />
*Lack of certain automated checking functions in the order entry system<br />
*Inadequate training of safe and efficient ordering practices<br />
<br />
===Specific Recommendations for System and Ordering Procedure Changes===<br />
The hospital’s Medication Safety and Informatics Committee made the following recommendations for changes:<br />
*Screens for ordering continuous IV fluid drips and drips of limited volume need to be clearly distinct so that the ordering of each is unambiguous.<br />
*Screens that list active medication orders also should list IV drip orders.<br />
*The laboratory results review screen needs to indicate clearly when the most recent results are not from the current day.<br />
*Add an alert that informs users ordering potassium (drip or bolus) when the patient already has another active order for potassium.<br />
*Add an alert informing users ordering potassium when no serum potassium value has been recorded in the past 12 hours or the most recent value is greater than 4.0 mEq/L. This would reduce the likelihood of ordering potassium when the patient is hyperkalemic.<br />
*Make other minor changes to increase the consistency of ordering screen behavior.<br />
*Training for the order entry application should not be limited to procedural knowledge but should emphasize conceptual understanding and safe entry strategies.<br />
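As a rough illustration, the two potassium alert recommendations above could be expressed as a simple rule check. The function name and data shapes below are hypothetical, meant only to show the logic, not the actual system’s implementation:<br />

```python
from datetime import datetime, timedelta

def potassium_order_alerts(active_orders, last_k_value, last_k_time, now):
    """Return alert messages that should fire before accepting a new potassium order.

    active_orders: names of the patient's currently active orders (hypothetical shape)
    last_k_value:  most recent serum potassium in mEq/L, or None if never measured
    last_k_time:   datetime of that result, or None
    """
    alerts = []
    # Recommended alert 1: another potassium order (drip or bolus) is already active.
    if any("potassium" in order.lower() or "kcl" in order.lower()
           for order in active_orders):
        alerts.append("Patient already has an active potassium order.")
    # Recommended alert 2: no serum potassium recorded in the past 12 hours,
    # or the most recent value is above 4.0 mEq/L.
    if last_k_time is None or now - last_k_time > timedelta(hours=12):
        alerts.append("No serum potassium result recorded in the past 12 hours.")
    elif last_k_value > 4.0:
        alerts.append("Most recent serum potassium exceeds 4.0 mEq/L.")
    return alerts
```

In the case described earlier, such a check would have fired twice for Provider B’s bolus order: once because a potassium drip was still active, and once because the most recent serum value was more than 12 hours old.<br />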
<br />
==Conclusion==<br />
This medication error resulted from failures in the interaction between human and system agents. The classes of errors described are likely to occur in similar systems at other institutions. Sophisticated information systems require comprehensive analyses of human errors so that design changes can accentuate the clarity of communicated information and employ useful safeguards against patient injury.<br />
<br />
==Comments==<br />
This article is very useful for understanding medication errors rooted in user cognition during order entry. Extensive research is needed to enhance visual displays, cognition-friendly functions, and decision support in health information technology systems.<br />
<br />
==References==<br />
<references/><br />
<br />
==Related Articles==<br />
[[Medication errors: prevention using information technology systems]]<br />
<br />
[[Category: CPOE]]<br />
[[Category: Medication Errors]]<br />
[[Category: Drug-drug interaction]]<br />
[[Category: Reviews]]<br />
[[Category: HI5313-2015-FALL]]<br />
[[Category: Medication Error]]</div>RoniMV