EHR Usability Test Report of Epitomax 10 - Drummond Group, LLC

Transcription

1138 Stone Creek Drive, Hummelstown, PA 17036  866.3.PSYTEC

EHR Usability Test Report of Epitomax 10.0
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

The information contained in this proposal is intended to be for the use of the individual or entity designated above. Any disclosure, reproduction, distribution or other use of this document by an individual or entity other than the intended recipient is prohibited.

Report Prepared By: PsyTech Solutions, Inc.
Anthony Kline
Date(s) of Usability Test: 8/6/18-8/9/18
Date of t

Table of Contents

I. Executive Summary
II. Introduction
III. Method
   A. Participants
   B. Study Design
   C. Objectives
   D. Procedures
   E. Test Location
   F. Test Environment
   G. Test Forms and Tools
   H. Participant Instructions
   I. Usability Metrics
IV. Data Scoring
V. Results
   A. Data Analysis and Reporting
   B. Discussion of the Findings
VI. Effectiveness
VII. Efficiency
VIII. Satisfaction
IX. Major Findings
X. Areas for Improvement
XI. Appendices
   A. Appendix 1
   B. Appendix 2: Test Session Agenda and Objectives
   C. Appendix 3: System Usability Scale Questionnaire
   D. Appendix 4: Task step by step guide
      Task 1 Demographics
      Task 2 Computerized Provider Order Entry
      Task 3 Problem List
      Task 4 Implantable Device
      Task 5 Clinical Decision Support
      Task 6 Clinical Information Reconciliation

I. Executive Summary

Usability testing of the PsyTech Solutions, Inc. EHR (Epitomax, version 10.0), an ambulatory EHR, was conducted virtually via web-conference between 8/6/18 and 8/9/18 by EMR Advocate, Inc., on behalf of PsyTech Solutions, Inc. The purpose of this testing was to test and validate several commonly performed tasks for usability within the newly revised 2015 Edition user interface and to provide evidence of usability for the EHR Under Test (EHRUT). During the usability test, 10 healthcare providers matching the target demographic criteria served as participants and used the EHRUT in simulated but representative tasks.

This study collected performance data on 6 objectives, corresponding to the 2015 Edition criteria going through certification, that would be realistic and representative of the kinds of activities a user might perform with this EHR. The tasks were force-ranked based on the risk of potential patient harm if not completed correctly, including:

Task 1: Opening a Patient's Chart and Entering and Editing Demographics Information
§170.315 (a)(5) Demographics
Step 1: Open the patient Test Patient's Chart
Step 2: Navigate to the demographics page, then click Edit
Step 3: Enter the patient's DOB as 02/14/1948
Step 4: Select English as the patient's Preferred Language
Step 5: Select Female as the patient's Gender
Step 6: Select Declined to specify as the patient's Sexual Orientation
Step 7: Select Female as the patient's Gender Identity
Step 8: Select Black or African American as the patient's Race
Step 9: Select Not Hispanic or Latino as the patient's Ethnicity
Step 10: Click Update
Step 11: Return to the demographics page and edit DOB and Gender Identity
Step 12: Click Update
Step 13: Observe the newly saved changes

Task 2: Ordering and Editing Labs
§170.315 (a)(2) CPOE – laboratory
Step 1: Navigate to the Episodes section of the flowsheet
Step 2: Navigate to the Orders tab
Step 3: Select the Test Lab (Fasting Blood Sugar)
Step 4: Select Date/Time
Step 5: Click Update
Step 6: Click Edit
Step 7: Alter Start Date/Time
Step 8: Click Update

Task 3: Entering and Editing the Problem List
§170.315 (a)(6) Problem list
Step 1: Navigate to the Diagnosis Tab
Step 2: Click the Add button to add a new Problem
Step 3: Search for F33.2 – Major Depressive Disorder
Step 4: Select Diagnosis for the problem Type, today's date, and Priority 1
Step 5: Click Update
Step 6: Select Edit
Step 7: Enter 3/24/2017 as the Diagnosis Date
Step 8: Click Update

Task 4: Adding an Implantable Device
§170.315 (a)(14) Implantable device list
Step 1: Navigate to the Implantable Device tab
Step 2: Click the Add button
Step 3: In the UDI field enter this device 1(21)1234
Step 4: Click the Retrieve Data button
Step 5: Review the returned information and click Update
Step 6: Click Edit
Step 7: Enter comments into the Comments Section
Step 8: Click Update

Task 5: Clinical Decision Support
§170.315 (a)(9) Clinical decision support
Step 1: Navigate to the Administration Tab and then to the Clinical Intervention Rules Maintenance Tab
Step 2: Click Add
Step 3: Create Age Min – 40, Age Max – 75
Step 4: In the Problem field enter Malignant Neoplasm of Colon NOS - %269461008
Step 5: Check the Rule is Enabled box for Psychologists
Step 6: Enter the Clinical Intervention message: Screen for colorectal cancer trial – Screen for possible inclusion
Step 7: Enter the following references:
   Bibliographic Citation: URLs for the Bibliographic Reference
   Funding Source: NIH
   Reference Date: 1/11/2017
   Author: The American Cancer Society medical and editorial content team

Step 8: Click Update
Step 9: Navigate to the Diagnosis and add Malignant Neoplasm of Colon and review the triggered alert
Step 10: Enter that you reviewed this alert on today's date, then click Mark as Reviewed
Step 11: Repeat rule creation and alert generation for Medications, Allergies, Vitals, Labs, and Problems

Task 6: Clinical Information Reconciliation
§170.315 (b)(2) Clinical information reconciliation and incorporation
Step 1: Navigate to the Clinical Summary tab of the patient's chart
Step 2: Locate the referral note (C-CDA) and click Incorporate
Step 3: Click Resolve the Allergies
   Observe the Allergies on the Imported List vs. the Current List
   Select the Allergies to move to the Current List from the Imported List
   Select the Allergies to delete from the Current List
   Select the Review Tab
   After confirmation, click the Update tab
Step 4: Click Resolve the Medications
   Observe the Medications on the Imported List vs. the Current List
   Select the Medications to move to the Current List from the Imported List
   Select the Medications to delete/discontinue from the Current List
   Select the Review Tab
   After confirmation, click the Update tab
Step 5: Click Resolve the Diagnosis
   Observe the Diagnoses on the Imported List vs. the Current List
   Select the Diagnoses to move to the Current List from the Imported List
   Select the Diagnoses to delete from the Current List
   Select the Review Tab
   After confirmation, click the Update tab
Step 6: Review the changes to the patient's chart
Step 7: Click the Generate C-CDA tab to generate a new C-CDA (with the new data)
Step 8: Observe the generation of Clinical Decision Support Rules for Medications, Medication Allergies, and Problems from the incorporated C-CDA data

During the one-on-one usability test, each participant was greeted by the administrator and given an overview of the test procedure.
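The list-resolution pattern repeated in Task 6 (observe the Imported List vs. the Current List, move selected imported entries across, delete selected current entries) can be sketched as follows. This is illustrative only; the function and entry names are assumptions, not Epitomax's actual API.

```python
# Hypothetical sketch of the Task 6 reconciliation pattern: merge an
# imported C-CDA list (e.g. allergies) into the current list, keeping
# the selected imported entries and dropping the deleted current ones.
def reconcile(current, imported, keep_imported, delete_current):
    """Return the reconciled list.

    current / imported -- lists of entry names
    keep_imported      -- imported entries to move to the Current List
    delete_current     -- current entries to remove
    """
    merged = [entry for entry in current if entry not in delete_current]
    for entry in keep_imported:
        # Only move entries that actually appear on the Imported List,
        # and avoid duplicating entries already on the Current List.
        if entry in imported and entry not in merged:
            merged.append(entry)
    return merged

current_allergies = ["Penicillin", "Latex"]
imported_allergies = ["Penicillin", "Sulfa"]
print(reconcile(current_allergies, imported_allergies,
                keep_imported=["Sulfa"], delete_current=["Latex"]))
# -> ['Penicillin', 'Sulfa']
```

The same resolve/review/update cycle applies unchanged to Medications and Diagnoses, which is why participants experienced it as one repeated workflow.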
All the participants had prior experience with the EHR; all were given the opportunity to see the new features for the 2015 Edition, where they could watch a trainer complete each of the objectives that were new features within the Epitomax EHR. The administrator introduced the test and instructed participants to complete a series of Tasks (given one at a time) using

the EHRUT. During the testing, the administrator timed the test and recorded user performance data electronically. The administrator did not give the participants any additional assistance in how to complete the objectives.

The following types of data were observed/collected for each participant:
   Number of objectives successfully completed within the allotted time without assistance
   Time to complete the objectives
   Number and types of errors
   Path deviations
   Participant's verbalizations
   Participant's satisfaction ratings of the system

All participant data was de-identified – no correlation can be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants verbally answered a post-test questionnaire. Participants were compensated for their time with a $25 gift card. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT.

Following is a summary of the performance data collected on the tasks. [The summary table did not survive transcription intact. Its columns were: Task, N, Task Success Mean (SD), Path Deviation (Observed/Optimal), Task Time Mean (SD), Errors Mean (SD), and Task Rating (5 = Easy). Recoverable values show 10 participants per task, 100% task success, 0% errors, task ratings of 5, and mean task times ranging from 0:15 to 1:30.]

The results from the System Usability Scale scored the subjective satisfaction with the system, based on the performance of these objectives, at 73.

In addition to the performance data, the following qualitative observations were made:

Major Findings
o There were no deviations from the optimal path to complete the Tasks. The efficiency of data entry will improve with use. All participants successfully completed the tasks and stated that the ease of use was a 5 out of 5.
o The likelihood of an error, and the potential to cause harm to a patient, is very low due to the logical workflow and intuitive design of the tasks.
o The most risk-prone area of the system, of the tasks performed, is clinical information reconciliation. This was mainly because the system requires so many clicks and visits to different parts of the EHR. Bringing in a C-CDA was the most foreign task for all participants; it is not currently a significant part of their daily routine to chart a patient encounter. The benefits of electronic incorporation of Medications, Problems, and Medication Allergies will need to be realized by the participants as they do it more often.
o All users voluntarily commented that they would like the final step for all data entry steps to be "Save" instead of the current button "Update", to make this EHR function consistent with almost every other software application they use. Overall, the EHR software was easy to use. Many of the users were surprised to learn about the Clinical Decision Support and Clinical Information Reconciliation functionalities. There was a lot of positive feedback on the new Lab Order feature, which would improve the process.
o One participant could see the CDS configuration function expanding to other areas of the patient process, including administrative areas such as insurance verification.

Areas for Improvement
o Change the "Update" button to "Save".
o Add a Favorites dropdown to the Diagnosis selection based on the Super Bill list.
o Add a Favorites dropdown to the Lab Orders based on the most commonly ordered labs for Behavioral Health.
o Reorder the Race and Ethnicity dropdowns to place the most often selected entries at the top of the list.

II. Introduction

The EHRUT tested for this study was Epitomax version 10.0, an ambulatory EHR designed to present medical information to healthcare providers in the Behavioral Health scope of practice. The EHRUT is cloud-based and can be accessed from anywhere with an internet connection and a web browser. The usability testing replicated realistic exercises and conditions.

The purpose of this study was to test and validate the usability of the current user interface and provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as time on objective, objective ratings, and path deviations, were captured

during the usability testing.

III. Method

A. Participants
A total of 10 participants were tested on the EHRUT. Participants in the test comprised a group of Clinical Social Workers, Medical Assistants, Physicians, and Mental Health Counselors. Participants were recruited by PsyTech Solutions, Inc. and were compensated for their time with a $25 gift certificate. The participants had no direct connection to the development of, or the organization producing, the EHRUT. Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received.

For the test purposes, end-user characteristics were identified and translated into a recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, computing experience, and user needs for assistive technology. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to individual identities.
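The de-identification step described above amounts to a simple one-way mapping from names to opaque IDs, kept separate from the performance data. A minimal sketch, with hypothetical names:

```python
# Minimal sketch of the de-identification step: participant names are
# replaced with sequential Participant IDs; only the IDs travel with
# the collected performance data, and the name-to-ID map is stored
# separately (or discarded once testing is complete).
def assign_ids(names):
    """Map each participant name to a sequential Participant ID."""
    return {name: f"P{i}" for i, name in enumerate(names, start=1)}

id_map = assign_ids(["Alice", "Bob"])  # hypothetical names
print(id_map["Bob"])  # -> P2
```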

[The participant characteristics table did not survive transcription intact; the rows below are reconstructed as far as the text allows, with unrecoverable cells noted. All participants reported computer experience with Email, Accessing an EHR, Research, Reading News, Shopping, Banking, Social Media, and a Word Processor (i.e. MS Word), except Participant 9, who reported Email, Accessing an EHR, Reading News, and Shopping. No participant required assistive technologies to use a computer.]

Participant 1: Female, 23-39, Some College; Medical Assistant, 10 years professional experience; 5 years product (EHR) experience.
Participant 2: Clinical Social Worker, 9 years professional experience; 3 years product experience (demographics and education not recoverable).
Participant 3: Clinical Social Worker, 8 years professional experience; 1 year product experience (demographics and education not recoverable).
Participant 4: College Graduate; Mental Health Counselor (Associate Degree), 12 years professional experience; 5 years product experience (demographics not recoverable).
Participant 5: Male, 40-59, College Graduate (Masters of Social Work); Mental Health Counselor, 3 years professional experience; 3 years product experience.
Participant 6: Female, 40-59, College Graduate (MD/PhD); Physician, 10 years professional experience; 5 years product experience.
Participant 7: Male, 40-59, Post Graduate (MD/PhD); Physician, 25 years professional experience; 6 years product experience.
Participant 8: Male, 60-74, Post Graduate (MD/PhD); Physician, 30 years professional experience; 4 years product experience.
Participant 9: Female, 23-39, Masters Degree (Social Work); Clinical Social Worker, 2 years professional experience; 5 years product experience.
Participant 10: Female, 40-59, College Graduate (RN, BSN); Medical Assistant, 15 years professional experience; 15 years product experience.

10 participants (matching the demographics in the section on Participants) were recruited and 10 participated in the usability test.

Participants were scheduled for 1-hour sessions, and GoToMeeting calendar invites were used to keep track of the participant schedule.

B. Study Design
Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same objectives are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.

During the usability test, participants interacted with the Epitomax version 10.0 EHR as designed for the 2015 Edition Certification. Each participant was provided the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:
   Number of objectives successfully completed within the allotted time without assistance
   Time to complete the objectives
   Number and types of errors
   Path deviations
   Participant's verbalizations (comments)
   Participant's satisfaction ratings of the system

Additional information about the various measures can be found in the Usability Metrics section (Section III.I).

C. Objectives
A number of force-ranked Tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including (full details provided in Appendix 4):
   Task 1: Opening a Patient's Chart and Entering and Editing Demographics Information
   Task 2: Ordering and Editing Labs
   Task 3: Entering and Editing Problem Lists
   Task 4: Entering and Editing an Implantable Device
   Task 5: Recording and Changing and Triggering Clinical Decision Support elements
   Task 6: Clinical Reconciliation

The Tasks were selected based on their frequency of use, criticality of function, and those that may be most troublesome for users.
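The measures collected for each participant can be thought of as one record per participant per task. A minimal sketch of such a record; the field names are assumptions, not the study's actual spreadsheet columns:

```python
# Illustrative record of the measures collected per participant per
# task (the study recorded these in a spreadsheet; field names here
# are hypothetical).
from dataclasses import dataclass, field

@dataclass
class TaskObservation:
    participant_id: str
    task: int
    completed: bool            # success without assistance, within time
    seconds: float             # time to complete the objective
    errors: int                # number of errors observed
    path_deviations: int       # steps off the optimal path
    rating: int                # post-task ease rating, 0-5 scale
    comments: list = field(default_factory=list)  # verbalizations

obs = TaskObservation("P1", 1, True, 15.0, 0, 0, 5)
print(obs.completed, obs.rating)
```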
The Tasks were constructed considering the study objectives and, additionally, the risk of potential patient harm if not completed correctly. The participants were asked to offer comments if they noticed any potential for data entry error and/or potential to cause patient harm. None were offered.

D. Procedures
Upon joining the GoToMeeting conference, participants were greeted. Participants were then assigned a participant ID.

The test administrator/usability tester ensured the tests ran smoothly. The usability tester

conducting the test was an experienced usability practitioner with 5-plus years of experience conducting these tests.

The administrator/usability tester moderated the session, including administering instructions and objectives. The administrator/usability tester also monitored objective times, obtained post-objective rating data, and took notes on participant comments. The administrator/usability tester also served as the data logger and took notes on objective success, path deviations, number and type of errors, and comments.

Participants were instructed to perform the Tasks (see specific instructions below):
   As quickly as possible, making as few errors and deviations as possible.
   Without assistance; administrators could give immaterial guidance and clarification on objectives, but not instructions on use.
   Without using a think-aloud technique.

For each objective, the participants were given a written copy of the objective. Objective timing began once the administrator finished reading the question. The objective time was stopped once the participant indicated they had successfully completed the objective. Scoring is discussed below in Section IV.

Following the session, the administrator/usability tester gave the participant the post-test questionnaire and thanked everyone for their participation.

Participants' demographic information, objective success rate, time on objective, errors, deviations, verbal responses, and post-test questionnaire responses were recorded into a spreadsheet.

E. Test Location
All tests were conducted virtually via GoToMeeting. Only the participant and the administrator/usability tester were on the web-conference.

F. Test Environment
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted virtually. Some of the participants were in a healthcare office, and others were at home.
For testing, all users used a Windows or Mac computer with the Google Chrome web browser and an internet connection. The participants used a mouse and keyboard when interacting with the EHR.

Technically, the system performance (i.e. response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

G. Test Forms and Tools
During the usability test, various documents and instruments were used, including:
1) Test Session Agenda
2) Post-test Questionnaire

Examples of these documents can be found in Appendices 2 and 3 respectively.

The participant's interaction with the EHRUT was not captured or recorded digitally.

H. Participant Instructions
The administrator read the following instructions aloud to each participant:

"Hi, thanks for taking the time to participate in our EHR usability testing study. Before we get started, I would like to provide a little context around the flow of today's session. You will be completing a set of 6 Tasks in your typical EHR. For the first 5 minutes we will review each of the Tasks you will be asked to complete during the test. Once you're comfortable with the objectives, we will begin the test (approximately 30 minutes). I want to call out that this is not a test of how well you know or use the system; rather, you are testing the system for us. After each objective I will ask you to rate the difficulty of the objective on a 0-5 scale, 0 being extremely difficult and 5 being very easy. Once you are finished with the test, I will ask you to complete the post-test questionnaire. Any questions? Alright, let's begin."

Participants were given 6 Tasks to complete. The detailed objectives are listed in Appendix 4.

I. Usability Metrics

Effectiveness: Objective Failures
If the participant abandoned the objective, did not reach the correct answer, performed it incorrectly, or reached the end of the allotted time before successful completion, the objective was counted as a "Failure." No objective times were taken for errors. The total number of errors was calculated for each objective and then divided by the total number of times that objective was attempted.
Not all deviations would be counted as errors. This should also be expressed as the mean number of failed objectives per participant. On a qualitative level, an enumeration of errors and error types should be collected.

Efficiency: Objective Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. It is strongly recommended that objective deviations be reported. Optimal paths (i.e., procedural steps) should be recorded when constructing objectives.

Efficiency: Objective Time
Each objective was timed from when the administrator said "Begin" until the participant said "Done." If he or she failed to say "Done," the time was stopped when the participant stopped performing the objective. Only objective times for objectives that were successfully completed were included in the average objective time analysis. Average time per objective was calculated for each objective. Variance measures (standard deviation and standard error) were also calculated.
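The path-deviation ratio and the time statistics described above can be computed directly. A minimal sketch (the study's actual tooling is not specified):

```python
# Sketch of the efficiency metrics described above: the path-deviation
# ratio (observed steps / optimal steps) and the mean, standard
# deviation, and standard error of successful task times.
from math import sqrt
from statistics import mean, stdev

def deviation_ratio(observed_steps, optimal_steps):
    """Ratio of the observed path length to the optimal path length."""
    return observed_steps / optimal_steps

def time_stats(times):
    """Mean, SD, and SE of completion times (successful attempts only)."""
    m = mean(times)
    sd = stdev(times)              # sample standard deviation
    se = sd / sqrt(len(times))     # standard error of the mean
    return m, sd, se

print(deviation_ratio(12, 10))     # -> 1.2 (two extra steps taken)
m, sd, se = time_stats([14, 15, 16, 15])
print(round(m, 2))                 # -> 15.0
```

A ratio of 1.0 means the participant followed the optimal path exactly, which matches the report's finding of no path deviations.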

Satisfaction: Objective Rating
Participants' subjective impression of the ease of use of the application was measured by administering both a simple post-objective question and a post-session questionnaire. After each objective, the participant was asked to rate "Overall, this objective was:" on a scale of 1 (Very Difficult) to 5 (Very Easy). These data are averaged across participants. Common convention is that average ratings for systems judged easy to use should be 3.3 or above.

To measure participants' confidence in and likeability of the EHRUT overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 3.

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:
1) Effectiveness of the EHRUT by measuring participant success rates and errors
2) Efficiency of the EHRUT by measuring the average objective time and path deviations
3) Satisfaction with the EHRUT by measuring ease of use ratings
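The SUS score reported earlier follows the standard SUS calculation: each of the ten 1-5 responses is converted to a 0-4 contribution (response minus 1 for odd-numbered items, 5 minus the response for even-numbered items, since even items are negatively worded), and the sum is multiplied by 2.5 to give a 0-100 score. A sketch:

```python
# Standard System Usability Scale (SUS) scoring: ten responses on a
# 1-5 scale; odd-numbered items contribute (response - 1), even-numbered
# items contribute (5 - response), and the total is scaled by 2.5.
def sus_score(responses):
    """Compute the 0-100 SUS score from ten 1-5 item responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Example: agreeing with the positive items (4) and disagreeing with
# the negative items (2) yields a score of 75.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

The individual item responses behind the reported score of 73 are not included in this report, so the example above uses illustrative inputs only.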

IV. Data Scoring
The following table (Table 1) details how objectives were scored, how errors were evaluated, and how times were analyzed.

Rationale and Scoring: An objective was counted as a "Success" if the participant was able to achieve the correct outcome without assistance
