Specifying and Comparing Implementation Strategies Across Seven Large Implementation Interventions: A Practical Application of Theory


Perry et al. Implementation Science (2019) 14:32

Cynthia K. Perry, Laura J. Damschroder, Jennifer R. Hemler, Tanisha T. Woodson, Sarah S. Ono, and Deborah J. Cohen

Abstract

Background: The use of implementation strategies is an active and purposive approach to translate research findings into routine clinical care. The Expert Recommendations for Implementing Change (ERIC) identified and defined discrete implementation strategies, and Proctor and colleagues have made recommendations for specifying the operationalization of each strategy. We use empirical data to test how the ERIC taxonomy applies to a large dissemination and implementation initiative aimed at taking cardiac prevention to scale in primary care practice.

Methods: EvidenceNOW is an Agency for Healthcare Research and Quality initiative that funded seven cooperatives across seven regions in the USA. Cooperatives implemented multi-component interventions to improve heart health and build quality improvement capacity, and used a range of implementation strategies to foster practice change. We used ERIC to identify cooperatives' implementation strategies and specified the actor, action, target, dose, temporality, justification, and expected outcome for each. We mapped and compiled a matrix of the specified ERIC strategies across the cooperatives, and used consensus to resolve mapping differences. We then grouped implementation strategies by outcomes and justifications, which led to insights regarding the use of and linkages between ERIC strategies in real-world scale-up efforts.

Results: Thirty-three ERIC strategies were used by cooperatives. We identified a range of revisions to the ERIC taxonomy to improve the practical application of these strategies. These proposed changes include revisions to four strategy names and 12 definitions. We suggest adding three new strategies because they encapsulate distinct actions that were not described in the existing ERIC taxonomy. In addition, we organized ERIC implementation strategies into four functional groupings based on the way we observed them being applied in practice. These groupings show how ERIC strategies are, out of necessity, interconnected to achieve the work involved in rapidly taking evidence to scale.

Conclusions: Findings of our work suggest revisions to the ERIC implementation strategies to reflect their utilization in real-world dissemination and implementation efforts. The functional groupings of the ERIC implementation strategies that emerged from on-the-ground implementers will help guide others in choosing among and linking multiple implementation strategies when planning small- and large-scale implementation efforts.

Correspondence: perryci@ohsu.edu; School of Nursing, Oregon Health & Science University, 3455 SW US Veterans Hospital Rd, Portland, OR 97239, USA.

Trial registration: Registered as an observational study at www.clinicaltrials.gov (NCT02560428).

Keywords: Implementation strategy mapping, Implementation strategies, Large-scale initiative, Capacity building, Implementation facilitation

Background

It is well recognized that an active and purposive approach is required to translate research findings into routine clinical care. Methods used to disseminate and implement research findings into clinical practice are termed implementation strategies [1], and interventions designed to facilitate the application of evidence into clinical practice typically use a combination of implementation strategies tailored to a particular clinical context. Multiple implementation frameworks [2, 3] provide guidance on how to define factors that may explain or predict implementation outcomes and how to assess implementation interventions (e.g., the Consolidated Framework for Implementation Research (CFIR) [4], Promoting Action on Research Implementation in Health Services (PARIHS) [5], and Normalization Process Theory (NPT) [6]). These frameworks provide theoretical structures and are an indispensable foundation for developing testable hypothesized models, establishing an understanding of the factors that impact implementation, and building a base of implementation knowledge [7, 8]. However, these frameworks may not necessarily provide guidance about how to organize detailed implementation strategies to achieve specific outcomes, may use inconsistent names for implementation strategies, and may lack adequate strategy descriptions, making understanding and replication difficult.

Powell and colleagues sought to address the problem of multiple names and definitions for implementation strategies in 2012 [9]. Through a comprehensive review of the literature, they identified 68 discrete strategies and organized them into six key implementation domains. These domains were organized by type, with similar implementation strategies grouped into the same domain. They then updated and expanded this list by engaging a broad group of implementation scientists in a three-phase, modified-Delphi process that refined and expanded the list to 73 discrete implementation strategies, each with a name and definition [10]. These 73 strategies were grouped into nine clusters based upon strategy type, and ratings of importance (high/low) and feasibility (high/low) were elicited for each strategy [11]. This updated compilation is known as the Expert Recommendations for Implementing Change (ERIC).

ERIC has made an important theoretical contribution to identifying a broad range of implementation strategies and developing a common nomenclature with definitions. An important next step toward advancing this work is to test and refine the taxonomy based on the use of these implementation strategies in on-the-ground implementation efforts; Proctor and colleagues [1] recognized this need and have started this work. They have recommended that implementation strategies be further specified to include such details as the actor (who is enacting the implementation strategy), the actions (specific activities to support implementation), the targets (the entity impacted), temporality, dose, expected outcome, and justification. Researchers have used the ERIC and Proctor specifications to understand the operationalization of implementation strategies and have found this useful [12–16]. The level of specification has been helpful in identifying how strategies can be tailored to different settings [14], fostering comparison across clinical sites in a multisite study [16], and helping to inform the development of activity logs to track implementation strategies [13].
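To make the level of specification concrete, the Proctor reporting elements can be thought of as fields in a structured record kept for each strategy. The following is a minimal, hypothetical sketch (in Python); the record structure and all field values are illustrative assumptions, not data from EvidenceNOW or the cited studies.

```python
from dataclasses import dataclass

# Hypothetical record capturing Proctor et al.'s reporting elements for a
# single implementation strategy. Field values below are invented examples.
@dataclass
class SpecifiedStrategy:
    eric_name: str        # strategy name from the ERIC taxonomy
    actor: str            # who enacts the strategy
    action: str           # specific activity performed
    target: str           # entity the action is meant to affect
    dose: str             # frequency or intensity of the action
    temporality: str      # when in the implementation the action occurs
    justification: str    # rationale for using the strategy
    expected_outcome: str # what the strategy is expected to achieve

example = SpecifiedStrategy(
    eric_name="Audit and provide feedback",
    actor="Practice facilitator",
    action="Deliver a quarterly ABCS performance report and review gaps",
    target="Primary care practice QI team",
    dose="Quarterly",
    temporality="Throughout the funded intervention period",
    justification="Performance data focus improvement efforts on quality gaps",
    expected_outcome="Improved ABCS clinical quality measures",
)
```

Recording each strategy in this uniform shape is what makes cross-site comparison and tracking straightforward, as the studies cited above report.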
Researchers who applied the ERIC and Proctor frameworks to real-world implementation efforts recommended additional implementation strategies for the ERIC taxonomy [13]. Others have used ERIC to guide detailed mapping of implementation strategies during implementation study planning [12]; one of these studies used ERIC to develop a blueprint to guide implementation efforts [15]. As all implementers know, however, there is a big difference between what you propose or plan to do during implementation and what really happens. To date, few studies (and none we could identify) use data from large-scale, multi-site implementation efforts to apply and refine the ERIC taxonomy.

This paper reports a study we conducted to apply the ERIC taxonomy and the Proctor reporting recommendations to multiple large-scale studies. We use empirical data collected during active implementation with the intent of further refining ERIC, if changes were identified as necessary.

Methods

Setting

The setting for this study is the EvidenceNOW initiative [17] funded by the Agency for Healthcare Research and Quality (AHRQ). EvidenceNOW focused on rapid dissemination and implementation of cardiovascular preventive care (appropriate aspirin use, blood pressure and cholesterol management, and smoking cessation, the ABCS of heart health) among smaller primary care practices, as well as on capacity building (e.g., quality improvement experience, data access, practice leadership). In the request for applications, AHRQ "strongly encouraged" the use of specific implementation strategies, including practice facilitation, data review with feedback and benchmarking, peer-to-peer learning, and expert consultation [18], and each cooperative was required to produce clinical quality measures and practice capacity measurement to determine change over time in these domains. In addition to funding seven cooperatives, AHRQ also funded a national evaluation of EvidenceNOW called Evaluating System Change to Advance Learning and Take Evidence to Scale (ESCALATES) to harmonize and coordinate the collection of quantitative data, to collect additional qualitative data, and to bring together the cross-cooperative comparative findings. For more details on the ESCALATES evaluation, see Cohen et al. [19].

Sample

EvidenceNOW funded seven regional cooperatives that spanned 12 states (Oregon, Washington, Idaho, Colorado, New Mexico, Oklahoma, Wisconsin, Illinois, Indiana, New York City, North Carolina, Virginia). For details on the regions and states, see the EvidenceNOW website [17] and Ono et al. [20]. Each cooperative engaged over 200 primary care practices in this effort (N = 1721). Practices were smaller (≤ 10 clinicians) and varied with respect to ownership and geographic location. Accomplishing the goal of large-scale rapid dissemination and implementation in less than 3 years required a large workforce and the use of a range of implementation strategies.

Data sources

We reviewed and triangulated four sources of qualitative data from years one and two of the EvidenceNOW initiative to identify and define the cooperatives' implementation strategies. First, we reviewed the cooperatives' study proposals, which provided an outline of broad, overarching implementation strategies. Second, we examined cooperative-level documents, including training tools and conceptual models, to further specify the content of these overarching strategies. We also analyzed facilitator training documents, which provided insights into specific activities and implementation strategies that facilitators would use to support practices. For example, templates for facilitators to document and track their work provided detailed descriptions of activities that facilitators may perform. The third source of data was entries written by cooperative team members in the ESCALATES interactive online diary [21]. The online diary [21] is a web-based platform that allowed our evaluation team to communicate with each cooperative team about their implementation experiences and to clarify details about the on-the-ground implementation support they were providing to practices. These entries were made during active implementation of the interventions across cooperatives. The fourth data source comprised field notes from site visits to cooperatives and transcripts from interviews with cooperative leadership and key team members during active implementation. Analyzing interview transcripts and field notes provided important insight into the pragmatic, on-the-ground use of implementation strategies to support practice change.

All field notes were prepared soon after the site visit. Interviews followed a semi-structured guide and were professionally transcribed, checked for accuracy, and de-identified.
Field notes, interviews, and other documents were entered into Atlas.ti for data management and analysis.

Data analysis

We used an iterative and inductive analytic approach [22] to identify and describe cooperatives' implementation strategies. Guided by Proctor and colleagues' recommendations for specifying and reporting details on implementation strategies [1], we constructed an intervention table for each cooperative describing the broad implementation strategies and more detailed activities that cooperatives included in their interventions, and specified the actors, actions, targets, dose, temporality, expected outcomes, and justification for each. We member-checked these tables with the cooperative leadership and refined these documents as needed (see Additional file 1 for an example of one such table). From these seven intervention tables, we identified 266 detailed actions that were documented by cooperative members to help practices implement changes to improve capacity or the ABCS.

Next, four of the authors (LJD, CKP, JRH, TTW) independently mapped each action to a strategy in the ERIC compilation [10] based on the alignment of each action with the definition for each ERIC strategy. We met weekly over 5 months to independently map and discuss mappings until we reached consensus. Throughout this process, we operationalized ERIC definitions to help guide decisions about alignment of ERIC implementation strategies with the EvidenceNOW actions and identified areas where ERIC definition expansion, revision, and/or reorganization was needed to better reflect the actions cooperatives used and how that could help make the ERIC strategies more pragmatic and easily applied by researchers and practitioners.

Next, we created a consolidated cross-cooperative matrix that organized implementation strategies based upon their practical application. We pile-sorted each implementation strategy into functionally similar groups based on their justification and expected outcomes. We reviewed and labeled each grouping and considered the cooperatives' practical applications of strategies within and across the groupings, recognizing interdependencies and differences, such as "actors" (who enacts the strategies within a grouping, e.g., practice facilitators versus cooperative-employed data experts).

Results

The 266 actions identified across the seven cooperatives mapped to 33 of the 73 ERIC implementation strategies. We propose refinements for 13 strategies. Table 1 lists the ERIC strategies, our proposed changes, and our rationale for these changes. Recommended refinements include changes in strategy names for four strategies and changes in definitions for 12 strategies. Additionally, we propose adding three new distinct strategies that we identified as being used in practice among the cooperatives: assess and redesign workflow, create online learning communities, and engage community resources. Definitions for these strategies are provided in Table 1. The ERIC strategies taxonomy included "Ancillary Material" that the authors included as an Additional file [10]. We recommend changes in ancillary material for 15 strategies and provide ancillary material for the three new strategies; details are provided in our Additional file 2. We will refer to ERIC strategies using our new proposed labels when applicable (e.g., implementation facilitation instead of ERIC's original name of facilitation).

We grouped the 33 ERIC strategies used by the cooperatives into four functional groupings: (1) build health information technology to support data-informed quality improvement (QI), (2) build QI capacity and improve outcomes, (3) enhance clinician and practice member knowledge, and (4) build community connections and patient involvement. Tables 2, 3, 4, and 5 list implementation strategies and identify the ERIC cluster to which each strategy belongs, along with the actor, specific actions, and targets; each table covers one of the four groupings listed above. The following sections describe the four overarching functions and the implementation strategies within each grouping.

Build health information technology to support data-informed QI

Multiple implementation strategies are needed to achieve data-informed QI. A critical aspect of the cooperatives' efforts to rapidly disseminate and implement evidence into practice was helping practices gain access to data to support data-informed QI functions (see Table 2). All cooperatives included an audit and provide feedback strategy to support data-informed QI. This strategy relied on region- and practice-level HIT infrastructure to deliver trusted data to practices at regular intervals throughout the intervention. Audit functions (e.g., the ability to produce ABCS performance reports) were necessary for feedback functions (e.g., communication of the audit to clinicians and staff), which in turn were necessary to inform QI efforts. For instance, generated data were used to identify quality gaps and to identify opportunities on which to focus improvement efforts. These data were also used to monitor the impact that changes had on ABCS outcomes. Audit and provide feedback, thus, was a key strategy to improve ABCS outcomes.

On the ground, up to seven additional ERIC implementation strategies were necessary to build the infrastructure and capacity needed to implement and sustain audit and provide feedback functions, as shown in Table 2.
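To illustrate the mechanics of the audit half of audit and provide feedback, the sketch below shows one way a practice-level performance rate might be computed from extracted EHR records and paired with a regional benchmark. This is a simplified, hypothetical example; the field names, the blood pressure threshold, and the benchmark value are assumptions for illustration and are not the cooperatives' actual report specifications.

```python
# Illustrative "audit" computation for one ABCS-style measure (blood
# pressure control). Field names, the 140/90 threshold, and the benchmark
# value are assumptions for illustration only.
def bp_control_rate(patients):
    """Share of patients with a hypertension diagnosis whose last BP was < 140/90."""
    eligible = [p for p in patients if p.get("hypertension_dx")]
    if not eligible:
        return None
    controlled = [
        p for p in eligible
        if p["last_systolic"] < 140 and p["last_diastolic"] < 90
    ]
    return len(controlled) / len(eligible)

def feedback_summary(practice_name, patients, regional_benchmark=0.70):
    """Pair a practice's rate with a comparator, as in audit and provide feedback."""
    rate = bp_control_rate(patients)
    gap = None if rate is None else rate - regional_benchmark
    return {
        "practice": practice_name,
        "bp_control_rate": rate,
        "regional_benchmark": regional_benchmark,
        "gap_vs_benchmark": gap,
    }

if __name__ == "__main__":
    sample = [
        {"hypertension_dx": True, "last_systolic": 132, "last_diastolic": 84},
        {"hypertension_dx": True, "last_systolic": 150, "last_diastolic": 95},
        {"hypertension_dx": False, "last_systolic": 118, "last_diastolic": 76},
    ]
    print(feedback_summary("Example Family Practice", sample))
```

In practice, producing even a simple report like this depended on the additional HIT strategies described next, because many practices could not extract or clean the underlying data on their own.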
All but one cooperative sought to deliver audit and provide feedback functions in a way that could be sustained past the time of their funded project period. The degree to which these seven additional strategies were used by the cooperatives depended on the robustness of existing HIT infrastructure in their region. For example, one cooperative used all seven strategies listed in Table 2, which required significant time and investment. This cooperative hired HIT experts to connect practices' electronic health record (EHR) systems with external registries (use data warehousing techniques) and to develop dashboards, available through centralized portals (develop and organize quality monitoring systems), to provide practices with the quality reports needed to support audit and provide feedback functions. At times, HIT experts (or an HIT-trained facilitator) worked with practices to change how practice members documented information from appointments and other sources within their EHR to ensure that valid and complete data were extracted (change records systems). This step was necessary to get good quality data to a centralized data warehouse, and for the warehouse then to provide the application and cleaned data to support quality reports, which were provided back to practices. This was necessary because many practices did not have the capability to generate robust and useable reports within their own setting. The local technical assistance that HIT experts provided extended to assisting practices with translating information from the generated performance reports into actionable opportunities for improvement and with monitoring progress over time.

Build QI capacity and improve outcomes

Practice facilitators used a wide range of implementation strategies to build QI capacity and improve outcomes (see Table 3). Facilitation was the central implementation strategy used by all cooperatives.

Table 1. Proposed changes to ERIC strategy labels and/or definitions, with rationale for changes (proposed changes appear in italics in the original).

Use data experts
Current ERIC definition: Involve, hire, and/or consult experts to inform management on the use of data generated by implementation efforts.
Proposed definition: Involve, hire, and/or consult experts to acquire, structure, manage, report, and use data generated by implementation efforts.
Rationale: We broadened the functions of data experts beyond just management of data.

Fund and contract and/or negotiate with vendors for the clinical innovation
Current ERIC definition: Governments and other payers of services issue requests for proposals to deliver the innovation, use contracting processes to motivate providers to deliver the clinical innovation, and develop new funding formulas that make it more likely that providers will deliver the innovation.
Proposed definition: None (no change to the definition).
Rationale: We broadened the name to include the role of negotiation. Having outside assistance to negotiate with EHR vendors can be valuable in addition to payment.

Provide local technical assistance
Current ERIC definition: Develop and use a system to deliver technical assistance focused on implementation issues using local personnel.
Proposed definition: Develop and use a system to deliver technical assistance within local settings that is focused on implementation issues.
Rationale: We clarified the definition to indicate any technical assistance provided in the local setting, whether provided by local staff or by other on-site individuals.

Audit and provide feedback
Current ERIC definition: Collect and summarize clinical performance data over a specified time period and give it to clinicians and administrators to monitor, evaluate, and modify provider behavior.
Proposed definition: Develop summaries of clinical performance over a specific time period, often including a comparator, and give them to clinicians and/or administrators. Summary content (e.g., nature of the data, choice of comparator) and delivery (e.g., mode, format) are designed to modify specifically targeted behavior(s) or actions of individual practitioners, teams, or health care organizations.
Rationale: We broadened the definition to include providing comparator data (benchmarks) and to indicate that the goal is to modify a targeted behavior and/or action of multiple actors. These facets are listed in the ancillary materials but were not included in the published definition.

Use an implementation advisor
Current ERIC definition: Seek guidance from experts in implementation.
Proposed definition: Seek guidance from experts in implementation, including providing support and training for the implementation workforce.
Rationale: We broadened this definition to include providing support and training for facilitators.

Implementation facilitation
Current ERIC definition: A process of interactive problem solving and support that occurs in a context of a recognized need for improvement and a supportive interpersonal relationship.
Proposed definition: [A] multi-faceted interactive process of problem solving, enabling and supporting individuals, groups, and organizations in their efforts to adopt and incorporate innovations into routine practices that occurs in a context of a recognized need for improvement and a supportive interpersonal relationship.
Rationale: The name was changed to specify "implementation" because facilitation is a very broad concept. Implementation facilitation includes practice facilitation, a more specific type of implementation facilitation. We broadened the definition to acknowledge that facilitation is more than just "interactive problem-solving."

Assess for readiness and identify barriers and facilitators
Current ERIC definition: Assess various aspects of an organization to determine its degree of readiness to implement, barriers that may impede implementation, and strengths that can be used in the implementation effort.
Proposed definition: Assess various aspects of an organization to determine its degree of readiness to implement, and identify barriers that may impede implementation and strengths that can be leveraged to facilitate the implementation effort.
Rationale: We revised to clarify identification of barriers and leveraging of facilitators.

Develop an implementation blueprint
Current ERIC definition: Develop a formal implementation blueprint that includes all goals and strategies. The blueprint should include the following: (1) aim/purpose of the implementation; (2) scope of the change (e.g., what organizational units are affected); (3) timeframe and milestones; and (4) appropriate performance/progress measures. Use and update this plan to guide the implementation effort over time.
Proposed definition: Develop an implementation blueprint that includes all goals and strategies. The blueprint should include the following: (1) aim/purpose of the implementation; (2) scope of the change (e.g., what organizational units are affected); (3) timeframe and milestones; and (4) appropriate performance/progress measures. Use and update this plan to guide the implementation effort over time.
Rationale: We suggest deleting the word "formal" to include informal as well as formal implementation blueprints. This will include plans that are developed for quality improvement as well as larger formal plans for implementation. Additionally, the definition is expanded to explicitly acknowledge its role in guiding implementation over time.

Organize implementation teams and team meetings
Current ERIC definition: Develop and support teams of clinicians who are implementing the innovation and give them protected time to reflect on the implementation effort, share lessons learned, and support one another's learning.
Proposed definition: Develop and support teams of clinicians, staff, patients, and other stakeholders who are implementing or may be users of the innovation. Provide protected time for teams to reflect on the implementation progress, share lessons learned, make refinements to plans, and support one another's learning.
Rationale: We broadened the name to include all possible team members and to include formation of teams as well as meetings. Removing the term "clinician" allows for a multi-disciplinary team and increases engagement among all team members. We broadened the definition to be inclusive of all team members and clarified the intent of the definition.

Table 1. Proposed changes to ERIC strategy labels and/or definitions, with rationale for changes (continued).

Develop educational materials
Current ERIC definition: Develop and format manuals, toolkits, and other supporting materials in ways that make it easier for stakeholders to learn about the innovation and for clinicians to learn how to deliver the clinical innovation.
Proposed definition: Develop and format manuals, toolkits, and other supporting materials to make it easier for stakeholders to learn about the innovation and for clinicians to learn how to deliver the clinical innovation. This can include technology-delivered (e.g., online/smartphone-based static or dynamic) content and health messaging.
Rationale: We expanded the definition to include technology-delivered content and messaging.

Conduct educational outreach visits
Current ERIC definition: Have a trained person meet with providers in their practice settings to educate providers about the clinical innovation with the intent of changing the provider's practice.
Proposed definition: Have a trained person meet with individuals or teams in their work settings to educate them about the clinical innovation with the intent of changing behavior to reliably use the clinical innovation as designed.
Rationale: We broadened the definition to include team members beyond providers and clarified the language to more clearly state that this strategy aims to encourage sustained use of the innovation.

Conduct ongoing training
Current ERIC definition: Plan for and conduct training in the clinical innovation in an ongoing way.
Proposed definition: Plan for and conduct training in the clinical innovation in an ongoing way for all individuals involved with implementation and users of the clinical innovation (e.g., clinicians, implementation staff, practice facilitators).
Rationale: We expanded the definition to include all individuals involved.

Conduct educational meetings
Current ERIC definition: Hold meetings targeted toward different stakeholder groups (e.g., providers, administrators, other organizational stakeholders, and community, patient/consumer, and family stakeholders) to teach them about the clinical innovation.
Proposed definition: Hold meetings targeted toward educating multiple stakeholder groups (i.e., providers, administrators, other organizational stakeholders, community members, patients/consumers, families) about the clinical innovation and/or its implementation.
Rationale: We revised the definition to add specificity about the purpose of the education (the innovation and/or its implementation) and to clarify that education occurs among multiple types of stakeholders.

Assess and redesign workflow (new strategy)
Proposed definition: Observe and map current work processes and plan for desired work processes, identifying changes necessary to accommodate, encourage, or incentivize use of the clinical innovation as designed.
Rationale: New; added because this work is not reflected in current ERIC strategies.

Create online learning communities (new strategy)
Proposed definition: Create an online portal for clinical staff members to share and access resources, webinars, and FAQs related to the specific evidence-based intervention, and provide interactive features to encourage learning across settings and teams (e.g., regular blogs, facilitated discussion boards, access to experts, and networking opportunities).
Rationale: New; added because this work is not reflected in current ERIC strategies.

Engage community resources (new strategy)
Proposed definition: Connect practices and their patients to community resources outside the practice (e.g., state and county health departments; non-profit organizations; resources related to addressing the social determinants of health; and organizations focused on self-management techniques and support).
Rationale: New; added because this work is not reflected in current ERIC strategies.

Please see Additional file 2 for the complete list of proposed changes plus changes to the "Ancillary Material" originally included in Additional File 6 published with Powell et al. [10].

Practice facilitators, who were actors in this process, used many different implementation strategies identified by ERIC to promote practice change and to improve capacity for conducting regular QI to improve ABCS outcomes. For example, facilitators assisted practices in engaging clinicians, practice members, and leaders in practice change efforts (identify and prepare champions); organized and facilitated meetings (organize implementation teams and team meetings); assessed and offered suggestions for revising clinical workflows (assess and redesign workflow); assessed HIT needs and data set-up (assess for readiness and identify barriers and facilitators); and discussed and identified an improvement plan (develop an implementation blueprint).
