Key Lessons From the National Evaluation of the CHIPRA Quality Demonstration Grant Program


The National Evaluation of the CHIPRA Quality Demonstration Grant Program

Key Lessons from the National Evaluation of the CHIPRA Quality Demonstration Grant Program

FINAL SUMMARY

This report was prepared for the Agency for Healthcare Research and Quality (AHRQ) by Mathematica Policy Research and its partners, the Urban Institute and AcademyHealth, under contract HHSA29020090002191. This document is in the public domain and may be used and reprinted without permission.

The authors of this report are responsible for its content. No statement in the report should be construed as an official position of the U.S. Department of Health and Human Services or the Agency for Healthcare Research and Quality.

Suggested Citation
Ireys H, Zickafoose J, Brach C, et al. Key Lessons from the National Evaluation of the CHIPRA Quality Demonstration Grant Program. Final Summary. AHRQ Publication No. 15-0071. Rockville, MD: Agency for Healthcare Research and Quality; September 2015.

Acknowledgments
We appreciate the invaluable contributions of key staff of the national evaluation team, Anna Christensen, PhD; Grace Anglin, MPH; Dana Petersen, PhD; Tricia Higgins, PhD; and the many others at Mathematica Policy Research, the Urban Institute, and AcademyHealth whose efforts made this publication possible. We also thank our colleagues at the Centers for Medicare & Medicaid Services, Karen LLanos, Elizabeth Hill, and Barbara Dailey; the members of our Technical Expert Panel who provided sage advice over the years; staff and stakeholders at the demonstration States; and our editor at AHRQ, Mary Grady.

Key Lessons from the National Evaluation of the CHIPRA Quality Demonstration Grant Program

Prepared for:
Agency for Healthcare Research and Quality
Rockville, MD 20850
www.ahrq.gov

Prepared by:
Mathematica Policy Research
Washington, DC
Contract No. HHSA29020090002191

Authors:
Mathematica Policy Research
Henry Ireys, PhD
Joseph Zickafoose, MD, MS

Agency for Healthcare Research and Quality
Cindy Brach, MPP
Linda Bergofsky, MSW, MBA

Urban Institute
Kelly Devers, PhD
Rachel Burton, MPP

AcademyHealth
Lisa Simpson, MB, BCh, MPH, FAAP
Ellen Albritton

AHRQ Publication No. 15-0071
September 2015

Introduction

The Children's Health Insurance Program Reauthorization Act of 2009 (CHIPRA) authorized and funded the CHIPRA Quality Demonstration Grant Program to identify strategies for improving the quality of health care for children enrolled in Medicaid and the Children's Health Insurance Program (CHIP).1 The Centers for Medicare & Medicaid Services (CMS) awarded 10 demonstration grants that ranged from $8.7 million to $11.3 million each, funding 18 States that implemented 52 separate projects. The CHIPRA quality demonstration, which ran from 2010 to 2015, was one of the nation's largest investments of Federal dollars aimed at learning how to improve children's health and health care.2

Within the broad mandate of the CHIPRA legislation, demonstration States pursued a variety of activities, projects, and approaches. This summary,3 which draws from products produced throughout the evaluation,4 highlights program objectives, the strategies States used, and the lessons learned about:

- Reporting and using the core set of quality measures for children.
- Transforming service delivery to promote quality of care.
- Improving service systems for youth with serious emotional disorders.
- Applying health information technology (IT) for quality improvement (QI).
- Building partnerships to improve quality of children's health care.
- Using Federal grants to build intellectual capital at the State level.

The national evaluation of this demonstration grant program, funded by CMS and overseen by the Agency for Healthcare Research and Quality (AHRQ), was conducted by Mathematica Policy Research and its partners, the Urban Institute and AcademyHealth. The purpose of the evaluation was to provide insights into best practices and replicable strategies for improving the quality of health care for children. (Refer to Evaluation Methods at the end of this summary for more information.)

To illustrate some of the lessons learned, this summary includes short descriptions of selected activities implemented.

[Map: List of grantees and their partner States]
- Oregon. Partners: Alaska and West Virginia
- Maryland. Partners: Georgia and Wyoming
- Utah. Partner: Idaho
- Florida. Partner: Illinois
- Maine. Partner: Vermont
- Colorado. Partner: New Mexico
- Massachusetts. No partners.
- South Carolina. No partners.
- Pennsylvania. No partners.
- North Carolina. No partners.

Duration of grant program: February 2010 – February 2016
Duration of national evaluation: August 2010 – September 2015
Total amount of all grants awarded: $99,982,521

Reporting and using the Child Core Set of quality measures

Program objectives

CMS encourages all States to voluntarily report the Core Set of Children's Health Care Quality Measures for Medicaid and CHIP (the Child Core Set) each year.5 As part of the demonstration, 10 States implemented projects involving the Child Core Set or similar quality measures.6 Guided by CMS's original grant solicitation, the projects implemented by these States aimed to: (1) enhance technical capacities for accurately reporting the core measures to CMS, and (2) develop strategies for using the core measures to improve quality of care at the State, health system, or practice level.

State strategies

To accomplish their objectives, the 10 demonstration States used varying combinations of the following strategies:

- Hiring dedicated computer programmers to develop the technical procedures needed to calculate the measures using CMS specifications (see the sketch below).
- Developing new procedures to assemble data files from diverse sources and checking them for accuracy.
- Collecting the patient experience surveys needed to calculate certain measures.
- Establishing statewide groups to provide technical oversight and policy direction for using measures to track performance.
- Developing reports for policymakers, providers, and consumers to compare performance with national benchmarks.
- Identifying variation across practices, regions, or plans; and monitoring changes in performance over time.
- Supporting pay-for-reporting programs to encourage use of electronic health record (EHR) data for measurement.

Lessons learned

States can substantially improve their capacity to report quality measures for children by strategically enhancing technical resources and developing methods for linking data sets. Key stakeholders within States especially value measures that can be used for QI within health systems and practices. Most States have not yet demonstrated widespread use of EHR data for calculating quality measures.
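To make the first strategy concrete, the sketch below shows in broad strokes how a State programmer might compute one administrative measure from enrollment and claims extracts. It is a minimal illustration, not the CMS measure specification: the table layouts, column names, and procedure code set are hypothetical, and real Child Core Set measures involve far more detailed eligibility, continuous-enrollment, and code-set logic.

```python
# Illustrative sketch (not CMS's actual specification): a simplified
# administrative measure -- the share of adolescents with at least one
# well-care visit in the measurement year -- computed from hypothetical
# enrollment and claims extracts. All names and codes are placeholders.
import pandas as pd

WELL_CARE_CPT = {"99384", "99385", "99394", "99395"}  # placeholder code set

def adolescent_well_care_rate(enrollment: pd.DataFrame,
                              claims: pd.DataFrame,
                              year: int) -> float:
    """Percent of continuously enrolled 12-21 year olds with a well-care visit."""
    # Denominator: members aged 12-21 enrolled for the full measurement year.
    denom = enrollment[
        enrollment["age"].between(12, 21)
        & (enrollment["months_enrolled"] >= 12)
        & (enrollment["year"] == year)
    ]["member_id"].unique()

    # Numerator: denominator members with at least one qualifying claim.
    # (service_date is assumed to be a datetime column.)
    visits = claims[
        claims["cpt_code"].isin(WELL_CARE_CPT)
        & (claims["service_date"].dt.year == year)
        & claims["member_id"].isin(denom)
    ]["member_id"].unique()

    return 100 * len(visits) / len(denom) if len(denom) else float("nan")
```

In practice, checking a calculation like this against CMS technical specifications and validating it on known test cases was exactly the kind of work the dedicated programmers were hired to do.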

Maine increased the number of Child Core Set measures it reported to CMS from 14 in 2010 to 18 in 2014 through various strategies. For example, the State identified ways to use health information exchange (HIE) data to calculate measures and made other adjustments, such as adding a new billing code modifier to distinguish between global developmental and autism screenings. However, the State was unable to report on all 26 measures due to the limited availability of administrative data on behavioral health services and clinical data from practices' EHRs.

Massachusetts conducted interviews with practices and focus groups with families to help them design useful quality measure reports. Report production was delayed because interpreting measure specifications and developing legal agreements to access needed data took longer than expected. The State reported that its efforts ultimately yielded robust and useful reports for practices, families, and policymakers on Medicaid, CHIP, and commercially insured patients.

North Carolina incorporated additional child-focused measures into quarterly reports that the State makes available to all practices serving Medicaid and CHIP beneficiaries. Practices indicated that the reports helped them assess their performance and identify QI priorities. However, given delays in claims processing and infrequent reporting periods, the reports were difficult for practices to use to assess whether redesigned workflows improved care. In response, State-hired practice facilitators helped practices run supplemental reports directly from their EHRs so they could track QI changes in real time.

Specifically, analysis of information from the projects implemented by the 10 demonstration States working in this area and from a survey of physicians fielded in several States yielded the following insights:

- Reporting capacity was influenced by a State's Medicaid data availability, technical expertise (for example, the capacity to link State data systems together; see the sketch after this list), past experience with quality measurement, availability of staff time, and demand for the measures. Both the availability of the demonstration funds and substantial technical assistance from CMS allowed States to overcome some of the challenges they faced and increase the number of measures reported to CMS.
- States can use validated quality measures for children to monitor quality and compare performance across health systems and managed care plans. Policy or programmatic changes, such as stipulating benchmarks in managed care contracts and developing incentives for improvement, can be used to increase performance specifically in relation to children's health care.
- Access to fee-for-service claims data enables, but does not guarantee, reporting of all administrative measures.
- Stakeholders value State reports on the performance of health plans and child-serving practices, especially when States integrate stakeholder input into report design and align measures across diverse reporting requirements.
- The majority of child-serving physicians receive quality reports and believe they are effective for QI, but only one-third actually use quality reports in their QI activities. Physicians in demonstration States used quality reports for QI at about the same rate as physicians in a similar State that did not have a demonstration grant.
- Lack of timely data makes it difficult for providers to use State-produced quality reports to assess efforts to improve quality. Practices need substantial technical assistance from EHR vendors and QI specialists to use their own EHR data to inform QI initiatives.
- States may not be able to produce measures that require EHR data because States and health systems have not yet developed the infrastructure needed to support data transfer from providers' EHRs. Furthermore, incomplete or inconsistent documentation in EHRs and paper charts means that practices first have to improve documentation before they can improve measure reporting.
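One recurring technical capacity noted above is linking State data systems. The sketch below illustrates a common, simplified first pass: deterministic matching on normalized identifiers across two agency extracts. It is a hypothetical illustration; the file layouts and field names are invented, and production linkage typically adds probabilistic matching passes, manual review, and formal data-sharing agreements.

```python
# Minimal sketch of deterministic record linkage between two hypothetical
# State data extracts (e.g., Medicaid enrollment and an immunization
# registry). Field names are placeholders, not any State's actual schema.
import pandas as pd

def normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize the identifying fields used as the match key."""
    out = df.copy()
    out["last_name"] = out["last_name"].str.strip().str.upper()
    out["first_name"] = out["first_name"].str.strip().str.upper()
    out["dob"] = pd.to_datetime(out["dob"], errors="coerce")
    return out

def link(medicaid: pd.DataFrame, registry: pd.DataFrame) -> pd.DataFrame:
    """Join records that agree exactly on name and date of birth."""
    key = ["last_name", "first_name", "dob"]
    return normalize(medicaid).merge(
        normalize(registry), on=key, how="inner",
        suffixes=("_medicaid", "_registry"),
    )

# Records left unmatched by this exact pass would typically move on to
# fuzzier passes (e.g., DOB plus a phonetic encoding of the last name).
```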

Transforming service delivery to promote quality of care

Program objectives

CMS asked States to develop projects that would test new or improved provider-based models for providing health care services to children and their families. Fourteen States fielded projects in this topic area,7 examining service delivery models in settings such as pediatric and family practices and school-based health centers (SBHCs).

State strategies

To accomplish their objectives, these demonstration States used varying combinations of the following strategies:

- Learning collaboratives, including group instruction with peer-to-peer learning opportunities, in-person meetings, and web-based learning sessions.
- Intensive one-on-one support (such as technical assistance or practice facilitation) to help practices and SBHCs develop QI teams, identify QI activities, collect and analyze data (including from EHRs) to track progress, and/or improve care coordination functions.
- Addition of new staff to perform a broad set of functions related to care coordination (such as facilitating and tracking referrals or administering screening and assessment tools) and QI (such as overseeing data collection and chart reviews or creating and maintaining registries).
- Stipends or other payments to support staff time and compensate practices for the loss of billable hours while working on QI activities.
- Training and certification, such as providing credit toward maintenance of certification (MOC) requirements for participation in learning activities.
- Guidance in the steps needed to obtain recognition as a patient-centered medical home (PCMH).
- Efforts to engage families in QI activities, such as financial support for parent advisors whose role was to assist practices in their QI efforts.

Lessons learned

To make progress in transforming service delivery systems, States will need a combination of strategies, such as learning collaboratives, direct facilitation of practice-level changes (for example, technical assistance to help practices develop performance data), and payments to practices to support staff time for implementing new QI efforts.

Specifically, analysis of the projects implemented by the 14 demonstration States working in this area yielded the following insights:

- Learning collaboratives can be a useful means for supporting practice transformation, but only when providers play major roles in selecting topics and structuring the sessions.

- Practices need a variety of supports to remain engaged in learning collaboratives and other QI activities (for example, technical assistance, practice facilitators, stipends, MOC credits). States also can use web-based learning sessions to supplement or replace in-person meetings to make attendance easier, especially for practices in rural or frontier communities.
- With encouragement from the State, practices used a self-administered assessment of medical homeness that tracked changes over time and helped focus QI activities on the areas most in need of attention.
- Most practices lack the technical competencies to gather the data needed to implement and track practice-level QI efforts. Although learning collaboratives can help build providers' capacity, not all practices want to improve data collection and measurement skills; some view the burden of data collection and measurement activities as outweighing the benefits.
- Some States hired practice facilitators (sometimes called QI specialists or coaches) to help practices and SBHCs develop QI teams, identify and undertake QI activities, and collect and analyze data to track progress. To be effective, practice facilitators need to: (1) possess strong interpersonal skills that support practice engagement; (2) have technical knowledge in quality measurement, QI strategies, and clinical content areas; and (3) have caseloads that permit them to spend sufficient time with a practice or SBHC.
- SBHCs may have limited experience in engaging youth in discussions about their own health and health care. States can help SBHCs by hiring youth engagement specialists, who can assist in hosting workshops for youth and health literacy training for SBHC staff, and practice facilitators, who can help gather and review data to inform SBHCs' clinical services.
- Developing sustainable methods for systematically engaging families and youth is challenging. For example, four States used demonstration funds to find and pay parent advisors to help practices with their QI activities but did not continue financial support for this effort after the demonstration period.
- Allowing practices to hire care coordinators directly (instead of the State hiring them centrally) better supported integration of these staff into daily operations; practices could select individuals with the credentials, demeanor, and communication style that best fit their needs and culture.
- States and practices raised concerns about their ability to fund care coordinator and practice facilitator positions, or to continue their participation in QI activities, after the demonstration grant period ends. New grant or demonstration funds, or payment mechanisms that include reimbursement for care coordination and QI-related activities, may help practices and SBHCs sustain these activities.

South Carolina convened a learning collaborative to help 18 child-serving practices build their QI capacity. Demonstration staff used in-person learning sessions, conference calls, and one-on-one support to help practices select, implement, and monitor QI initiatives of their choosing. Practices reported that they appreciated the flexibility to establish their own QI priorities and placed a high value on learning from other practices. As a result of their participation, practices reported using additional developmental and psychosocial screenings, providing oral health preventive services more regularly, and improving adherence to care guidelines for chronic conditions.

Colorado and New Mexico hired QI coaches and provided stipends to help SBHCs carry out QI projects. While working with the first of three cohorts of SBHCs, demonstration staff realized that supporting the SBHCs took more time and resources than anticipated. As a result, each State worked with 11 SBHCs instead of 17, as originally planned. The participating SBHCs pursued a variety of QI activities, including increasing the percentage of adolescents receiving all recommended Early and Periodic Screening, Diagnosis, and Treatment (EPSDT) services and implementing new youth engagement strategies.

Oregon, West Virginia, and Alaska used learning collaboratives, one-on-one practice facilitation, and stipends to help a total of 21 practices enhance their medical home features. As a result, all participating practices implemented new care coordination strategies, such as routinely following up with caregivers of children who were referred for specialized care or developing condition-specific care plans. Seventeen of the 21 practices hired new care coordinators to accomplish these tasks. Practices highly valued the new care coordination staff and functions. However, many practices are concerned about sustaining them after the demonstration ends because reimbursement for care coordination services for children is not currently available.

Improving service systems for youth with serious emotional disorders and their families

Program objectives

CMS awarded demonstration grants to three States (Maryland, Georgia, and Wyoming) to improve and better coordinate the diverse services that children with serious emotional disorders and their families need to function in their homes and communities.

State strategies

To accomplish their objectives, States used the demonstration funds to develop new care management entities (CMEs), improve existing ones, or explore methods for sustaining them. CMEs are a combined service delivery and payment model for integrating services across multiple agencies serving children with serious emotional disorders. One State used peer support training programs to help youth and caregivers develop the skills needed to support other youth with serious emotional disorders and their families.

Lessons learned

Designing or improving CMEs is a complex and lengthy undertaking. Several factors facilitate the process and help lay the foundation for strong programs. Analysis of the projects implemented by the three demonstration States working in this area yielded the following insights:

- Broad stakeholder involvement is critical to securing the cross-agency coordination and extensive youth, family, and provider involvement needed for CMEs to operate effectively. Agencies representing Medicaid, child welfare, behavioral health services, juvenile justice, social services, and education need to collaborate on the CME design process.
- Advice and assistance from experienced consultants can help States understand the array of options for designing their CMEs.
- Analyzing data on service use, cost, and eligibility from multiple agencies helps States understand how youth with serious emotional disorders received services, which in turn can inform CME design decisions. States can encounter incomplete administrative data files and difficulties in establishing interagency data-sharing agreements. Outside analysts, such as university-based researchers, can assist in the challenging task of assembling the appropriate data. (A simplified sketch of a cross-agency cost analysis follows the State examples below.)
- Engaging youth, caregivers, and family advocacy groups in curriculum development can help States create an accessible, comprehensive curriculum. Youth and caregivers who provide peer support may themselves need support if they are faced with a personal or family mental health, physical health, or other social crisis.

Maryland contracted with a team of researchers to analyze data submitted by the State's CMEs, as well as administrative data from Medicaid, child welfare, and the juvenile justice system. The researchers helped the State establish data-sharing agreements, reduce cross-system variation in the structure of service records, and improve data consistency. Although addressing these challenges caused delays, Maryland was able to assess the total cost of care across child-serving agencies and identify service gaps, opportunities for better care coordination, and incidences of psychotropic drug misuse or overuse. Over the long term, the State also expects to benefit from its new capacity for data analysis.

Georgia developed two new training curricula to prepare youth with behavioral health conditions and their caregivers to provide peer support. The State indicated that actively engaging youth and caregivers in curriculum development fostered their support for the curriculum and helped make the trainings both relevant and accessible. The State also aimed to improve access to and the quality of CME services. However, the State's ability to do so was limited by external factors, including administrative and financial changes underway in the State's Medicaid program.

Wyoming used the demonstration funds to pilot its first CME. Designing the CME took nearly 3 years, and the State faced several challenges, including child-serving agencies' lack of prior knowledge about the model and their competing job responsibilities. Dedicating staff to leading CME development, and consulting both with a contractor and with States with CME expertise, including Maryland and Georgia, helped with the design process.
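Maryland's cross-agency cost analysis suggests the general shape of this kind of work. The sketch below is a minimal, hypothetical illustration of combining cost extracts from multiple child-serving agencies to estimate total cost of care per child; the file layouts and column names are invented, and a real analysis would rest on formal data-sharing agreements and careful record linkage across systems.

```python
# Hypothetical sketch: estimating total cost of care per child across
# several agency extracts. Assumes each extract has already been linked
# to a common child identifier; all names below are placeholders.
import pandas as pd

def total_cost_per_child(extracts: dict[str, pd.DataFrame]) -> pd.DataFrame:
    """Stack per-agency cost files and sum spending by child and agency."""
    frames = []
    for agency, df in extracts.items():
        frame = df[["child_id", "paid_amount"]].copy()
        frame["agency"] = agency
        frames.append(frame)
    combined = pd.concat(frames, ignore_index=True)

    # Wide table: one row per child, one column per agency, plus a total.
    by_agency = combined.pivot_table(
        index="child_id", columns="agency",
        values="paid_amount", aggfunc="sum", fill_value=0,
    )
    by_agency["total_cost"] = by_agency.sum(axis=1)
    return by_agency.sort_values("total_cost", ascending=False)

# Example usage with hypothetical linked extracts:
# costs = total_cost_per_child({
#     "medicaid": medicaid_claims,
#     "child_welfare": cw_payments,
#     "juvenile_justice": jj_costs,
# })
```

A table like this is one way a State could surface service gaps and high-cost children who might benefit from better care coordination, as Maryland's analysis did.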

Applying health information technologies (IT) for QI

Program objectives

CMS encouraged States to develop and enhance current health IT applications, establish links among databases, provide incentives for the adoption and use of health IT, analyze health IT data, and implement QI activities based on the analyses. Federal policymakers were looking to this demonstration to provide information on the use and impact of health IT to improve child health care quality and reduce costs, and to inform technical assistance to promote broader adoption of health IT. CMS's grant solicitation required States to coordinate with other Federal grant programs underway at the time.8

State strategies

Fourteen demonstration States implemented health IT projects,9 exploring a mix of strategies for using technology to improve quality of care. Key strategies included using combinations of EHRs, personal health records (PHRs), and health information exchanges (HIEs) to support:

- Automated reporting of measures in the Child Core Set.
- EPSDT reporting.
- Clinical screening and decision support.
- Coordination among different types of providers (especially in connection with medical homes) through secure information-sharing pathways.
- Engaging consumers through patient portals and secure email.
- Adapting EHR systems to better meet the needs of child-serving practices.

Lessons learned

Implementing health IT applications to support QI for children typically takes far longer and requires more resources than program staff anticipate. In addition, new Federal guidelines and the rapid evolution of health IT added to implementation challenges for States with projects in this area. Nonetheless, some States successfully implemented focused IT applications. Analysis of the projects implemented by the 14 demonstration States working in this area yielded the following insights:

- Developing effective communication pathways between practices' EHRs and HIEs requires substantial resources dedicated to fixing interoperability problems, resolving privacy and other legal issues, and working closely with private IT vendors.

- Differences in EHR functionality, system incompatibility, and poor Internet connections made implementing QI projects challenging for some SBHCs. When these challenges can be overcome, SBHCs find it easier to collect and report data from their EHRs than from paper charts.
- In the process of working with contractors to develop an IT application, States must ensure that end users will actually use the application.
- Although the model EHR Format for children addresses many child-oriented functions, incorporating its requirements into current EHRs is likely to be challenging. Practice facilitators can help child-serving practices and health systems maximize the functionality of their EHRs. Getting EHR vendors to modify products to be more child-oriented, however, will continue to be very difficult because child-serving organizations represent a small share of EHR vendors' business.
- Helping States use health IT to improve quality of care may require a separate demonstration program that assembles a higher level of technical assistance than is feasible in a multi-faceted grant program.
- Projects involving the development of electronic screening methods were able to achieve their objectives and showed that:
  – Technology can streamline the administration of screening tools for health risks such as developmental delay or autism.
  – The use of electronic screening tools in practices and SBHCs can enhance documentation that services were provided and can support data quality, tracking, and monitoring and a higher quality of care.
  – Adolescents, families, and providers find electronic screening easy to use. Additionally, adolescents valued tablet-based screening as a way of communicating directly and privately with their clinicians. (A simplified sketch of such screening logic follows the State examples below.)

Pennsylvania, in partnership with Children's Hospital of Philadelphia and Geisinger Health System, implemented a fully electronic screening process for developmental disabilities and other conditions. This activity contributed to improved documentation of screening and laid a foundation for more consistent and rapid referrals to early intervention programs and other resources for children with positive screens. Providers reported that the screeners are useful, though some sites have been slower to integrate them than others because of EHR limitations and competing organizational priorities.

Utah and Idaho laid the groundwork for an interstate HIE. The States initially planned to link their individual HIEs to share public health information, such as immunization data. However, Utah's HIE development fell behind schedule as a result of vendor turnover, interoperability issues, and prolonged data-sharing negotiations with provider groups. In addition, CHIPRA staff in Idaho had to work with the State's legislature to overcome privacy-related legal challenges to interstate exchange. In spite of these challenges, Utah and Idaho remained committed to sharing data, so the States investigated alternative mechanisms. Ultimately, Utah was able to use direct file transfer to send records to Idaho for more than 10,000 Idaho children who had been immunized in Utah.

Practices' use of a Vermont electronic registry was limited because many providers experienced difficulty in connecting their EHRs to the system, were concerned that the system required duplicative data entry, or both. In response, State-funded practice facilitators helped practices pull reports directly from practices' EHRs.
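As a hedged illustration of the electronic screening findings above, the sketch below scores a short parent questionnaire and flags children who screen positive for follow-up. It is a hypothetical, simplified instrument: the questions, scoring rule, and threshold are invented for illustration and do not correspond to any validated screening tool, including those the demonstration States used.

```python
# Hypothetical tablet-style screening sketch: score yes/no parent
# responses and flag positive screens for referral. The items and the
# threshold are illustrative only, not a validated instrument.
from dataclasses import dataclass

QUESTIONS = [  # placeholder items; a real tool has validated wording
    "Does your child use two-word phrases?",
    "Does your child point to show you something interesting?",
    "Does your child respond to his or her name?",
]
RISK_THRESHOLD = 2  # flag if 2 or more concerning ("no") answers

@dataclass
class ScreeningResult:
    concerning_answers: int
    positive: bool

def score_screening(answers: list[bool]) -> ScreeningResult:
    """Each answer is True for 'yes'; 'no' answers count as concerning."""
    concerning = sum(1 for answer in answers if not answer)
    return ScreeningResult(concerning, concerning >= RISK_THRESHOLD)

if __name__ == "__main__":
    result = score_screening([True, False, False])
    if result.positive:
        # In a practice workflow, a positive screen would trigger a
        # documented referral (e.g., to early intervention services).
        print(f"Positive screen: {result.concerning_answers} concerns; refer.")
    else:
        print("Negative screen; continue routine surveillance.")
```

Even logic this simple suggests why electronic administration helps: the score, the flag, and the resulting referral can all be written back to the record automatically, which supports the documentation and tracking gains the States reported.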

Building partnerships to improve quality of children's health care

Program objectives

In its solicitation for grant applications, CMS encouraged multi-State partnerships to increase the number of participating States and promote the spread of knowledge and experience.

State strategies

Six of the demonstration grants were multi-State partnerships involving a total of 14 States.10 Partners used combinations of the following strategies to foster communication and collaboration:

- Hiring independent organizations to convene the partners and foster learning across States.
- Developing joint projects, integrating activities, and setting up complementary implementation schedules.
- Seque
