QbD Maturity Model Walkthrough And Scoring Examples - CTTI

Transcription

Clinical Trials Transformation Initiative (CTTI)

QbD Maturity Model Walkthrough and Scoring Examples

QbD Maturity Model Overview

[Slide graphic: the Maturity Model worksheet, which opens with the prompt "For today's assessment, what department or organizational level are you addressing?" and lists the eight Factors grouped under four Categories: Quality Culture (Awareness & Supports; Incentives), Study Design (Stakeholder Engagement; Critical-to-Quality Focus), Study Conduct (Handover from Study Design to Execution; Management of Risks to CTQs), and Continuous Improvement (Lessons Learned; Continuous Improvement Metrics).]

CTTI has developed a Maturity Model to help organizations understand and implement a Quality by Design (QbD) approach to clinical trials. The next few slides walk through the main elements of this Maturity Model one at a time. Starting on slide 8, there are examples of how to use this tool to score QbD maturity and establish priorities for improvement over time.

8 Factors (Elements) of QbD

The Maturity Model focuses on eight Factors that, taken together, represent a complete implementation of Quality by Design for clinical trials. The eight Factors are sorted into four Categories, which are directly aligned with CTTI's Recommendations for implementing QbD:

- Quality Culture: Awareness & Supports; Incentives
- Study Design: Stakeholder Engagement; Critical-to-Quality Focus
- Study Conduct: Handover from Study Design to Execution; Management of Risks to CTQs
- Continuous Improvement: Lessons Learned; Continuous Improvement Metrics

Each Factor Scored as Level 1-5

Each Factor can be scored from Level 1 to Level 5. In general:

- Level 1 implies little or no intentional implementation of QbD principles.
- Levels 2-4 imply increasingly complete and effective implementation.
- Level 5 describes an idealized state of complete implementation, along with continuous improvement efforts.

As shown on the next slide, each Level is described in text in an accompanying table.

Example: Level Descriptions for Two Factors

This slide shows an example of how the Levels for two different Factors (Stakeholder Engagement and Critical-to-Quality Focus) are defined in the Maturity Model. This is comparable to a 'scoring rubric': an organization would look at the text for each Level and decide, for a given Factor, which best describes its current state. In the full tool, similar text is provided for all eight Factors.

Stakeholder Engagement:
- Level 1 (Ad hoc): Study designed with input primarily from protocol writing team.
- Level 2 (Early): Study design considers some, but not all, stakeholders' needs.
- Level 3 (Developing): Study design identifies and considers all stakeholders' needs; not all stakeholders directly engaged.
- Level 4 (Implementing): Study design includes direct engagement with all stakeholders from earliest stages of study planning.
- Level 5 (Optimizing): Study design collaboratively considers needs of all stakeholders; periodically updating understanding of who the stakeholders are, across the research enterprise, and their current needs.

Critical-to-Quality Focus:
- Level 1 (Ad hoc): Protocols include data collection not necessary for patient safety or credibility of findings; critical-to-quality factors (CTQs) not formally identified; operational implications of protocol not fully considered.
- Level 2 (Early): Data collection considered against study objectives, but non-essential endpoints and assessments remain; CTQs and associated risks to study quality discussed, but not systematically addressed; operational implications often not considered until protocol is near-final.
- Level 3 (Developing): All endpoints and assessments considered against scientific rationale, but other factors may still drive decisions; formal process in place for identifying and addressing CTQs; operational implications considered from early stages of protocol design.
- Level 4 (Implementing): Study design process enforces strong justification for any study endpoints and assessments beyond the most fundamental; CTQs systematically identified and addressed in protocol design, operational planning, and risk management and monitoring.
- Level 5 (Optimizing): Study design is as simple as possible, with complexity proportionate to objectives; protocol and supporting documents simplified and streamlined, and all protocol-specific training aligned with CTQs; study-specific risks proactively identified, updated, and controlled throughout the study lifecycle.

Maturity 'Scores' Can Be Tracked Over Time

For each Factor, the Level that the organization is at becomes its 'score'. For example, if an organization is at Level 3 on Incentives, then its score for that Factor would be 3.

The Maturity Model also includes an area to record scores for each Factor. If helpful, an intermediate score can be used. For example, if an organization has some elements of Level 3 and some of Level 4, it might decide on a score of 3.5.
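The scoring scheme described above is simple enough to capture in a small data structure if a team wants to keep its scores somewhere other than the printed worksheet. The sketch below is illustrative only and is not part of the CTTI tool: it records one score per Factor, allows half-point intermediate scores such as 3.5, and checks that scores stay within Level 1-5. The Factor and Category names come from the Maturity Model; the function and variable names are assumptions made for this example.

```python
# Illustrative sketch (not part of the CTTI tool): record one maturity score
# per Factor, allowing half-point intermediate scores such as 3.5.

FACTORS_BY_CATEGORY = {
    "Quality Culture": ["Awareness & Supports", "Incentives"],
    "Study Design": ["Stakeholder Engagement", "Critical-to-Quality Focus"],
    "Study Conduct": ["Handover from Study Design to Execution",
                      "Management of Risks to CTQs"],
    "Continuous Improvement": ["Lessons Learned", "Continuous Improvement Metrics"],
}
ALL_FACTORS = [f for factors in FACTORS_BY_CATEGORY.values() for f in factors]


def validate_score(score: float) -> float:
    """A score is the Level reached (1-5); half levels such as 3.5 are allowed."""
    if not 1 <= score <= 5:
        raise ValueError(f"score must be between 1 and 5, got {score}")
    if score * 2 != int(score * 2):
        raise ValueError(f"use whole or half levels (e.g. 3 or 3.5), got {score}")
    return score


def record_assessment(scores: dict[str, float]) -> dict[str, float]:
    """Validate a {factor: score} mapping against the eight Factors."""
    unknown = set(scores) - set(ALL_FACTORS)
    if unknown:
        raise ValueError(f"unknown factors: {unknown}")
    return {factor: validate_score(score) for factor, score in scores.items()}


# Example from this slide: Level 3 on Incentives, plus an intermediate 3.5
# on Stakeholder Engagement (some elements of Level 3, some of Level 4).
print(record_assessment({"Incentives": 3, "Stakeholder Engagement": 3.5}))
```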

Score the Whole Organization or Departments

QbD maturity scores can be determined for the organization as a whole, or for particular business units or departments. It is possible and even likely for different business units or departments to be at different Levels on each QbD maturity Factor. So, when using the QbD Maturity Model, it is important to:

1. Start by determining what the unit of assessment will be (whole organization, particular business unit, etc.).
2. Score based on the typical or average experience for the selected unit of assessment.
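Continuing the illustrative sketch above (again, not part of the CTTI tool), one simple way to keep assessments separate per unit of assessment, and to track them over time, is to key each score set on the unit and the assessment date. The unit names, dates, and scores below are made up purely for illustration.

```python
# Illustrative sketch: separate score sets per unit of assessment and per date,
# so different departments can be compared and tracked over time.
from datetime import date

# {(unit_of_assessment, assessment_date): {factor: score}}
assessments: dict[tuple[str, date], dict[str, float]] = {}


def add_assessment(unit: str, when: date, scores: dict[str, float]) -> None:
    """Record one set of Factor scores for a chosen unit of assessment."""
    assessments[(unit, when)] = scores


# Different units may sit at different Levels on the same Factor.
add_assessment("Whole organization", date(2024, 6, 1),
               {"Stakeholder Engagement": 3, "Critical-to-Quality Focus": 2})
add_assessment("Early-phase business unit", date(2024, 6, 1),
               {"Stakeholder Engagement": 4, "Critical-to-Quality Focus": 3})

for (unit, when), scores in assessments.items():
    print(f"{when.isoformat()}  {unit}: {scores}")
```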

Scoring Example: Study Design

This next set of slides provides an example of how an organization (or business unit, department, etc.) could use the Maturity Model to score its current implementation of QbD, as well as to plan a desired future state.

Here, the example organization has determined that it is currently at Level 3 on Stakeholder Engagement and at Level 2 on Critical-to-Quality Focus. The next two slides flesh out the rationale.

Rationale for 'Stakeholder Engagement'

[Slide repeats the Level descriptions table for Stakeholder Engagement and Critical-to-Quality Focus shown earlier, with a callout at Level 3 of Stakeholder Engagement: patients are engaged via advisory boards, but not until the protocol is nearing completion.]

In evaluating its maturity on Stakeholder Engagement, the example organization determined that:

- It does a good job of engaging the range of internal functions, and even sites and CROs;
- But it typically does not engage patients until too late in the study design process for their input to be optimally effective at helping to identify and avoid 'errors that matter'.

Note: A list of potential stakeholders to engage in study design is available at this link.

Rationale for 'Critical-to-Quality Focus'

[Slide repeats the Level descriptions table, with a callout at Level 2 of Critical-to-Quality Focus: the approach to study planning has some overlap with QbD concepts, but QbD is not formally applied.]

In evaluating Critical-to-Quality Focus, the example organization determined that:

- Its processes are aligned in principle with QbD's focus on proactively mitigating risks of 'errors that matter';
- However, work will be needed to ensure complete and systematic implementation of the key concepts (e.g., by updating SOPs and providing templates for capturing decisions about risk that are directly tied to critical-to-quality factors).

Identifying Desired Future State

[Slide repeats the Level descriptions table, now marked with the organization's Current State and its Desired State (End of Next Year) for each of the two Factors.]

Finally, the example organization discusses where it would like to be by the end of the next year. Here, the organization decided to prioritize moving both Stakeholder Engagement and Critical-to-Quality Focus to Level 4. Similar discussions could be held for all eight of the Factors in the Maturity Model.

See additional scoring examples on slides 13-19.
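As a further illustrative sketch (again, not part of the CTTI tool), current and desired scores can be compared directly to see which Factors need the most movement. The two score sets below mirror the worked example on this slide; the function name is an assumption made for this example.

```python
# Illustrative sketch: compare current scores against a desired future state
# to see which Factors have the largest gap to close.

current = {"Stakeholder Engagement": 3, "Critical-to-Quality Focus": 2}
desired = {"Stakeholder Engagement": 4, "Critical-to-Quality Focus": 4}


def improvement_gaps(current: dict[str, float],
                     desired: dict[str, float]) -> list[tuple[str, float]]:
    """Return (factor, levels still to gain), largest gap first."""
    gaps = [(factor, desired[factor] - current.get(factor, 1.0))
            for factor in desired]
    return sorted(gaps, key=lambda item: item[1], reverse=True)


for factor, gap in improvement_gaps(current, desired):
    print(f"{factor}: {gap:+.1f} levels to reach the desired state")
```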

Considerations for Using the Maturity Model

The full tool provides detailed instructions, but some of the most important considerations are identified here:

- Score based on what is typical (not best-case or worst-case).
- The discussion is more important than the number.
- Engage all stakeholders.
- Facilitate open dialogue and honest assessment.
- Plan for incremental and iterative improvement over time.

Perhaps most importantly, the Maturity Model is not a quantitative benchmarking tool for comparison between organizations, nor is it meant to provide audit or inspection standards. Rather, it is meant to support meaningful discussion within an organization about implementation gaps and opportunities for continuous improvement.

Additionally, the goals for each organization may differ: not all will strive to be Level 5 in all areas, and an organization does not have to be Level 5 in all areas to be applying QbD.

Scoring Example: Quality Culture

Continuing the scoring examples, here an example organization has determined that it is at Level 2 for Awareness & Supports. It is still at early stages of implementing Quality by Design, but has made a commitment to increasing effectiveness by:

- Conducting educational workshops for study teams as they begin formally implementing QbD approaches;
- Drafting processes for conducting multi-stakeholder discussions early in protocol development;
- Identifying a QbD 'champion' who has leadership support for carrying implementation forward over time.

Level descriptions for the Quality Culture Factors:

Awareness & Supports:
- Level 1 (Ad hoc): No QbD framework; no individuals responsible for driving QbD implementation.
- Level 2 (Early): Some awareness; piloting processes and supports (e.g., workgroups, trainings); focal point identified, but role not fully defined or communicated.
- Level 3 (Developing): Broad awareness, leadership support; processes/supports established but not organization-wide; dedicated subject matter expert(s) assigned formal responsibilities for driving implementation.
- Level 4 (Implementing): Awareness extends to partner organizations; processes/supports implemented across organization; subject matter expert(s) networked with designated contacts across internal and external stakeholders.
- Level 5 (Optimizing): QbD embedded in organizational culture and institutionalized, no longer requiring an individual focal person; processes/supports periodically reviewed and enhanced via consultation with all stakeholders.

Incentives:
- Level 1 (Ad hoc): No formal or informal incentives for implementing QbD.
- Level 2 (Early): Incentives may reward the wrong behaviors.
- Level 3 (Developing): Piloting incentives for some elements of QbD (see Recommendations).
- Level 4 (Implementing): Incentives established for most (but not all) elements of QbD, and for most (but not all) relevant stakeholders.
- Level 5 (Optimizing): Incentives for all stakeholders encourage implementation of all elements of QbD; incentives monitored for effectiveness, regularly reviewed and enhanced; incentives with unintended negative consequences have been eliminated.

Scoring Example: Quality Culture

The example organization has determined that it is at Level 1 for Incentives. [Slide callout at Level 1: the organization has not yet identified incentives to support QbD implementation.]

[Slide repeats the Quality Culture level descriptions shown above.]

One priority will be to reduce the 'Christmas tree effect': the tendency for everyone involved in study design to add their own 'ornament' to the protocol, often resulting in studies that are overburdened, unfocused, and expensive. The organization's culture implicitly rewards anyone who is seen as having an impact on study design, while those who work to keep studies streamlined are rarely recognized.

The organization plans to establish recognition programs highlighting success stories, and also plans to create individual employee objectives within its performance management system that are directly tied to QbD implementation.

Scoring Example: Study Conduct

The organization has determined that it is at Level 3 for Handover from Study Design to Execution. [Slide callout at Level 3: the list of CTQs, risks, and mitigations is provided, but the rationale and quality implications are not always clear.]

Clinical operations staff and partners (CRO, sites, vendors, etc.) are consistently provided a list of identified critical-to-quality factors (CTQs), their associated risks, and mitigation strategies for the study. However, because most of these staff do not directly participate in design-stage discussions, they lack full insight into why these CTQs are particularly relevant and do not fully leverage these insights in their operational processes, plans, and priorities.

The organization is exploring opportunities for broader engagement of operational partners in study design, as well as better documentation and trainings that concisely convey the underlying rationale and potential operational implications.

Level descriptions for the Study Conduct Factors:

Handover from Study Design to Execution:
- Level 1 (Ad hoc): Incomplete transfer of responsibilities to those responsible for study execution and oversight.
- Level 2 (Early): Transfer is complete, but directive rather than interactive ('thrown over the wall').
- Level 3 (Developing): Transfer is complete and provides some big-picture understanding (but not always enough to facilitate problem solving).
- Level 4 (Implementing): Full transfer to all stakeholders in a way that facilitates problem solving (each role understands what it needs to do and why).
- Level 5 (Optimizing): Full transfer via partnership model, including engagement from earliest stages of study and even program design.

Management of Risks to CTQs:
- Level 1 (Ad hoc): Quality management not tied to risks to CTQs; changes to protocol or trial oversight often not based on addressing risks to CTQs.
- Level 2 (Early): Risk-informed quality management loosely tied to CTQs.
- Level 3 (Developing): Risk-informed quality management moderately tied to CTQs; some changes to protocol and trial oversight based on addressing risks to CTQs; continued relevance of CTQs sometimes assessed during study conduct.
- Level 4 (Implementing): Risk-informed quality management directly and strongly, but not fully, tied to CTQs; most changes to protocol and trial oversight directly address risks to CTQs.
- Level 5 (Optimizing): Risk-informed quality management directly and fully tied to CTQs; CTQs regularly assessed and risk mitigation strategies updated across the study lifecycle; all appropriate stakeholders engaged in decision-making.

Scoring Example: Study Conduct

The organization has determined that it is at Level 2 for Management of Risks to CTQs. [Slide callout at Level 2: QbD and risk-based oversight are largely handled as parallel processes.]

[Slide repeats the Study Conduct level descriptions shown above.]

Although it uses a risk-based monitoring approach, trial oversight plans leverage generic Key Risk Indicators and Quality Tolerance Limits that are not explicitly derived from identification of the critical-to-quality factors relevant to a specific study.

The organization would like to reach a level of maturity at which study teams design targeted monitoring and other oversight plans to proactively address those risks to critical-to-quality factors (CTQs) that could not be eliminated by changing the study design.

Processes will also be put in place to regularly assess the continued relevance of CTQs, and the appropriateness of associated risk mitigation strategies, during study conduct.

Scoring Example: Continuous Improvement

The organization has determined that it is at Level 3 for Lessons Learned. [Slide callout at Level 3: strong commitment to ensuring each study learns from those that came before, but technological and cultural barriers need to be addressed.]

It has put processes in place to document decisions made during study design about critical-to-quality factors and associated risks, as well as to review and assess these decisions at the end of each study. These lessons learned are consistently captured in a standard format (a study team decision log) that facilitates understanding by all functions and roles (not just Quality), including individuals not involved with the study.

However, this information is not kept in a central, easily accessible repository; it typically is not shared with operational partners such as the CRO; and in the rush to enroll the first patient, study teams often fail to review this information in planning their trial.

Level descriptions for the Continuous Improvement Factors:

Lessons Learned:
- Level 1 (Ad hoc): Informal review and dissemination of lessons learned at end of study; lessons learned do not consistently inform future studies.
- Level 2 (Early): Study 'after-action' reviews QbD elements (e.g., right CTQs, appropriate mitigation strategies, unanticipated risks).
- Level 3 (Developing): Lessons learned often inform future studies, but substantial barriers remain (e.g., data incomplete, siloed, or difficult to access).
- Level 4 (Implementing): Lessons learned are systematically and collaboratively captured and shared across stakeholders; study design consistently incorporates lessons learned.
- Level 5 (Optimizing): Organizational culture, technology, and systems fully support rapid incorporation of lessons learned into quality planning of all future trials.

Continuous Improvement Metrics:
- Level 1 (Ad hoc): Quality of studies is inconsistently measured and difficult to predict.
- Level 2 (Early): Some appropriate outcome and process metrics identified for monitoring QbD implementation at organizational level.
- Level 3 (Developing): Range of appropriate metrics tracked, though output not consistently used; study quality tending to improve.
- Level 4 (Implementing): Metrics regularly reviewed and updated in alignment with an evolving strategic plan for QbD implementation that incorporates all stakeholder needs and perspectives; consistent quality improvements over long term.
- Level 5 (Optimizing): Quality consistently improving across partner organizations, on meaningful metrics established with input from a broad range of stakeholders.

Scoring Example: Continuous Improvement

The organization has determined that it is at Level 3 for Continuous Improvement Metrics. [Slide callout at Level 3: good metrics in place, though primarily internally focused and not always used appropriately.]

[Slide repeats the Continuous Improvement level descriptions shown above.]

It has identified a reasonable set of metrics that are relevant to Quality by Design, feasible to track via an organizational dashboard, and reviewed at intervals by organizational leadership. As a result, implementation of QbD is becoming more consistent, and the quality of the organization's studies has been improving.

However, these metrics are not shared with operational partners and are not designed to help operational partners improve. There are also times when the organization uses the metrics inappropriately, such as setting expectations with study teams that all studies should be completed faster than the historical average.
