Presentation: Data Management And Integrity

Transcription

Data Management and Integrity – Laboratory Practices
Matt Davis and Gaye Camm
Senior Inspectors, Manufacturing Quality Branch
Medical Devices and Product Quality Division, TGA
21 November 2019

What is data integrity?
Data integrity is the extent to which data is:
– Complete
– Consistent
– Accurate
throughout the data lifecycle:
– Initial generation and recording
– Processing
– Use
– Retention, archiving, retrieval and destruction
(PIC/S Good Practices for Data Management and Integrity, PI 041)

Creating the right environment
Data management controls embedded in the PQS:
– System design to ensure good DI practices
– QRM approach to data integrity
– Ongoing risk review of data criticality/risk
– Self-inspection
Clear understanding of the importance of data integrity at all levels of the organisation
Internal reporting is encouraged
Mature, open management approach to data integrity
Rationalisation

Risk management approach to data integrity
Data criticality:
– CQA, batch release data, cleaning records
– Data relating to product quality/safety
Data risk:
– Vulnerability of data to alteration, deletion, recreation, loss or deliberate falsification
Outcome – an effective control strategy to manage identified risks

ALCOA principles
Attributable – clearly indicates who recorded the data or performed the activity; signed/dated; who wrote it / when.
Legible – it must be possible to read or interpret the data after it is recorded; permanent; no unexplained hieroglyphics; properly corrected if necessary.
Contemporaneous – data must be recorded at the time it was generated; close proximity to occurrence.
Original – data must be preserved in its unaltered state; if not, why not; certified copies.
Accurate – data must correctly reflect the action/observation made; data checked where necessary; modifications explained if not self-evident.
Plus: Complete, Consistent, Enduring, Available.

Designing paper systems which reduce opportunities for falsification
Attributable – signature logs
Legible/Permanent – no pencil, whiteout or soluble ink; SOP for corrections
Contemporaneous – system design: documents in the right place at the right time, clocks on the wall, control of blank forms
Original – workbooks and forms controlled; verified 'true copy' scans
Accurate – reflective of the observation; data checking, raw data verification

Designing electronic systems which reduce opportunities for falsification
Attributable – user access control; e-signatures; metadata
Legible/Permanent – data security; audit trails; backup; system validation
Contemporaneous – auto-saving; stepwise recording; system clock synchronisation
Original – metadata which permits reconstruction
Accurate – data capture; manual data entry; source data and audit trail review
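To make these controls concrete, here is a minimal sketch of what an append-only audit-trail entry might capture: an attributable user, a contemporaneous timestamp from a synchronised clock, the old and new values, and a documented reason. The `AuditEntry` structure and its field names are illustrative assumptions, not a prescribed or vendor format.

```python
# Minimal sketch of an append-only audit-trail record for a lab result.
# All names (AuditEntry, record_change) are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)  # frozen: an entry cannot be edited once written
class AuditEntry:
    user_id: str    # attributable: who made the change
    action: str     # e.g. "create", "modify", "delete"
    record_id: str  # which data record was touched
    old_value: str  # state before the change
    new_value: str  # state after the change
    reason: str     # why the change was made (Annex 11 §9)
    timestamp: str  # contemporaneous, from a synchronised clock


def record_change(trail: list, user_id: str, action: str, record_id: str,
                  old: str, new: str, reason: str) -> None:
    """Append a new entry; the trail itself is never rewritten."""
    trail.append(AuditEntry(
        user_id=user_id, action=action, record_id=record_id,
        old_value=old, new_value=new, reason=reason,
        timestamp=datetime.now(timezone.utc).isoformat()))


trail: list = []
record_change(trail, "analyst01", "modify", "assay-0042",
              old="98.7", new="99.1", reason="transcription error corrected")
```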

Overarching DMDI policy
– Procedures
– Audit trails
– Secure data retention
– Time/date stamps
– Traceable
– Correct movement of data
– Employee adherence
– Periodic review
– Security
– Detecting non-compliance
– Validation
– Accurate and complete data
– Unique user names/passwords/biometrics
– Prevent unauthorised changes
– Independent review
– Detecting wrongful acts
– Training
– Control of outsourced activities
– Corrective actions

Analytical laboratories – common concerns and controls
DI expectations:
– Lab electronic systems
– Data review processes
– Audit trail review processes
– Manual integration
– Analyst training
– Spreadsheet management
– Test injections

Laboratory electronic systems
Validation:
– Software validation
– Hardware qualification
– Configuration management
– Change management
– Periodic system review
Configuration:
– Audit trails
– OS security
– Data backup/archiving
– Test method configuration
User access:
– SOPs for user access control
– Individual user access
– Defined user privileges
– System administrator
– E-signatures
Data management:
– Data review SOPs
– Raw data verification
– External calculation tools
– Audit trail review
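As a small illustration of "defined user privileges" from the list above, the sketch below maps hypothetical roles to the actions they are explicitly allowed to perform; the role and action names are assumptions for illustration only.

```python
# Sketch of "defined user privileges": each role maps to the actions it
# may perform; role names and actions are illustrative assumptions.
ROLE_PRIVILEGES = {
    "analyst": {"acquire_data", "process_data"},
    "reviewer": {"review_data", "review_audit_trail"},
    "sys_admin": {"manage_accounts", "change_configuration"},
}


def is_permitted(role: str, action: str) -> bool:
    """Return True only if the action is explicitly granted to the role."""
    return action in ROLE_PRIVILEGES.get(role, set())


# Segregation of duties: an analyst may not alter the configuration.
assert not is_permitted("analyst", "change_configuration")
assert is_permitted("reviewer", "review_audit_trail")
```

Granting nothing by default keeps analyst and system administrator duties segregated, which is the point of individual access with defined privileges.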

Data review processes
Consider electronic and paper-based records.
A clear SOP is required for data review:
– frequency, roles, responsibilities and approach to risk-based review of data (and metadata, including audit trails as relevant)
– ensure that the entire set of data is considered in the reported data; this should include checks of all locations where data may have been stored, including locations where voided, deleted, invalid or rejected data may have been stored
Who-When-What-How:
– Who collected the data, when was it collected, what was collected, and how?
– Who, when, what and how was the data processed?
– Who, when, what and how was the data reviewed?
– Who, when, what and how was the data reported?
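The Who-When-What-How questions lend themselves to a simple completeness check. The sketch below flags any lifecycle stage where one of the four questions is unanswered; the dictionary layout is an illustrative assumption, a sketch of the idea rather than a prescribed review tool.

```python
# Sketch of a Who-When-What-How completeness check across the lifecycle
# stages named on the slide; the record layout is an assumption.
LIFECYCLE_STAGES = ("collected", "processed", "reviewed", "reported")
QUESTIONS = ("who", "when", "what", "how")


def missing_answers(record: dict) -> list:
    """List every who/when/what/how question left unanswered."""
    gaps = []
    for stage in LIFECYCLE_STAGES:
        answers = record.get(stage, {})
        for q in QUESTIONS:
            if not answers.get(q):
                gaps.append(f"{stage}: {q}?")
    return gaps


record = {"collected": {"who": "analyst01", "when": "2019-11-21",
                        "what": "assay-0042", "how": "HPLC run 7"}}
print(missing_answers(record))  # flags every unanswered question for the
                                # processed/reviewed/reported stages
```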

Audit trail review processes
Technical controls to aid secondary review of audit trails:
– Identifying data that has been changed or modified
– Review by exception – only look for anomalous or unauthorised activities
– Limiting permissions to change recipes/methods/parameters, or locking access to specific parameters or whole recipes/methods, may negate the need to examine the associated audit trails in detail, as changes can be easily observed, restricted or prohibited
– Whatever activities are left open to modification need to be checked
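A hedged sketch of "review by exception": rather than re-reading a whole audit trail, filter it down to watch-listed actions and to entries lacking a documented reason. The entry fields and the watch-list are illustrative assumptions.

```python
# Sketch of "review by exception": reduce an audit trail to the entries
# that warrant close inspection. Entry fields and the action watch-list
# are illustrative assumptions, not a vendor format.
ANOMALOUS_ACTIONS = {"delete", "modify", "method_change"}


def entries_for_review(trail: list) -> list:
    """Keep entries on the watch-list or lacking a documented reason."""
    return [e for e in trail
            if e["action"] in ANOMALOUS_ACTIONS or not e["reason"].strip()]


trail = [
    {"action": "create", "reason": "new sample set"},
    {"action": "delete", "reason": "duplicate injection logged in error"},
    {"action": "modify", "reason": ""},  # undocumented change: flag it
]
print(entries_for_review(trail))  # the delete and the reason-less modify
```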

Audit trail reviews – common issues
Reasons for change or deletion of GMP-relevant data not documented.
Annex 11 §9: for change or deletion of GMP-relevant data, the reason should be documented.
– A method to record legitimate changes to data needs to be considered when doing audit trail review, or covered by SOP (e.g. allowable changes to methods).
– An explanation is needed for ALL data recorded (complete data), including results that are not reported.
– Deviations from standard procedures or atypical results should be investigated.

Reasons for changes to data
There may be valid reasons for invalidating data and repeating acquisition, e.g. equipment malfunction, incorrect sample/standard preparation, power failure.
– Invalid runs, failures, repeats and other atypical data need to be recorded and investigated, and may require CAPA.
– All data should be included in the dataset unless there is a documented scientific explanation for their exclusion.
– Such data should possibly be reviewed for trends at some timepoint.
– Laboratories may need a new SOP in addition to standard OOS/OOT procedures.
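One way to encode the rule that nothing is excluded without a documented scientific explanation is sketched below; the function and field names are assumptions for illustration.

```python
# Sketch: no result leaves the reported dataset without a documented
# scientific explanation; names and fields are illustrative assumptions.
def reportable_results(results: list) -> list:
    """Exclude only results that carry a documented invalidation reason;
    raise if a result is flagged invalid with no explanation."""
    reportable = []
    for r in results:
        if not r.get("invalidated"):
            reportable.append(r)
        elif not r.get("explanation"):
            raise ValueError(f"{r['id']}: invalidated without documented reason")
    return reportable


results = [
    {"id": "run-1", "value": 99.2},
    {"id": "run-2", "value": 57.0, "invalidated": True,
     "explanation": "power failure mid-run, deviation DEV-123"},
]
print(reportable_results(results))  # run-1 only; run-2 excluded with a reason
```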

Audit trail challenges
Use of paper copies for review of electronic systems: many laboratories create paper copies of electronic records in the form of reports and rely on these to conduct the audit trail review. These reports can be huge (e.g. 80 pages per chromatographic run when audit trails are included), and there is a significant risk that critical notifications are lost in the sea of data.

Recording of audit trail review
How do manufacturers demonstrate they have reviewed audit trails?
Documentation of audit trail reviews should be performed in a similar way to documenting any review process. This is typically done by signing the results as 'reviewed' or 'approved', following a data review SOP. For electronic records, it is typically signified by electronically signing the electronic data set that has been reviewed and approved.
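As an illustration of electronically signing a reviewed data set, the sketch below binds the reviewer, the meaning of the signature and the time to a fingerprint of the exact data reviewed. A real system would rely on qualified e-signature controls; this only shows the idea, and every name here is an assumption.

```python
# Sketch of signing a reviewed data set: bind signer, meaning and time to
# a fingerprint of the exact data reviewed. Illustrative only.
import hashlib
import json
from datetime import datetime, timezone


def sign_dataset(dataset: dict, reviewer: str, meaning: str = "reviewed") -> dict:
    """Return a signature record tied to this exact data set content."""
    fingerprint = hashlib.sha256(
        json.dumps(dataset, sort_keys=True).encode()).hexdigest()
    return {"reviewer": reviewer, "meaning": meaning,
            "signed_at": datetime.now(timezone.utc).isoformat(),
            "dataset_sha256": fingerprint}


signature = sign_dataset({"batch": "B123", "assay": 99.1}, "reviewer02")
print(signature)
```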

Recording of audit trail review
What about manufacturers who don't have electronic signatures available?
A hybrid approach, which is not the preferred approach, uses paper printouts of original electronic records:
– Requirements for original electronic records must still be met.
– To rely upon these printed summaries of results for future decision-making, a second person would have to review the original electronic data and any relevant metadata, such as audit trails, to verify that the printed summary is representative.
It seems unreasonable to require specific evidence of exactly which records and metadata were looked at or opened (this would constitute an audit trail of the audit trail review).

Manual integration controls
Automatic integration should be the default; manual integration should be used only where absolutely required.
An SOP for integration is required:
– Define which methods can and cannot be adjusted
– Document which actions are permissible, or restrict access to only allow the desirable actions
– Document both the original and the manually integrated chromatograms
– Electronic signatures/audit trail for manually integrated peaks
Review results to ensure compliance:
– Review reported data against electronic raw data (including audit trails)
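A minimal sketch of the documentation control for manual integration: both the original and the manually integrated areas are retained, the change is attributed, and a reason is mandatory. All names are illustrative assumptions.

```python
# Sketch: retain original and manual peak areas side by side, attributed,
# with a mandatory reason. Names are illustrative assumptions.
def record_manual_integration(peak_id: str, original_area: float,
                              manual_area: float, analyst: str,
                              reason: str) -> dict:
    """Keep both results so the reviewer can compare them to the audit trail."""
    if not reason.strip():
        raise ValueError("manual integration requires a documented reason")
    return {"peak": peak_id, "original_area": original_area,
            "manual_area": manual_area, "integrated_by": analyst,
            "reason": reason}


entry = record_manual_integration("peak-3", 10452.7, 10311.9,
                                  "analyst01", "baseline drift at 12.4 min")
print(entry)
```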

Staff training
It is essential that reviewers have a good knowledge of how the computer system or software works, and how the audit trail is designed and works together with the data.
This may require specific training in evaluating the configuration settings and reviewing electronic data and metadata, such as audit trails, for the individual computerised systems used in the generation, processing and reporting of data.
Include training on system vulnerabilities, such as data being overwritten or obscured through the use of hidden fields or data annotation tools.

Excel spreadsheets
Spreadsheets are used for managing and presenting data, and their versatility and ease of use have led to wide application. When the data contained within a spreadsheet cannot be reconstructed elsewhere and is essential to GMP activity, the data governance measures need to be rigorous.
Issues:
– If the spreadsheet has multiple users, it may be impossible to ascertain who made an entry (Attributable), whether entries have been overwritten and replaced (Permanent), and when the data entries were made (Contemporaneous).
– If the spreadsheet is not version-controlled and managed as a controlled document, there may be different versions in use (Original).
– Where formulae and other functions are used, there is potential for these to be corrupted without being detected (Accurate).
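One possible mitigation for the undetected-corruption risk, sketched under assumptions (a hypothetical file name and a placeholder checksum): verify a controlled spreadsheet template against the fingerprint registered when it was approved, before each use.

```python
# Sketch: detect silent corruption of a controlled spreadsheet template
# by comparing its hash to the value registered at approval.
# The file name and registry are illustrative assumptions.
import hashlib


def sha256_of(path: str) -> str:
    """Fingerprint a controlled spreadsheet file."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


# Checksum recorded when the template was approved as a controlled
# document (placeholder value for illustration):
REGISTERED = {"assay_calc_v3.xlsx": "<sha256 recorded at approval>"}


def template_unmodified(path: str) -> bool:
    """True only if the file still matches its approved fingerprint."""
    return REGISTERED.get(path) == sha256_of(path)
```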

Test injections
– All chromatographic systems need to equilibrate before they are ready for analysis. The time taken will typically depend on factors such as the complexity of the analysis, the age and condition of the column, and detector lamp warm-up time. Generally there will be an idea of how long this will be from the method development/validation/verification/transfer work performed in the laboratory, and this should be documented in the analytical procedure.
– Prepare an independent reference solution of the analyte(s) that will be used for the sole purpose of system evaluation. The solution container label needs to be documented to GMP standards and clearly identified for the explicit purpose of evaluating whether a chromatography system is ready for a specific analysis.
– The analytical procedure needs to allow the use of system evaluation injections. Staff need to be trained in the procedure.
– Inject one aliquot from the evaluation solution and compare with the SST criteria. Clearly label the vial in the sequence file as a system evaluation injection. If the SST criteria are met, the system is ready for the analysis.
– Upon completion of the analysis, document the number of system evaluation injections as part of the analytical report for the run.
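The final reporting step lends itself to a simple check: count the clearly labelled system evaluation injections in the sequence and carry that number into the analytical report. The sequence layout and label convention below are illustrative assumptions.

```python
# Sketch of the reporting step for system evaluation injections: count
# the clearly labelled evaluation vials in a run sequence. Field names
# and the label convention are illustrative assumptions.
EVAL_LABEL = "system evaluation"


def count_evaluation_injections(sequence: list) -> int:
    """Count injections labelled as system evaluation in the sequence."""
    return sum(1 for inj in sequence if inj["label"] == EVAL_LABEL)


sequence = [
    {"vial": "V1", "label": "system evaluation"},
    {"vial": "V2", "label": "SST standard"},
    {"vial": "V3", "label": "sample B123"},
]
print(f"System evaluation injections this run: "
      f"{count_evaluation_injections(sequence)}")
```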

Microbiological laboratories – common concerns and controls
General issues and DI expectations:
– Note: all previous comments regarding computerised systems apply
– Microbiology may present a greater DI risk

General issues observed
Manipulation of data:
– No testing conducted
– Not counting all colonies
Incomplete testing:
– Samples not taken or "lost" in transit
– No reconciliation of samples
Poor test records:
– Not recording all key test data
– Worksheets ripped up and replaced
– No reconciliation of forms used
Other issues:
– OOL data not being investigated
– Incubation conditions incorrect
– Lack of proper computerised system security
– Resampling/retesting without justification
– Using unvalidated test methods
– Colony morphology not matching identification results
Contributing causes:
– Competence/supervision
– Lack of effective controls
– Secondary checks
– Computerised system configuration
– Organisational culture / resources

DI controls – manual test methods
Sampling procedures:
– Sampling schedule/plans
– Sample forms
– Detailed collection methods
– Identity of sampler recorded
Test methods:
– Training of technicians
– Test volumes/weights recorded
– Calibrated equipment used
– Reference to all reagents
– Reference to validated methods/dilution factors
– Samples processed under clean conditions, e.g. LAF
– Negative controls for processed samples
– Identity of tester/equipment recorded
Incubation:
– Incubation records maintained
– Min/max incubation time defined and validated
– All transfers/sub-culturing recorded
– All incubated samples tagged and identified
Reading results:
– Technicians trained in detection, enumeration and morphology – clear SOPs, photos
– Controlled environment for reading: light, magnification
– Counting device used for colonies
– Clear acceptance criteria/limits
– OOL & ID policy for manual recording
– All samples reconciled
– Results recorded
– Calculations applied correctly
– Second checks and verification in accordance with quality risk management

Summary
– Review DMDI guidance and TGA policy
– Develop a DMDI policy for your organisation
– Take a risk-based approach to systems and data:
  – Data criticality
  – Review capabilities of electronic systems
– Incorporate DMDI controls into the QMS (and review!)

Where are you now?
– Unconscious incompetence: do not know about the issue and unaware of the gap
– Conscious incompetence: aware of the gap but not yet able to deal with it
– Conscious competence: getting a handle on the problem, but only with effort
– Unconscious competence: good practice becomes automatic
