Data Governance Maturity Model - OMES


DATA GOVERNANCE MATURITY MODEL

This document provides two examples of maturity assessment tools. These tools can be used to assess the maturity level of an organization's data governance program and to develop goals to guide the work of the organization's data governance program.

This document and the tools included herein are largely adapted from Stanford University's Data Governance Maturity Model and the October 17, 2011, Data Governance at Stanford Newsletter published by Stanford University. Additionally, the maturity levels were borrowed from "The IBM Data Governance Council Maturity Model: Building a roadmap for effective data governance."

Contents

Purpose of a Data Governance Maturity Model
Overview of the Data Governance Maturity Model
The Maturity Levels
The Component-Dimensions
The Data Governance Maturity Model
Guiding Questions for Each Component-Dimension
The Stanford Data Governance Maturity Measurement Tool
The Stanford Data Governance Quantitative Measurement Tool
Data Governance Maturity Model Qualitative Score Card
Using the Maturity Model to Plot for Success
Summary
Appendix A. The Basic Maturity Assessment
References

This publication is issued by the Office of Management and Enterprise Services as authorized by Title 62, Section 34. Copies have not been printed but are available through the agency website. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.

Purpose of a Data Governance Maturity Model

A maturity model is one of the most valuable tools available for planning and sustaining a new strategic program. Like the data governance (DG) program itself, the DG maturity model should be customized around the unique goals, priorities and competencies of the organization. The model included below was developed by Stanford University's Data Governance Office and can be customized to meet the needs of your organization.

A maturity model is a tool that is used to develop, assess and refine an expansive program. Because measuring performance simply through return on investment (ROI) or cost reduction is inappropriate for data governance programs, another method must be constructed to assess effectiveness. The Stanford Maturity Measurement Tool offers a robust qualitative assessment along with quantitative measures to ensure a thorough DG assessment is possible.

A significant benefit of using a maturity model is that it can consistently measure the state of a program over time. A DG program crosses functional boundaries and has a life span measured in years rather than months. Stable metrics facilitate presentation of the DG program's accomplishments to its sponsors, ensuring the sustainability of the program and demonstrating to participants that their efforts are driving organizational change.

The design of the maturity model also influences the strategic direction of the program. A maturity model is made up of levels describing possible states of the organization, where the highest levels define a vision of the optimal future state.

Because the full implementation and maturation of a DG program is a multiyear effort, the intermediate maturity states can be used to construct a program roadmap. The model not only facilitates assessment of the DG program, but also focuses attention on specific areas where actionable opportunities can be addressed rapidly (Stanford, 2011).

Overview of the Data Governance Maturity Model

The Stanford Maturity Measurement Tool contains both qualitative and quantitative metrics to track the growth of the DG practice throughout the organization.

Qualitative aspects describe characteristics of the organization at various levels of maturity. Because these are inherently subjective, the model is enriched with quantitative metrics that count activities performed, program participants and artifacts developed.

Each component-dimension's (more on this below) qualitative scale ranges from level one, representing the initial state of a data governance program, to level five, representing the objective of DG in that area of focus. An in-depth description of each qualitative maturity level is provided in the next section. The quantitative metrics are numeric measures that become applicable at each level of maturity and may be used at all maturity levels moving forward. Advancement through qualitative maturity levels can take place over a long time; quantitative metrics provide the ability to monitor intrastage growth through more granular measures (Stanford, 2011).

The Maturity Levels

Developed by the Software Engineering Institute (SEI) in 1984, the Capability Maturity Model (CMM) is a methodology used to develop and refine an organization's software development process, and it can be readily applied to an organization's DG program and processes. The CMM describes a five-level graduated path that provides a framework for prioritizing actions, a starting point, a common language and a method to measure progress. Ultimately, this structured collection of elements offers a steady, measurable progression to the final desired state of fully mature processes (IBM, 2007).
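To make the scale concrete before the level-by-level descriptions, the following is a minimal sketch, in Python, of how an assessment might record a component-dimension's qualitative level alongside the quantitative metrics the tool layers on top of it. It is an illustration only; the class and metric names are hypothetical and are not part of the Stanford or IBM models.

    from dataclasses import dataclass, field
    from enum import IntEnum

    class MaturityLevel(IntEnum):
        # The five CMM-derived levels used throughout this model.
        INITIAL = 1
        MANAGED = 2
        DEFINED = 3
        QUANTITATIVELY_MANAGED = 4
        OPTIMIZING = 5

    @dataclass
    class ComponentDimensionAssessment:
        # Hypothetical record: one qualitative level per component-dimension,
        # plus counts (activities, participants, artifacts) that track
        # intrastage growth between qualitative level changes.
        component: str
        dimension: str
        level: MaturityLevel
        metrics: dict = field(default_factory=dict)

    # Example with illustrative (made-up) counts for Awareness/People.
    assessment = ComponentDimensionAssessment(
        component="Awareness",
        dimension="People",
        level=MaturityLevel.MANAGED,
        metrics={"briefings_held": 4, "program_participants": 12},
    )

Because the qualitative level may stay flat for months, the counts in metrics give sponsors something that visibly moves between annual assessments.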

At Maturity Level 1 (Initial), processes are usually ad hoc, and the environment is not stable. Success reflects the competence of individuals within the organization rather than the use of proven processes. While Maturity Level 1 organizations often produce products and services that work, they frequently exceed the budget and schedule of their projects (IBM, 2007).

At Maturity Level 2 (Managed), successes are repeatable, but the processes may not repeat for all the projects in the organization. Basic project management helps track costs and schedules, while process discipline helps ensure that existing practices are retained. When these practices are in place, projects are performed and managed according to their documented plans, yet there is still a risk of exceeding cost and time estimates (IBM, 2007).

At Maturity Level 3 (Defined), the organization's set of standard processes is used to establish consistency across the organization. The standards, process descriptions and procedures for a project are tailored from the organization's set of standard processes to suit a particular project or organizational unit (IBM, 2007).

At Maturity Level 4 (Quantitatively Managed), organizations set quantitative quality goals for both process and maintenance. Selected subprocesses significantly contribute to overall process performance and are controlled using statistical and other quantitative techniques (IBM, 2007).

At Maturity Level 5 (Optimizing), quantitative process-improvement objectives for the organization are firmly established, continually revised to reflect changing business objectives and used as criteria in managing process improvement (IBM, 2007).

The Component-Dimensions

The Stanford Maturity Measurement Tool focuses on both foundational and project aspects of DG. The foundational components (Awareness, Formalization and Metadata) of the maturity model focus on measuring core DG competencies and the development of critical program resources.

Awareness: The extent to which individuals within the organization have knowledge of the roles, rules and technologies associated with the data governance program.

Formalization: The extent to which roles are structured in an organization and the activities of the employees are governed by rules and procedures.

Metadata: Data that 1) describes other data and IT assets (such as databases, tables and applications) by relating essential business and technical information and 2) facilitates the consistent understanding of the characteristics and usage of data. Technical metadata describes data elements and other IT assets as well as their use, representation, context and interrelations. Business metadata answers who, what, where, when, why and how for users of the data and other IT assets.

The project components (Stewardship, Data Quality and Master Data) measure how effectively DG concepts are applied in the course of funded projects (Stanford, 2011).

Stewardship: The formalization of accountability for the definition, usage and quality standards of specific data assets within a defined organizational scope.

Data Quality: The continuous process for defining the parameters for specifying acceptable levels of data quality to meet business needs, and for ensuring that data quality meets these levels (DAMA DMBOK).

Master Data: Business-critical data that is highly shared across the organization. Master data are often codified data, data describing the structure of the organization or key data entities (such as "employee").

Three dimensions (People, Policies and Capabilities) further subdivide each of the six maturity components, focusing on specific aspects of component maturation; crossing the six components with the three dimensions yields the component-dimensions the tool assesses (see the sketch following this section).

People: Roles and organization structures.

Policies: Development, auditing and enforcement of data policies, standards and best practices.

Capabilities: Enabling technologies and techniques.

It is imperative that the maturity model is finalized and adopted early in the rollout of the DG program and remains stable throughout its life. Thoughtful input from across the organization will help assure the model's long-term fitness (Stanford, 2011).
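As referenced above, the grid of components and dimensions can be enumerated directly. The following sketch (Python; purely illustrative, using only the names defined in this section) shows how the six components cross with the three dimensions to produce the 18 component-dimensions.

    foundational = ["Awareness", "Formalization", "Metadata"]
    project = ["Stewardship", "Data Quality", "Master Data"]
    dimensions = ["People", "Policies", "Capabilities"]

    # Every component is assessed along every dimension.
    component_dimensions = [
        (component, dimension)
        for component in foundational + project
        for dimension in dimensions
    ]

    assert len(component_dimensions) == 18  # 6 components x 3 dimensions

Each pair then carries its own guiding question, qualitative scale and quantitative metrics, as laid out in the sections that follow.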

The Data Governance Maturity Model

Guiding Questions for Each Component-Dimension (Stanford, 2011)

Awareness
People: What awareness do people have about their role within the data governance program?
Policies: What awareness is there of data governance policies, standards and best practices?
Capabilities: What awareness is there of data governance enabling capabilities that have been purchased or developed?

Formalization
People: How developed is the data governance organization and which roles are filled to support data governance activities?
Policies: To what degree are data governance policies formally defined, implemented and enforced?
Capabilities: How developed is the toolset that supports data governance activities and how consistently is that toolset utilized?

Metadata
People: What level of cross-functional participation is there in the development and maintenance of metadata?
Policies: To what degree are metadata creation and maintenance policies formally defined, implemented and enforced?
Capabilities: What capabilities are in place to actively manage metadata at various levels of maturity?

Master Data
People: To what degree has a formal master data management organization been developed and assigned consistent responsibilities across data domains?

The Stanford Data Governance Maturity Measurement Tool

Data Governance Foundational Component Maturity

Awareness

Level 1
People: Limited awareness of the purpose or value of the DG program.
Policies: Most existing data policies are undocumented and there may be inconsistent understanding of data policies within a department.
Capabilities: Little awareness of DG capabilities and technologies.

Level 2
People: Executives are aware of the existence of the program. Little knowledge of the program outside upper management.
Policies: Existing policies are documented but not consistently maintained, available or consistent between departments.
Capabilities: A small subset of the organization understands the general classes of DG capabilities and technologies.

Level 3
People: Executives understand how DG benefits/impacts their portion of the organization; knowledge workers are aware of the program. Executives actively promote DG within their groups.
Policies: Common data policies are documented and available through a common portal. Most stakeholders are aware of the existence of data policies that may impact them.
Capabilities: A small subset of the organization is aware of the specific DG capabilities that are available at the organization.

Level 4
People: Executives understand the long-term DG strategy and their part in it. Knowledge workers understand how DG impacts/benefits their portion of the organization. Executives actively promote DG beyond the immediate group.
Policies: All data policies are available through a common portal and stakeholders are actively notified whenever policies are added, updated or modified.
Capabilities: A targeted audience has been identified and a significant portion of that audience is aware of the DG capabilities that are available at the organization.

Level 5
People: Both executives and knowledge workers understand their role in the long-term evolution of DG. Knowledge workers actively promote DG.
Policies: A history of all data policies is maintained through a common portal and all stakeholders are made part of the policy development process.
Capabilities: A significant portion of the targeted audience understands how to utilize relevant DG capabilities that are available at the organization.

Formalization

Level 1
People: No defined roles related to DG.
Policies: No formal DG policies.
Capabilities: Classes of DG capabilities are not defined.

Level 2
People: DG roles and responsibilities have been defined and vetted with program sponsors.
Policies: High-level DG meta-policies are defined and distributed.
Capabilities: Classes of DG capabilities are defined and homegrown technical solutions are used within some organizational functions.

Level 3
People: Some roles are filled to support DG needs and participants clearly understand the responsibilities associated with their roles.
Policies: Data policies around the governance of specific data are defined and distributed as best practices.
Capabilities: Homegrown technical solutions are adopted as best practices for some classes of capabilities and made available throughout the institution.

Level 4
People: DG roles are organized into reusable schemas which are designed to support specific data and functional characteristics. There is broad (but inconsistent) participation in DG.
Policies: Data policies become official organization data policies and compliance with approved data policies is audited.
Capabilities: All defined classes of DG capabilities have an available solution.

Level 5
People: DG organizational schemas are filled as defined, meet regularly and document activities.
Policies: Compliance with official organization data policies is actively enforced by a governing body.
Capabilities: All defined classes of DG capabilities are mandatory for assigned systems or critical data.

Metadata

Level 1
People: Limited understanding of the types and value of metadata.
Policies: No metadata-related policies.
Capabilities: Metadata is inconsistently collected and rarely consolidated outside of project artifacts.

Level 2
People: Roles responsible for the production of technical metadata on structured data are defined during system design.
Policies: Metadata best practices are produced and made available. Most best practices are focused on the metadata associated with structured data.
Capabilities: Metadata templates are adopted to provide some consistency in the content and format of captured metadata. Metadata is consolidated and available from a single portal. Capabilities focus on capture of metadata of structured content.

Level 3
People: The responsibility for developing institutional business definitions and storing them in a central repository is assigned to and continually performed by subject matter experts.
Policies: Policies requiring the development of new metadata as part of system development (usually focused on structured data) are adopted as official data policies.
Capabilities: The collection of metadata on structured content is automated and scheduled extracts are performed for selected systems.

Level 4
People: Metadata collection/validation responsibilities are assigned to named individuals for all projects.
Policies: Policies requiring the regular auditing of metadata in specified systems are adopted as official organization data policies, and metadata development as part of system development is enforced.
Capabilities: A centralized metadata store becomes the primary location for all institutional metadata. Metadata is automatically collected from most relational database management systems and vendor-packaged systems.

Level 5
People: A dedicated metadata management group is created to strategically advance metadata capabilities and more effectively leverage existing metadata.
Policies: Metadata policy covers both structured and unstructured (non-tabular) data and is enforced.
Capabilities: A metadata solution provides a single point of access to federated metadata resources, including both structured and unstructured data.
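The contents list a Data Governance Maturity Model Qualitative Score Card. As one possible illustration (an assumed roll-up, not a calculation prescribed by the model), levels assessed from the tables above could be averaged per component to highlight the weakest areas when building a roadmap; every value below is hypothetical.

    from statistics import mean

    # Hypothetical assessed levels (1-5) keyed by (component, dimension).
    assessed = {
        ("Awareness", "People"): 3,
        ("Awareness", "Policies"): 2,
        ("Awareness", "Capabilities"): 2,
        ("Formalization", "People"): 2,
        ("Formalization", "Policies"): 1,
        ("Formalization", "Capabilities"): 2,
        ("Metadata", "People"): 1,
        ("Metadata", "Policies"): 1,
        ("Metadata", "Capabilities"): 2,
    }

    # Average level per component; low averages mark candidate roadmap targets.
    score_card = {}
    for (component, _dimension), level in assessed.items():
        score_card.setdefault(component, []).append(level)
    score_card = {c: round(mean(levels), 2) for c, levels in score_card.items()}

    print(score_card)  # {'Awareness': 2.33, 'Formalization': 1.67, 'Metadata': 1.33}

An organization might equally choose the minimum rather than the mean per component; a single level-1 dimension can hold back an otherwise mature component.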

Data Governance Project Component Maturity

Stewardship

Level 1
People: Almost no well-defined DG or stewardship roles or responsibilities. Data requirements are driven by the application development team.
Policies: Limited stewardship policies documented.
Capabilities: Limited stewardship capabilities are available.

Level 2
People: Business analysts drive data requirements during the design process. Definition of stewardship roles and responsibilities is limited.
Policies: Policies around stewardship are defined within a functional area.
Capabilities: A centralized location exists for consolidation of and/or access to stewardship-related documentation.

Level 3
People: All stewardship roles and structures are defined and filled but are still functionally siloed.
Policies: Stewardship policies are consistent between functions and areas.
Capabilities: Workflow capabilities are implemented for the vetting and approval of institutional definitions, business metadata and other stewardship-related documentation.

Level 4
People: The stewardship structures include representatives from multiple business functions.
Policies: Stewardship teams self-audit compliance with policies.
Capabilities: Stewardship dashboards report data quality levels and data exceptions to support the auditing of stewardship effectiveness.

Level 5
People: The stewardship board includes representatives from all relevant institutional functions.
Policies: Compliance with stewardship policies is enforced for key institutional data.
Capabilities: A common stewardship dashboard enables managed issue remediation as part of data quality reporting and data exception reporting.

Data Quality

Level 1
People: Individuals perform ad hoc data quality efforts as needed and manually fix identified data issues. Identification of data issues is based on the data's usability for a specific business task.
Policies: Data quality efforts are infrequent and driven by specific business needs. These efforts are usually large one-time data cleansing efforts.

Level 2
People: A small group of individuals are trained in and perform profiling to assess the data quality of existing systems to establish a baseline or justify a data quality project. Downstream usage of the data is considered in the issue identification process.
Policies: Best practices have been defined for some data quality related activities and are followed inconsistently.

Level 3
People: People are assigned to assess and ensure data quality within the scope of each project.
Policies: Profiling and de
