Review of Data Management Maturity Models
Alan McSweeney
Objectives
Review existing data management maturity models to identify core set of characteristics of an effective data maturity model
DMBOK (Data Management Body of Knowledge) from DAMA (Data Management Association): http://www.dama.org/i4a/pages/index.cfm?pageid=3345
MIKE2.0 (Method for an Integrated Knowledge Environment) Information Maturity Model (IMM): http://mike2.openmethodology.org/wiki/Information_Maturity_QuickScan
IBM Data Governance Council Maturity Model: http://www.infogovcommunity.com/resources
Enterprise Data Management Council Data Management Maturity Model: http://edmcouncil.org/downloads/20130425.DMM.Detail.Model.xlsx
Maturity Models (Attempt To) Measure Maturity Of Processes And Their Implementation and Operation
Processes breathe life into the organisation
Effective processes enable the organisation to operate efficiently
Good processes enable efficiency and scalability
Processes must be effectively and pervasively implemented
Processes should be optimising, always seeking improvement where possible
Growth in informal and ad hoc maturity models:
Lack rigour and detail
Lack detailed validation to justify their process structure
Not evidence based
Lack the detailed assessment structure to validate maturity levels
The concept of a maturity model is becoming devalued through overuse and wanton borrowing of concepts from ISO/IEC 15504 without putting in the hard work
How do you know you are at a given level?
How do you objectively quantify the maturity level scoring?
What are the business benefits of achieving a given maturity level?
What are the costs of achieving a given maturity level?
What work is needed to increase maturity?
Is the increment between maturity levels the same?
What is the cost of operationalising processes?
How do you measure process operation to ensure maturity is being maintained?
Are the costs justified?
What is the real value of process maturity?
October 23, 2013
ISO/IEC 15504 parts:
Part 2: A Reference Model for Processes and Process Capability
Part 3: Performing an Assessment
Part 4: Guide to Performing Assessments
Part 5: An Assessment Model and Indicator Guidance
Part 6: Guide to Qualification of Assessors
Part 7: Guide for Use in Process Improvement
Part 8: Guide for Determining Supplier Process Capability
Part 9: Vocabulary
Originally based on Software Process Improvement and Capability Determination (SPICE)
Detailed and rigorously defined framework for software process improvement
Validated
Defined and detailed assessment framework
Attribute Indicators
Parallel process reference model and assessment model
Correspondence between reference model and assessment model for process categories, processes, process purposes, process capability levels and process attributes
Best practices and management practices are assessed by indicators
Indicators are attributes whose existence shows that practices are being performed
Collect evidence of indicators during assessments
Maturity model structure:
Each maturity level (1 to N) contains a number of process areas (1 to N)
Each process area contains a number of processes (1 to N)
Each process has generic goals and specific goals
Generic goals are achieved through generic practices (1 to N); specific goals through specific practices (1 to N)
Each practice is broken down into sub-practices (1.1 to N.M)
Each process area has a number of processes
Each process has generic and specific goals and practices
Specific goals describe the unique features that must be present to satisfy the process area
Generic goals apply to multiple process areas
Generic practices are applicable to multiple processes and represent the activities needed to manage a process and improve its capability to perform
Specific practices are activities that contribute to the achievement of the specific goals of a process area
Use sub-practices and practices to assess current state of key capabilities and identify gaps Allows effective decisions to be made on capabilities that need improvement
Assessment flow: assess the current status of each sub-practice and assign a score; implemented sub-practices satisfy practices, implemented practices satisfy goals, and satisfied goals implement processes
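The roll-up above can be sketched in code. This is a minimal illustration, not part of any of the models reviewed: the 0-5 scoring scale and the use of the mean as the aggregation rule are assumptions, and the hierarchy names are hypothetical.

```python
# Hypothetical roll-up of assessment scores: sub-practices are scored
# directly (assumed scale 0-5); practices, goals and processes take the
# mean of their children's scores.
from statistics import mean

def roll_up(node):
    """Return the score of a node in the assessment hierarchy.

    A node is either a leaf score (a number assigned to a sub-practice)
    or a dict of named children whose scores are averaged.
    """
    if isinstance(node, dict):
        return mean(roll_up(child) for child in node.values())
    return node

# Example: one process with one goal, two practices, four sub-practices
process = {
    "goal-1": {
        "practice-1": {"sub-practice-1.1": 3, "sub-practice-1.2": 5},
        "practice-2": {"sub-practice-2.1": 2, "sub-practice-2.2": 4},
    }
}

print(roll_up(process))  # mean of practice scores (4, 3) = 3.5
```

A real assessment would weight practices by importance rather than use a plain mean; the recursive structure stays the same.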
Hierarchy: each process contains goals, each goal contains practices, and each practice contains sub-practices
Maturity Levels
Maturity levels define a staged path for evolving improvements in the processes associated with what is being measured
Staged or continuous
The staged method uses the maturity levels of the overall model to characterise the state of an organisation's processes
Spans multiple process areas
Focuses on overall improvement
Measured by maturity levels
The continuous method uses capability levels to characterise the state of an organisation's processes for individual process areas
Looks at individual process areas
Focuses on achieving specific capabilities
Measured by capability levels
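The staged/continuous distinction can be made concrete with a small sketch. The process area names, the 0-5 capability scale and the "organisation is at level N only if every area has reached level N" rule are assumptions for illustration, not taken from any of the models reviewed.

```python
# Hypothetical contrast of continuous vs staged reporting over the same
# assessment data: capability level per process area (assumed scale 0-5).
capability = {
    "Data Governance": 3,
    "Data Quality": 2,
    "Metadata Management": 1,
}

# Continuous view: report the full profile of capability levels per area.
continuous_view = dict(sorted(capability.items()))

# Staged view: collapse to one organisation-wide maturity level. Assumed
# rule: the organisation is at level N only if every area is at least N.
staged_level = min(capability.values())

print(continuous_view)  # per-area profile, e.g. governance ahead of metadata
print(staged_level)     # single number: held back by the weakest area
```

The continuous view shows where to invest; the staged view shows how the weakest process area caps the overall maturity claim.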
Capability levels:
Level 0: Incomplete
Level 1: Performed
Level 2: Managed
Level 3: Defined
Level 1: Initial
Level 2: Managed - disciplined approach to processes
Level 3: Defined - common standards exist that are customised, ensuring consistency
Level 4: Quantitatively Managed - standard approach to measurement; processes are controlled and predictable
Level 5: Optimising - processes link to overall organisation objectives; continual self-improvement
In a staged maturity model, to be at maturity level N means that all processes in previous maturity levels have been implemented:
Maturity Level 1: (none)
Maturity Level 2: Processes 2.1, 2.2, 2.3, 2.4
Maturity Level 3: Processes 3.1, 3.2, 3.3
Maturity Level 4: Processes 4.1, 4.2, 4.3, 4.4
Maturity Level 5: Processes 5.1, 5.2
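The staged rule, that level N requires all processes at every lower level to be implemented, can be sketched directly. The process identifiers mirror the generic 2.1-5.2 labelling above; the function itself is an illustration, not part of any reviewed model.

```python
# Hypothetical check of a staged maturity model: an organisation is at
# maturity level N only when every required process at levels 2..N has
# been implemented (level 1 has no required processes).
required = {
    2: ["2.1", "2.2", "2.3", "2.4"],
    3: ["3.1", "3.2", "3.3"],
    4: ["4.1", "4.2", "4.3", "4.4"],
    5: ["5.1", "5.2"],
}

def maturity_level(implemented):
    """Highest level whose required processes, and those of all lower
    levels, are all in the implemented set."""
    level = 1
    for n in sorted(required):
        if all(p in implemented for p in required[n]):
            level = n
        else:
            break  # a gap at this level blocks all higher levels
    return level

done = {"2.1", "2.2", "2.3", "2.4", "3.1", "3.2"}
print(maturity_level(done))  # level 3 is incomplete, so the result is 2
```

Note how partially completing level 3 earns nothing in a staged model: the organisation stays at level 2 until every level 3 process is in place.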
Maturity levels, each grouping its associated processes:
Level 1: Initial
Level 2: Managed
Level 3: Defined
Level 4: Quantitatively Managed
Level 5: Optimising
Information lifecycle stages: Secure, Store, Replicate and Distribute; Present, Report, Analyse, Model; Preserve, Protect and Recover; Archive and Recall; Delete/Remove - surrounded by the continuous activities Define, Design, Implement, Measure, Manage, Monitor, Control, Fund, Staff, Train and Administer, Standards, Governance
Data management maturity is about having the overarching skills to handle change, perform research, adopt suitable and appropriate new technologies and deliver a service and value to the underlying business. There is no point in talking about Big Data when your organisation is no good at managing little data.
What Processes Are Needed To Implement the Stages in the Information Lifecycle Effectively?
Information lifecycle stages: Secure, Store, Replicate and Distribute; Present, Report, Analyse, Model; Preserve, Protect and Recover; Archive and Recall; Delete/Remove - surrounded by the continuous activities Define, Design, Implement, Measure, Manage, Monitor, Control, Fund, Staff, Train and Administer, Standards, Governance
Information lifecycle management needs to span different types of data that are used and managed differently and have different requirements
Operational Data: data associated with operational/real-time applications
Master and Reference Data: maintaining a system of record or reference for enterprise master data used commonly across the organisation
Analytic Data: data warehouse/business intelligence/analysis-oriented applications
Unstructured Data: documents and similar information
How well do you implement information management?
Where are the gaps and weaknesses?
Where do you need to improve?
Are your structures and policies sufficient for your needs?
DAMA DMBOK
Data Governance
Data Architecture Management
Data Development
Data Operations Management
Data Security Management
Reference and Master Data Management
Data Warehousing and Business Intelligence Management
Document and Content Management
Metadata Management
Data Quality Management
All are very different. All contain gaps; none is complete. None links to an information management lifecycle.
Architect, Budget, Plan, Design and Specify
Implement Underlying Technology
Enter, Create, Acquire, Derive, Update, Integrate, Capture
Secure, Store, Replicate and Distribute
Present, Report, Analyse, Model
Preserve, Protect and Recover
Archive and Recall
Delete/Remove
Define, Design, Implement, Measure, Manage, Monitor, Control, Staff, Train and Administer, Standards, Governance, Fund
Accountability & Responsibility
Roles & Structures
Resource Commitment
Standards & Disciplines
Security
Processes
Reporting
Certification
Training and Accountability; Design Requirements; Process and Technology; Access Control; Identity Requirements; Integration; Evaluation & Measurement; Remediation & Reporting
Business Value Reporting; Organisational Awareness; Consistency (Format & Semantics); Business Value; Ownership (Roles & Responsibilities); Collection Automation; Reporting Automation
Policy: Common Data Model; Communication Plan; Data Integration (ETL & EAI); Data Ownership; Data Quality Metrics; Data Quality Strategy; Data Standardisation; Executive Sponsorship; Issue Identification
Technology: B2B Data Integration; Cleansing; Common Data Model; Common Data Services; Data Analysis; Data Capture; Data Integration (ETL & EAI); Data Quality Metrics; Data Standardisation
Compliance: Audits; Metadata Management; Data Quality Metrics; Data Analysis; Security; Issue Identification
Measurement: Data Quality Metrics; Dashboard (Tracking / Trending); Data Analysis
Process/Practice: Audits; Benchmarking; Cleansing; Profiling / Measurement; Common Data Model; Metadata Management; Communication Plan; Dashboard (Tracking / Trending); Data Analysis; Data Capture; Data Integration (ETL & EAI); Data Ownership; Data Quality Metrics; Data Quality Strategy; Data Standardisation; Data Stewardship; Data Validation; Executive Sponsorship; Issue Identification; Master Data Management; Platform Standardisation; Privacy; Root Cause Analysis; Security; Service Level Agreements; B2B Data Integration; Data Subject Area Coverage
DQ Requirements
Data Security Standards; Data Integration Architecture; RMD Data Security Management Controls and Procedures; Users, Passwords, Match Rules and Groups; Data Access Views and Permissions; User Access Behaviour; Information Confidentiality; Audit Data Security
Process Data for Business Intelligence; Tune Data Warehousing Processes and BI Activity and Performance; Establish Golden Records; Hierarchies and Affiliations
DQ Service Levels; Continuously Measure DQ; Manage DQ Issues; Data Quality Defects; Operational DQM Procedures; Monitor DQM Procedures
Mapping Enterprise Data Management Council Data Management Maturity Model to Information Lifecycle
Data Management Goals
Corporate Culture
Governance Model
Data Management Funding
Data Requirements Lifecycle
Standards and Procedures
Data Sourcing
Architectural Framework
Platform and Integration
Data Quality Framework
Data Quality Assurance
Architect, Budget, Plan, Design and Specify
Implement Underlying Technology
Enter, Create, Acquire, Derive, Update, Integrate, Capture
Secure, Store, Replicate and Distribute
Present, Report, Analyse, Model
Preserve, Protect and Recover
Archive and Recall
Delete/Remove
Define, Design, Implement, Measure, Manage, Monitor, Control, Staff, Train and Administer
Substantial differences between data maturity models indicate a lack of consensus about what constitutes information management maturity
There is a need for a consistent approach, perhaps linked to an information lifecycle, to ground any assessment of maturity in the actual processes needed to manage information effectively
More Information
Alan McSweeney http://ie.linkedin.com/in/alanmcsweeney