Blog
February 29, 2024
Where Are We, and Where Are We Going?
Author: Joe Nelson
Data Governance Maturity-Analysis Frameworks Drive Awareness and Development
Whether looking for directions on your GPS or assessing your organization's need for a data governance and privacy program, the first step is knowing where you are. The maturity of a governance program refers to its current state, gleaned through an assessment and rated against a maturity model. Generally, this involves reviewing the organization's governance processes, whether they're limited or reach across the organization, and whether they face ongoing scrutiny and improvement.
These models – or frameworks – are like the GPS apps of the governance space; they provide objective benchmarks to assess where you are, where you could or should be, and how to get there. While some frameworks have devolved into marketing tools ("buy our tool to move up a level"), signs of a respectable framework include objectivity (in both criteria and method), tool and brand-agnostic solutions, and clear value delivered in moving from one level to another. A maturity analysis is similar to but distinctly different from a gap analysis; while maturity analyses plot a company's current state against an objective framework with the goal of moving from level to level, gap analyses plot the current state against an individually stipulated future state defined by best practices, specific risks, business goals, and more. In essence, a maturity analysis consists of several individual gap analyses measured against external benchmarks.
The many different models address different criteria, dimensions, and subject matter. Some are applicable to specific programs (such as the Capability Maturity Model, which targets software development, and Stanford's Data Governance Maturity Model), while others are much more general (such as the CMMI). This article provides a high-level overview of select maturity-assessment frameworks that apply, or could apply, specifically to data governance.
Capability Maturity Model (CMM)
The CMM is one of the original maturity models; it was developed in the 1980s at Carnegie Mellon University, primarily to address challenges in software development for the Department of Defense. The CMM Appraisal Framework establishes five levels of an organization's maturity across several different measures, assessing how well the organization incorporates process into its business.
At the initial level, processes are ad hoc or chaotic, and are not defined or documented sufficiently to be repeatable. At the repeatable level, basic project management is in place, and the processes are defined and documented so they may be repeated. The defined level requires that an organization has further documented and integrated the process as a standard practice within the organization. The managed or capable level incorporates metrics to evaluate the performance of the process. Finally, the optimizing or efficient level integrates continuous improvement, such as by analyzing statistical variation in the process metrics.
These maturity levels represent states, present or aspirational; each is defined through Key Process Areas (KPAs), which, when met, achieve specific goals. The KPAs are predefined clusters of related activities that contribute to goals; they change at each level of maturity. In other words, once an organization satisfies the KPAs that incorporate metrics into an adequately documented and integrated process, it has met the managed (or capable) level for that given function within the organization.
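As a rough sketch of how a CMM-style appraisal rolls KPA results up into a single maturity level, consider the following. This is an illustrative simplification, not the actual appraisal method, and the KPA names and satisfaction data below are hypothetical examples:

```python
# Illustrative sketch: an organization's CMM-style maturity level is the
# highest level L such that every KPA at levels 2 through L is satisfied.
# (KPA names and results here are hypothetical examples, not the full set.)

KPAS_BY_LEVEL = {
    2: ["requirements management", "project planning"],
    3: ["organization process definition", "training program"],
    4: ["quantitative process management"],
    5: ["defect prevention"],
}

def maturity_level(satisfied_kpas):
    level = 1  # Level 1 (initial) requires nothing; it is the default state
    for lvl in sorted(KPAS_BY_LEVEL):
        if all(kpa in satisfied_kpas for kpa in KPAS_BY_LEVEL[lvl]):
            level = lvl
        else:
            break  # a gap at this level blocks all higher levels
    return level

# An org with solid Level 2 practices but incomplete Level 3 KPAs stays at 2:
print(maturity_level({"requirements management", "project planning",
                      "training program"}))  # 2
```

The key point the sketch captures is that maturity is gated: strong practices at a higher level do not count until every lower-level KPA is satisfied.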
The framework implements the assessment in three stages: planning, conducting, and reporting. Several different vendors offer appraiser training, though the framework is fairly straightforward and publicly available (along with many, many supporting resources). Appraisers are expected to have an in-depth understanding of the entity's business methods and processes, as well as expertise in the CMM's components and processes. Throughout the appraisal process, common themes include ensuring a representative sample of the entity's processes (with regard to geography, teams, etc.), and maintaining transparency and traceability of the appraised data. The report from an initial appraisal serves as a baseline – a snapshot of the entity's current capability.
Most of the following models are progeny of the CMM. However, the CMM itself possesses many limitations. It is very rigid; as stated, it specifically addresses software development and uses pre-defined measures for KPAs. Critics assert that it allows for different models (and levels of maturity) for different functions, resulting in a lack of integration throughout the organization, which, in turn, leads to confusion and overlap. They also argued that it placed process over results, and that its processes were administratively costly.
ISACA's Capability Maturity Model Integration (CMMI)
In light of these limitations in the CMM, the CMMI was released in 2002 (developed, like the CMM, at Carnegie Mellon's Software Engineering Institute, and now administered by the Information Systems Audit and Control Association, or ISACA). The CMMI is included here because of its breadth, and because it addressed some (but not all) of the foregoing criticisms. It maintains similar maturity-level designations and KPAs (just "PAs" in the CMMI), but broadens them beyond the software-development world. For example, while a Level 2 KPA in the CMM is "software project planning," the corresponding PA in the CMMI is simply "project planning." CMMI Level 3 (defined) also focuses on organizational consistency, but goes further into tools and methods. Level 4 (managed) goes beyond merely incorporating metrics by incorporating sub-processes.
Overall, the CMMI is a viable option to assess data-governance maturity, as well as many other business functions. It goes beyond the software-development world of the CMM, and it provides a very thorough framework covering many different areas, from business processes to engineering and beyond. Results may be objectively confirmed by publishing them in ISACA's PARS registry, where a public certification may be desired or required. Nevertheless, implementing the CMMI model properly requires access to ISACA's proprietary processes and engagement of ISACA-certified appraisers, both at significant cost. Beyond the financial cost, the CMMI remains administratively heavy and not very flexible (rigidity being the essential nature of a truly objective maturity model), and it does not offer governance-specific guidance, as some of the following options do.
Gartner's Information Management Maturity Model
In late 2008, Gartner released its first version of the Enterprise Information Management maturity model (EIM), building upon earlier work in the BI space. Gartner's model went beyond CMM, in that it is truly data-centric; data governance – both as a concept and the implementation of data-governance bodies – is a key factor in becoming more mature.
Gartner's 2008 model borrows maturity levels similar to the CMM's and CMMI's, numbered 0-5: unaware, aware, reactive, proactive, managed, and effective. Characteristics of the EIM model flow across each of the maturity levels and include roles, policies, lifecycle, and quality. At each level, sample activities illustrate what's occurring in the enterprise. Characteristics of the unaware level include the absence of governance, security, and accountability; data cannot be trusted; and most don't even know there's a problem. The enterprise is exposed to significant risk. Characteristics of the aware level reflect that the enterprise is beginning to understand the value of its information (both beneficial and risk-oriented). When the enterprise moves to reactive, policies and procedures are developed, but buy-in is low. At the proactive level, however, the entity has reached full compliance with its standards, and data is being used to support decision-making; data governance is a part of every deployment project. Finally, at the managed level, a data-governance body defines best practices and resolves information issues.
More so than the previous examples, Gartner's 2008 model represents a roadmap on how an entity may grow through these levels. Gartner illustrates this by plotting maturity (X) against sophistication (Y), with an exemplar curve growing from less mature and sophisticated to more, with action items along the way. These range from "educating IT and business leaders" in the unaware/immature area to "propos[ing] business case" in the transition from reactive to proactive. It goes without saying that implementing EIM (and thus, developing a full-fledged data governance program) is no small feat (even more so the lower an entity lies on the maturity scale). Compared to the foregoing, however, Gartner's framework provides a much more data-governance-focused framework, with the characteristics providing concrete examples of movement towards a more data-governance mature organization.
DAMA's DMBoK2 Maturity Model
At roughly the same time (2009), the Data Management Association (DAMA International) released the Data Management Body of Knowledge (DMBoK); the second edition (DMBoK2), referenced here, was released in 2017. The DMBoK2 addresses all aspects of data management, including its own data management Capability Maturity Assessment (CMA).
While many sources reference DAMA DMBoK's Capability Maturity Model, the DMBoK itself disputes that it actually provides one (Ch. 15, FN98: "McSweeney includes the DAMA-DMBOK as one of his maturity models, although the DMBOK is not structured as such"). The DMBoK2's chapter on maturity models is largely an amalgamation and description of other models, along with the various components and steps needed to complete an assessment; to a degree, this "building off of predecessors" approach is exactly what other models have done. To that end, while it may lack the extensive materials of something like the CMMI, it still provides a loose framework, though one lacking extensive objective measurement criteria.
The DMBoK2's framework measures its eleven dimensions of data management (the DMBoK2 Knowledge Areas) across six familiar maturity levels (none, initial/ad hoc, repeatable, defined, managed, and optimized). The dimensions/knowledge areas assessed at each maturity level include governance, architecture, modeling, storage and ops, security, data integration and interoperability, document and content management, reference and master data, data warehousing and business intelligence, metadata management, and data quality. Each of the different knowledge areas is measured as to activity/process, tools, standards, and people/resources.
DMBoK2 offers a hybrid framework/gap-analysis spiderweb (radar) plot, showing the results of a current-state assessment compared to a desired future state. The plot suggests that an organization could have different levels of maturity across these different knowledge areas, making it much more flexible than several other frameworks. For those familiar with the DMBoK and seeking a more flexible approach to focus on the different dimensions/knowledge areas as they see fit, DAMA DMBoK2's model is a great solution, albeit one lacking extensive appraisal materials.
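The current-vs-desired comparison behind that spiderweb plot can be sketched as a simple per-knowledge-area gap computation. A minimal illustration follows; the knowledge-area names come from the DMBoK2, but all of the maturity scores are hypothetical:

```python
# Illustrative sketch of the DMBoK2-style current-vs-target comparison:
# each knowledge area is scored independently on the 0-5 maturity scale,
# and the per-area gap shows where to focus. All scores are hypothetical.

current = {"governance": 1, "architecture": 2, "data quality": 1,
           "metadata management": 0, "security": 3}
target  = {"governance": 3, "architecture": 3, "data quality": 3,
           "metadata management": 2, "security": 3}

gaps = {area: target[area] - current[area] for area in current}

# List the areas with the largest gaps first (these are the spiderweb's
# widest gulfs between the current-state and future-state polygons):
for area, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{area:22s} current={current[area]} target={target[area]} gap={gap}")
```

The point the sketch makes is the flexibility noted above: each knowledge area carries its own level, rather than the whole organization receiving one number.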
Stanford's Data Governance Maturity Model
In 2011, Stanford released a data-governance-specific maturity model. It was adapted from IBM's model (another governance-specific model) and the CMM, and it was based on Stanford's own data governance program. Conceptually, it is simpler than any of the foregoing; qualitative and quantitative measurements are made across six maturity components (awareness, formalization, metadata, stewardship, data quality, and master data), as each applies to three dimensions (people, policies, and capabilities). Key benefits of Stanford's model include its specific applicability to data governance and its flexibility.
Like other models, these different dimensions and criteria are assessed through a combination of investigation and interviews. The model provides sample questions for assessing the different components. Each component is then scored, and the scores are averaged to produce the organization's resulting maturity level. The model does not provide a specific definition of maturity, making it one of the more flexible models, though it does help identify strong and weak points of a data governance program. Unfortunately, not much more is publicly available about Stanford's model.
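The scoring step described above amounts to straightforward averaging. A minimal sketch, using the model's published component and dimension names but entirely hypothetical scores:

```python
# Illustrative sketch of Stanford-style scoring: each maturity component is
# scored across the three dimensions (people, policies, capabilities), the
# dimension scores are averaged per component, and the component averages
# are averaged into an overall maturity level. All scores are hypothetical,
# and only three of the six components are shown for brevity.

scores = {
    "awareness":     {"people": 3, "policies": 2, "capabilities": 2},
    "formalization": {"people": 2, "policies": 2, "capabilities": 1},
    "stewardship":   {"people": 1, "policies": 1, "capabilities": 1},
}

component_scores = {c: sum(d.values()) / len(d) for c, d in scores.items()}
overall = sum(component_scores.values()) / len(component_scores)

print(component_scores)   # per-component averages expose weak spots
print(round(overall, 2))  # a single headline maturity number
```

Keeping the per-component averages alongside the overall number preserves what the blog notes as the model's main benefit: identifying the strong and weak points of the program, not just a single grade.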
* * *
There are many more models that purport to assess an organization's capability maturity. Some may start to fall into the marketing trap; those listed above avoid this. Unfortunately, assessment guides and more in-depth descriptions have disappeared behind paywalls. Nevertheless, the foregoing should provide insight as to maturity models in general, whether a maturity assessment or gap analysis would better suit your company's needs, and which model(s) would be a good place to start.