Critical Data Elements: A Practitioner’s Perspective


Multiple authors and experts in the field, such as Thomas Redman and David Loshin, have developed and discussed the concept of critical data (elements) over time, starting in 2008. DAMA-DMBOK2 also included this topic as part of the Data Quality Knowledge Area. I shared my first article on this topic in 2019 and later described the dependencies between critical data and data lineage in my book, “Data Lineage from a Business Perspective.”

Even though this concept has been known for a long time, data management professionals still discuss it extensively when implementing it in practice.

In this article, I share my insights and practical experience in implementing this concept, focusing on the following topics:

- The definition and objectives of critical data and critical data elements

- Data management capabilities that provide input to and consume the results of critical data initiatives

- A step-by-step high-level methodology for implementing this concept

Definition and Objectives of Critical Data

DAMA-DMBOK2 provides only general characteristics of critical data, specifying its usage, such as regulatory reporting, financial reporting, business policy, ongoing operations, and business strategy. It also emphasizes that “specific drivers for criticality will differ by industry.”

One of the drivers of this concept was the Basel Committee on Banking Supervision’s standard number 239: “Principles for effective risk data aggregation and risk reporting” (BCBS 239 or PERDARR). This standard defined critical data as:

- “Data that is critical to enabling the bank to manage the risks it faces

- data critical to risk data aggregation and IT infrastructure initiative

- aggregated information to make critical decisions about risk”

In my practice, I apply the following definition of critical data:

Critical data is “data that is critical for managing business risks, making business decisions, and successfully operating a business.”

However, I must acknowledge that this definition is rather broad when it comes to the practical implementation of this concept. One of the challenges I observe within the community is that professionals often focus on implementing the concept without fully considering its underlying purpose.

I believe the fundamental goal of utilizing the critical data concept is prioritizing various data management initiatives. Occasionally, I observe discussions in which critical data is seen as serving the purpose of either “prioritization” or “limitation.” Any data management initiative is time- and resource-consuming. Therefore, to make an initiative feasible, the organization must set priorities. However, this does not mean that only particular “critical” data must be managed properly. From this perspective, “prioritization” is the ultimate goal of using the “critical data” concept.

Input-Providing and Consuming Data Management Capabilities

DAMA-DMBOK2 has limited the consideration of critical data only to the Data Quality Knowledge Area. The challenge is that multiple data management capabilities provide input in defining critical data. In this article, I call them “input-providing.” You can consider the data quality capability as one of the recipients or consuming capabilities of the results of critical data implementation.

Figure 1 demonstrates the examples of input-providing and consuming capabilities.


Figure 1: Input-providing and consuming data management capabilities.

I used the data quality capability as an example of the consuming capability.

A data governance capability must provide the requirements of regulations related to critical data, if applicable, and the definition and criteria for determining criticality. It should also outline the data management roles involved in the identification process. Additionally, this capability can establish the methodology for identifying and managing critical data.

Critical data must be established at least at two levels of data models—logical and physical—defined by a data modeling capability.

Data architecture, application architecture, and metadata management capabilities must provide data flows and lineage so that critical data can be defined effectively along data chains.

Only by having these inputs can the data quality capability perform its tasks. For critical data elements, this capability enables gathering data quality requirements, building corresponding data quality checks, and measuring data quality.
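To make this concrete, below is a minimal sketch of how a consuming data quality capability might attach a quality rule to a critical data element. All names (`CriticalDataElement`, the completeness threshold, the column names) are illustrative assumptions, not part of any standard or of the author’s methodology.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalDataElement:
    """Illustrative record linking logical and physical model levels."""
    name: str
    logical_name: str      # attribute in the logical data model
    physical_column: str   # column in the physical data model
    quality_rules: list = field(default_factory=list)

def completeness(values):
    """Share of non-null values -- a typical quality metric for a CDE."""
    if not values:
        return 0.0
    non_null = [v for v in values if v is not None]
    return len(non_null) / len(values)

# Hypothetical CDE with one quality rule: (rule name, threshold, metric fn).
cde = CriticalDataElement(
    name="Monthly Net Revenue",
    logical_name="NetRevenue",
    physical_column="fin_report.net_revenue",
)
cde.quality_rules.append(("completeness", 0.98, completeness))

sample = [120.5, 99.0, None, 130.2]  # illustrative column values
rule_name, threshold, metric = cde.quality_rules[0]
score = metric(sample)
print(rule_name, score, score >= threshold)  # completeness 0.75 False
```

The point of the sketch is the dependency order the text describes: the element must first be anchored at the logical and physical levels (data modeling input) before quality checks can be built and measured against it.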

Let me summarize the key steps in identifying critical data (elements), indicating the relevant data management capabilities along the way.

Step-by-Step High-Level Methodology for Implementing the Critical Data Concept

Step 1. Identify the objectives for using the critical data concept

Various business drivers demand different data initiatives. If your organization must comply with regulations that include critical data in their requirements, your objective is clear. However, the critical data concept must have a well-defined objective for other organizations. The key reason for this is that the objective drives the definition of critical data within the organization.

The primary objective of applying the critical data concept is prioritizing data-related initiatives. The different natures of these initiatives will lead to varying definitions of critical data. For example, personal data will be considered critical for an organization focused on improving the management of this data type. In contrast, customer data will take precedence in an initiative aimed at enhancing the customer experience.

This task can be considered as a deliverable of data governance.

Step 2. Identify the critical use cases, reports, or dashboards

The identified business drivers and corresponding data initiatives have specific stakeholders and use cases. Reports and dashboards are typical examples of such use cases. The next logical step is to document these use cases and identify those critical to the data initiatives' objectives. In many organizations, there are hundreds or even thousands of reports. Therefore, the simplest way to understand the volume of reports your organization produces is to begin documenting them.

This exercise can take some time, but it is not the end of the story. It can be considered as a deliverable of data governance and data architecture.
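One way to start the documentation exercise described above is a simple report inventory. The sketch below is a hypothetical minimal structure, assuming fields such as an owner and a business driver; real inventories would live in a data catalog, not in code.

```python
from dataclasses import dataclass

@dataclass
class ReportRecord:
    """Illustrative inventory entry for one report or dashboard."""
    report_id: str
    title: str
    owner: str
    business_driver: str   # e.g. "regulatory", "customer experience"
    is_critical: bool = False

# Hypothetical inventory entries.
inventory = [
    ReportRecord("R-001", "Monthly Risk Report", "Risk Office",
                 "regulatory", is_critical=True),
    ReportRecord("R-002", "Campaign Dashboard", "Marketing",
                 "customer experience"),
]

# Step 2 output: the subset of use cases flagged as critical.
critical_reports = [r.report_id for r in inventory if r.is_critical]
print(critical_reports)  # ['R-001']
```

Even with thousands of reports, a flat inventory like this is enough to begin: criticality flags can be assigned later, report by report, once the objectives from Step 1 are agreed.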

Step 3. Analyze data elements in use cases and define the critical ones

Reports and dashboards are containers of information or data elements. These reports include the leading business key performance indicators (KPIs). Simple examples of such KPIs are “Customer profitability” or “Monthly Net Revenue.” It is therefore recommended to identify all or some of these KPIs as critical data elements.

This step sounds simple, but it is not always so. Identifying (critical) data elements in tabular reports, in which each column represents a unique data element, is relatively simple. However, some reports, including regulatory ones, contain multiple complex schedules. Very often, you can find a taxonomy of similar data elements there. Figure 2 presents a simplified example.


Figure 2: A simplified example of data elements’ taxonomy.

“Net Revenue” is a data element corresponding to a financial KPI and can be marked as critical. However, “Net Revenue” can also be segmented by customer group, such as the “Corporate” or “Retail” segments. In this case, the customer segment becomes another data element with its own taxonomy.
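A taxonomy like the one in Figure 2 can be sketched as a small tree and walked to enumerate critical elements. The node names and the flagging of segment-level elements as critical are assumptions for illustration only.

```python
# Hypothetical taxonomy inspired by Figure 2: "Net Revenue" segmented
# by the "Customer segment" data element.
taxonomy = {
    "Net Revenue": {
        "critical": True,
        "children": {
            "Net Revenue / Corporate": {"critical": True, "children": {}},
            "Net Revenue / Retail": {"critical": True, "children": {}},
        },
    },
}

def critical_elements(node, out=None):
    """Walk the taxonomy tree and collect names of critical data elements."""
    if out is None:
        out = []
    for name, meta in node.items():
        if meta["critical"]:
            out.append(name)
        critical_elements(meta["children"], out)
    return out

print(critical_elements(taxonomy))
# ['Net Revenue', 'Net Revenue / Corporate', 'Net Revenue / Retail']
```

Representing the report’s data elements as a tree rather than a flat list preserves exactly the parent-child relationship the text describes: marking a parent KPI critical does not automatically decide the criticality of its segmented children, so each node carries its own flag.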

Read further: https://datacrossroads.nl/2024/09/02/critical-data-elements-a-practitioners-perspective/

About the author:

Dr. Irina Steenbeek is a well-known expert in implementing Data Management (DM) Frameworks and Data Lineage and assessing DM maturity. Her 12 years of data management experience have led her to develop the “Orange” Data Management Framework, which several large international companies have successfully implemented.

Irina is a celebrated international speaker and author of several books, multiple white papers, and blogs. She has shared her approach and implementation experience by publishing The “Orange” Data Management Framework, The Data Management Toolkit, The Data Management Cookbook, and Data Lineage from a Business Perspective.

Irina is also the founder of Data Crossroads, a coaching, training, and consulting services enterprise in data management.

To inquire about Irina's training, coaching, or participation in your company webinar or event, please email [email protected] or book a free 30-min session at https://datacrossroads.nl/free-strategy-session/

