What you may still need to know about Data Quality


The current edition of Merriam-Webster’s defines “data” as “factual information (such as measurements or statistics) used as a basis for reasoning, discussion, or calculation.”

The dictionary’s second definition tells us that our data is represented digitally and can be “transmitted or processed.”

But it’s the third definition that points to a cautionary tale: data includes both “useful and irrelevant or redundant information and must be processed to be meaningful.” Meaning, you need to know, or anticipate, that the quality and integrity of your data’s salient information and insights may be compromised. Not compromised by malicious security breaches, but by a lack of internal data quality firewalls.

Data Quality meets Digital Transformation

While CIOs have embraced the goal of company-wide digital transformation, many have yet to grasp the critical importance of creating enforceable, enterprise-wide standards for good and consistent data quality. Decision makers need to acknowledge that poor, inconsistent data is a source of serious process degradation and project cost overruns.

Efficient order-to-cash processes, timely product launches, and revenue-generating go-to-market initiatives all risk being compromised when tied to inconsistent and unreliable master data, reference data, and metadata. Simply put: bad business data.

According to Gartner research, the “average fiscal impact of poor data quality on organizations is $9.7 million per year.” To be clear, that’s per company.

The table below displays common occurrences of bad data quality and resulting business challenges:


Employing a Data Quality Engine

Data issues can begin within a single database. Consider, for example, a siloed instance of a CRM solution, where lax adherence to data entry standards can lead to inaccurate customer master records.

In this scenario, companies can substantially address data quality issues through ongoing, rule-based DQ (Data Quality) automation. Standalone DQ solutions identify duplicate master records, cleanse data attributes (e.g., address verification), and match/merge new and preexisting data records, enabling the creation of a “gold record”, or a single version of the truth.
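As a rough sketch of what such rule-based automation does under the hood, the following Python pass cleanses attributes, matches candidate duplicates, and merges them into a gold record. The field names, normalization rules, and survivorship logic here are illustrative assumptions, not any vendor’s engine:

```python
import re
from collections import defaultdict

def normalize(record):
    """Cleanse attributes with simple, rule-based standardization."""
    return {
        "name": re.sub(r"\s+", " ", record.get("name", "")).strip().title(),
        "street": record.get("street", "").strip().title(),
        "city": record.get("city", "").strip().title(),
        "state": record.get("state", "").strip().upper(),
    }

def match_key(record):
    """Blocking key: records sharing this key are candidate duplicates."""
    return (record["name"].lower(), record["city"].lower(), record["state"])

def merge(duplicates):
    """Survivorship rule: keep the most complete value for each field."""
    return {
        field: max((rec[field] for rec in duplicates), key=len)
        for field in ("name", "street", "city", "state")
    }

def build_gold_records(raw_records):
    """Cleanse, match, and merge raw rows into gold records."""
    buckets = defaultdict(list)
    for raw in raw_records:
        rec = normalize(raw)
        buckets[match_key(rec)].append(rec)
    return [merge(group) for group in buckets.values()]

gold = build_gold_records([
    {"name": "ron  smith", "street": "12 main st", "city": "dumont", "state": "nj"},
    {"name": "Ron Smith", "street": "", "city": "Dumont", "state": "NJ"},
])
print(gold)  # one merged gold record instead of two duplicates
```

A production DQ engine layers on far richer rules (phonetic name matching, address verification services, configurable survivorship), but the cleanse/match/merge pipeline keeps this same shape.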

Data Quality challenges shared across multiple systems

But bad-data challenges increase exponentially when the CRM master record is unsystematically recreated, using different data models, across multiple systems and databases. Imagine the possible variances in data fields when invoicing, marketing, and contact management systems (to name a few) tailor the master record to their own business requirements, assigning their own field names, attributes, and global identifiers.

The classic result is the creation of multiple and varying versions of the same customer master record.


Quiz: How many Ron Smiths are there?


From a human perspective, we undoubtedly see three redundant system versions (Sys 1, 2, and 4) of the same Ron Smith living in Dumont, New Jersey.

But because each system assigns its own unique record ID to its iteration of Ron Smith, the same Ron Smith will be counted, and treated, as three different customers.
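To see how software can recover what a human sees at a glance, consider the following Python sketch. The three system exports, their field names, and the match threshold are hypothetical stand-ins for the quiz above:

```python
from difflib import SequenceMatcher

# Hypothetical exports: each system uses its own field names and record IDs.
sys1 = {"cust_id": "C-1001", "cust_name": "Ron Smith", "city": "Dumont", "st": "NJ"}
sys2 = {"id": 77, "full_name": "Ronald Smith", "town": "Dumont", "state": "NJ"}
sys4 = {"rec": "A-9", "name": "R. Smith", "city": "Dumont", "region": "NJ"}

def canonical(record, name_key, city_key, state_key):
    """Map a system-specific record onto one shared shape."""
    return {
        "name": record[name_key].lower(),
        "city": record[city_key].lower(),
        "state": record[state_key].lower(),
    }

def likely_same_customer(a, b, threshold=0.6):
    """Same city/state plus a fuzzy name match suggests one customer."""
    if (a["city"], a["state"]) != (b["city"], b["state"]):
        return False
    return SequenceMatcher(None, a["name"], b["name"]).ratio() >= threshold

records = [
    canonical(sys1, "cust_name", "city", "st"),
    canonical(sys2, "full_name", "town", "state"),
    canonical(sys4, "name", "city", "region"),
]
print(all(likely_same_customer(records[0], r) for r in records[1:]))  # True
```

An MDM matching engine does the same thing at scale: normalize each system’s fields into one canonical shape, then score candidate pairs against match rules.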

MDM: A holistic approach to Data Quality

Master Data Management’s mission is to create guard-rails around data redundancy through a holistic data management process that merges multiple sources of enterprise data. By merging disparate forms of data, MDM is able to uniformly enforce data quality standards across all relevant business systems, databases, and data catalogs.

MDM provides a highly collaborative data quality environment, supporting the teams of data stewards, system administrators, and business users who are regularly called on to maintain high standards of data quality by validating data changes, whether through automated data quality processes or manual intervention.

MDM also orchestrates and synchronizes the latest data management technology, holistically supporting the entire data management life cycle (a simplified pipeline sketch follows the list):

Data Profiling: Discovers and assesses the quality and consistency of your data ahead of the MDM initiative.

Data Modeling/Templates: Drives and governs structural continuity across all data governance tasks, beginning with imported cross-system data merged into the MDM hub or repository.

DQ and cleansing tools: Validate, correct, and transform data.

Workflow Approval Process: Addresses DQ exceptions and orchestrates/delegates rule-driven, expert intervention.

Inbound/Outbound Integration: Provides real-time, on-demand, or batch inbound data for management, and disseminates approved outbound data for trusted sharing and end-user consumption.
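To make the life cycle concrete, the sketch below chains those five stages together in Python. Every function, field name, and rule here is a hypothetical stand-in chosen for illustration, not any MDM product’s actual API:

```python
def profile(records):
    """Data profiling: report the null rate per field before the MDM run."""
    fields = {f for r in records for f in r}
    total = len(records)
    return {f: sum(1 for r in records if not r.get(f)) / total for f in fields}

def conform(record):
    """Data modeling: map source fields onto one canonical template."""
    return {
        "name": record.get("name") or record.get("full_name", ""),
        "city": record.get("city") or record.get("town", ""),
    }

def cleanse(record):
    """DQ and cleansing: validate and standardize values."""
    record["name"] = record["name"].strip().title()
    record["city"] = record["city"].strip().title()
    return record

def needs_review(record):
    """Workflow approval: route DQ exceptions to a data steward."""
    return not record["name"] or not record["city"]

def run_hub(inbound):
    """Inbound/outbound integration: ingest, process, publish or escalate."""
    print("profile:", profile(inbound))
    approved, exceptions = [], []
    for raw in inbound:
        rec = cleanse(conform(raw))
        (exceptions if needs_review(rec) else approved).append(rec)
    return approved, exceptions

approved, exceptions = run_hub([
    {"full_name": " ron smith ", "town": "dumont"},
    {"name": "", "city": "Hoboken"},  # escalated to manual review
])
```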

And then, of course, there's AI

Not surprisingly, top-tier data management vendors have adopted generative AI to drive desired data quality outcomes automatically.

Automated workflows are now infusing GenAI into profiling, cleansing, and data standardization, enabling quick identification of errors, data inconsistencies, and duplications. AI-driven MDM solutions are using algorithms to detect anomalies and patterns, something clearly beyond the scope of legacy master data management solutions.
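As a toy illustration of the underlying idea (a simple statistical stand-in, not the learned models an AI-driven MDM platform would actually use), flagging anomalous values in a numeric attribute can be as small as this:

```python
from statistics import mean, stdev

def flag_anomalies(values, z_threshold=2.5):
    """Flag values whose z-score marks them as statistical outliers.

    A crude stand-in for learned anomaly detectors: real systems
    model patterns across many attributes and records at once.
    """
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > z_threshold]

# e.g., order quantities where a data-entry slip added two zeros
quantities = [12, 9, 14, 11, 10, 13, 1200, 8, 12, 11]
print(flag_anomalies(quantities))  # [1200]
```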

Conclusion

Superior data quality is continuously earned through due diligence, validation techniques, and dedicated DQ automation.

But even Merriam-Webster seems to know that.

