A new framework for RevOps data quality

Last month, we talked about the invisible twin of tech debt: “data debt.” Data debt started to accelerate as a result of the SaaS boom and builds with each new point product you introduce into your tech stack. The symptoms of data debt include stale data, duplicate data, and an increasing number of data silos. Reports take more effort to build, business decisions are harder to make, and RevOps (along with most of the management team) secretly distrusts the database.

As we concluded, data bankruptcy is not an option: you may be able to scrap your tech stack and start over, but you can rarely do that with your data. Any data debt you accumulate will eventually need to be paid off. The easiest way to do that? Don’t accumulate it in the first place.

Magical thinking never solved a business problem

The last 10 years have been an exciting time for RevOps and go-to-market (GTM) functions in general, especially if you’re fond of technology. The explosion in GTM tech has been nothing short of breathtaking, but as fun as the ride has been, there are two stark and related criticisms:

  • The ROI for most GTM tech is unproven at best.
  • Data quality has been neglected in favor of a willful ignorance (optimism?) that new technology will magically make poor-quality data a non-issue.

Aggressive vendor messaging and a general lack of data management expertise in RevOps have somehow made us believe that these new technologies can work as expected even when RevOps data is poor quality and siloed.

RevOps data quality drives every go-to-market investment

The current tech recession is shining a light on this issue. CFOs are demanding verifiable ROI for every GTM tech and data purchase. “Must-have” projects like ABX and CDP are not showing expected returns because of bad data quality. Today, more companies are focusing on RevOps data quality for two reasons:

  • To improve the ROI and time to value of all their GTM technology investments
  • To prepare for the “killer apps” in AI when they emerge, maybe as early as 2025

As the steward of GTM data, it is RevOps’ job to ensure that RevOps data is of good quality and GTM-ready; people, processes, and technology can’t work well without good data. To take control of data quality, RevOps must first have a deep understanding of what RevOps data quality means. If you simply Google the term “data quality,” you will get a very technical description of how IT would define it.

However, data quality for RevOps is a much broader concept. In an upcoming “Guide to RevOps Data Quality,” we outline a comprehensive data quality framework that RevOps can use to create a more scalable technology stack, streamline revenue processes, and deliver GTM metrics that can be counted on. Here is a sneak preview.

A three-tier data quality model

Because RevOps manages many processes and use cases, teams often treat them as distinct challenges and buy technology to solve each problem independently. This ad hoc approach breeds complexity: recent research shows enterprises use, on average, 91 pieces of technology in their GTM stack, producing crushing tech and data debt that inhibits efficient growth. RevOps needs a framework that ties these common problems together, and it should not be a surprise that RevOps data quality is the fabric spanning all of them.

RevOps data quality is a much broader concept than the traditional definition of data quality that IT usually adopts. In RevOps, data quality can best be defined using a three-tier model:

  1. Technical quality
  2. Operational quality
  3. Strategic quality

There is a hierarchical dependency among the three tiers: to achieve a high level of quality in an upper tier, you must first achieve quality in the tiers below it.
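To make that dependency concrete, here is a minimal sketch in Python of how a tiered quality check might be gated. The individual checks and field names are placeholders I invented for illustration, not a prescribed schema.

    # Sketch of the three-tier dependency: a record only "earns" a higher tier
    # once every lower tier passes. The checks and field names are placeholders.

    def passes_technical(record: dict) -> bool:
        # e.g., required fields are populated, formats are valid, data is fresh
        return all(record.get(f) for f in ("email", "company", "country"))

    def passes_operational(record: dict) -> bool:
        # e.g., linked to an account and assigned to an owner
        return bool(record.get("account_id")) and bool(record.get("owner_id"))

    def passes_strategic(record: dict) -> bool:
        # e.g., an insight such as a fit or intent score has been appended
        return record.get("fit_score") is not None

    def quality_tier(record: dict) -> str:
        """Return the highest tier this record fully satisfies."""
        tier = "none"
        for name, check in [("technical", passes_technical),
                            ("operational", passes_operational),
                            ("strategic", passes_strategic)]:
            if not check(record):
                break  # stop at the first failing tier
            tier = name
        return tier

    # A record with trustworthy, actionable data but no appended insight
    print(quality_tier({"email": "a@b.co", "company": "Acme", "country": "US",
                        "account_id": "001", "owner_id": "rep_amy"}))  # operational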

Technical quality - can you trust the data?

Technical quality is the most familiar definition, referencing the basic integrity of the data record itself:

  • Completeness
  • Accuracy
  • Recency
  • Format
  • Consistency
  • Integrity
  • Validity
  • Precision

Conceptualize technical quality as making the data “trustworthy.” If you can’t trust the accuracy or integrity of your data, anything you do with it produces questionable outcomes: the proverbial “garbage in, garbage out” at work. If you can’t trust the data, then you can’t trust any business process, report, or analysis that uses it.

For example, if the address on your lead record is bad and your sales territories are geographically based, then your lead routing process won’t work very well. A hot inbound lead may bounce around within the sales team, or worse yet, languish in a black hole because the sales rep receiving the wrong lead has little incentive to correct the routing mistake, especially if this is a frequent occurrence.
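To make this tier concrete, here is a rough sketch of what automated technical-quality checks on a lead record could look like. The field names and the 90-day recency threshold are assumptions for this example, not a recommended standard.

    import re
    from datetime import datetime, timedelta, timezone

    # Illustrative technical-quality checks on a single lead record. The field
    # names and the 90-day recency threshold are assumptions for this example.
    EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    REQUIRED_FIELDS = ("first_name", "last_name", "email", "company", "country")

    def technical_issues(lead: dict) -> list:
        issues = []
        # Completeness: every required field is populated
        issues += ["missing " + f for f in REQUIRED_FIELDS if not lead.get(f)]
        # Format/validity: the email looks like an email
        if lead.get("email") and not EMAIL_RE.match(lead["email"]):
            issues.append("invalid email format")
        # Consistency: country stored as a two-letter code
        if lead.get("country") and len(lead["country"]) != 2:
            issues.append("country is not a 2-letter code")
        # Recency: the record was touched within the last 90 days
        updated = lead.get("updated_at")
        if updated and datetime.now(timezone.utc) - updated > timedelta(days=90):
            issues.append("record is stale")
        return issues

    lead = {"first_name": "Ada", "last_name": "Ng", "email": "ada@example.com",
            "company": "Acme", "country": "US",
            "updated_at": datetime.now(timezone.utc)}
    print(technical_issues(lead))  # [] means the record is trustworthy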

Operational quality - can you take action on the data?

Good technical data quality is just a starting point: end users must be able to use the data to perform the tasks in front of them. A database full of data that no one can act on still has a data quality problem.

Operational quality comprises:

  • Related datasets that are properly linked
  • Data that is “ready to use” by systems and technologies that need it
  • Data that is assigned to the right users who are tasked to act on it
  • Unnecessary data that is hidden from the user to reduce noise

The best way to conceptualize operational quality is that it’s about making the data ready for users to take action on. This does not mean users can make the right decisions (that is the next tier, strategic quality), only that they are able to take the necessary actions.
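As a companion sketch, an operational-quality check might verify that the lead is linked, assigned, and routed to the rep whose territory it belongs to. The territory map and field names below are invented for illustration.

    # Illustrative operational-quality check: is this lead actionable?
    # The territory map and field names are invented for this example.
    TERRITORIES = {"US": "rep_amy", "DE": "rep_jonas", "JP": "rep_keiko"}

    def operational_issues(lead: dict) -> list:
        issues = []
        # Linked: the lead is tied to its parent account
        if not lead.get("account_id"):
            issues.append("not linked to an account")
        # Assigned: someone owns the record
        owner = lead.get("owner_id")
        if not owner:
            issues.append("no owner assigned")
        # Routed correctly: the owner matches the geographic territory
        expected = TERRITORIES.get(lead.get("country", ""))
        if owner and expected and owner != expected:
            issues.append("owned by " + owner + ", territory says " + expected)
        return issues

    lead = {"account_id": "001", "owner_id": "rep_amy", "country": "US"}
    print(operational_issues(lead))  # [] means the lead is ready to act on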

Strategic quality - can you make the right decisions with the data?

Competitively, technical quality and operational quality are table stakes: they simply earn you a seat at the poker table. Strategic data quality is what allows you to differentiate and win.

Strategic quality is about appending insights to the data: how relevant is the account/person/engagement? How valuable is it to the business? How effective is the campaign/program/vendor?

Sifting through infinite data with finite resources is not a recipe for efficient, scalable growth. Over 90% of the data a company possesses has minimal value; achieving scale and efficiency requires RevOps to isolate and enhance the 10% that is strategic so the GTM team can make the right decisions. Adding strategic quality to your data lets business users allocate resources properly and compete more effectively, which can show up as:

  • Higher win rates
  • Shorter deal cycles
  • Lower customer acquisition costs
  • Higher margins
  • Lower churn
  • More upsell
  • More repeat buyers

Here is one example to help you understand the difference:

  • Technical quality: the lead record has accurate and up-to-date information
  • Operational quality: the lead record is assigned to the right sales rep
  • Strategic quality: the lead record includes insight to help the sales rep prioritize and strategize the engagement
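
Continuing the same illustration, a strategic-quality step appends an insight, here a toy fit/intent score with invented weights, so reps can prioritize which leads to work first.

    # Toy illustration of strategic quality: append an insight (a fit/intent
    # score) to each lead so reps can prioritize. The weights are invented.
    leads = [
        {"name": "Ada",  "employee_count": 5000, "visited_pricing_page": True},
        {"name": "Bo",   "employee_count": 40,   "visited_pricing_page": False},
        {"name": "Chen", "employee_count": 900,  "visited_pricing_page": True},
    ]

    def fit_score(lead: dict) -> int:
        score = 40 if lead["employee_count"] >= 500 else 10  # firmographic fit
        score += 50 if lead["visited_pricing_page"] else 0   # intent signal
        return score

    for lead in leads:
        lead["fit_score"] = fit_score(lead)

    # Reps work the highest-scoring leads first
    for lead in sorted(leads, key=lambda l: l["fit_score"], reverse=True):
        print(lead["name"], lead["fit_score"])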

Beef up your data quality chops

Data management is one of the key responsibilities of RevOps. Are you surprised by the scope of RevOps data quality? Connect with me here and share your thoughts. You’ll be one of the first to receive the “Guide to RevOps Data Quality.”

Hasanur Rahaman

The proposed three-tier RevOps data quality model seems intriguing and could potentially offer a structured approach.

Andrew Smith

Great piece Ed King, thank you. I also find that prospects are trying to pilot as much tech as possible and collect as much data as possible in an attempt to be “revenue focused”. Inevitably, they feel the tech isn’t doing what they want, and because there is so much data, it’s hard to actually use. This framework, including data quality, is so critical. I have gone back to my old Martech days and am advising prospects and clients to start at the functional level between Sales, Marketing, and Service as a way to get teams to better understand what data, why, and how. Too broad an implementation seems to fracture quickly, with different departments having different views on data.
