Bridging the GTM Data Gap
Uniting Data and Go-to-Market Teams for B2B SaaS Success
The first step to solving a problem is to admit there is a problem and name it.
Data teams and go-to-market (GTM) teams aren’t always focused on the same goals. A shiny object often prevents the data team from fixing important data issues, and the GTM team doesn’t always understand the impact of asking a “quick question.”
These teams could collaborate much better while pursuing shared goals; an aligned team achieves business goals more effectively and efficiently. The sections below walk through a few of the most frequent points of friction between Data and GTM teams.
GTM Tools have incomplete product data
GTM tools are focused on selling to and marketing to prospects. That means their core data set is based on leads, opportunities, and campaigns. A well-orchestrated GTM team will coordinate the data that flows from an initial campaign contact through to a converted lead and subsequent opportunity.
CRMs do a great job of tying leads, opportunities, and contacts to an account. But it’s easy for product data for a lead to be out of sync with aggregated product data for that account.
When a lead takes an action in the product, how long does it take for that action to show up in your CRM and other GTM tools?
For most companies, this happens only a few times a day, or as infrequently as daily.
That means that for any given product action by a lead, there might be hours of lag before that account has updated product information for that lead/contact.
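To make that lag concrete, here is a minimal sketch that computes how long a product event waits before the next scheduled CRM sync can pick it up. The sync schedule and timestamps are illustrative assumptions, not a real integration.

```python
from datetime import datetime

# Hypothetical sync schedule: the GTM tool pulls product data twice a day.
sync_times = [datetime(2024, 5, 1, 6, 0), datetime(2024, 5, 1, 18, 0)]

def hours_until_crm_sees(event_time, sync_times):
    """Hours between a product event and the next scheduled CRM sync."""
    next_sync = min(t for t in sync_times if t >= event_time)
    return (next_sync - event_time).total_seconds() / 3600

# A lead requests a demo at 9:15am; the CRM won't reflect it until 6pm.
lag = hours_until_crm_sees(datetime(2024, 5, 1, 9, 15), sync_times)
print(lag)  # hours the lead looks inactive in the CRM
```

With a twice-daily schedule, a morning action can sit invisible for most of a business day, which is exactly the window when a rep would want to act on it.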
Why don’t these tools have complete product data?
There are a few main reasons: GTM tools model leads, opportunities, and campaigns rather than product events; their APIs are not built to ingest high-volume product data; and syncs run on a schedule rather than in real time. All of this means that product signals – the leading indicators of whether someone is engaging with your product in a meaningful way – are not making their way into the selling process.
It is critical to know whether your prospect is actually using the product during the trial.
If you don’t have product data, you’re not engaging with the customer where they are. What if you could bring the information from product engagement into the sales cycle to prioritize those prospects who are using the product?
That’s the essence of PLG: finding the needle-in-the-haystack signals that help you understand the next best action to take.
Metrics do not have a standard definition
That leads to the next disconnect when you think about how to measure the impact of the product data that you’ve discovered. Is it good that someone has logged into the product three times in the last week? Well, any solid engineer would tell you, “it depends.” You need to define “active user” before you can set a reliable threshold for positive or negative activity.
For example, if a data team is using dbt to define the data set for users, you might need to join your users table with an activity table that logs significant events (logins, key actions, and the like).
There is likely an automated job that counts and rolls up this activity into daily sessions and other counts several times a day.
And there is another process that selects an audience from this combined user view depending upon that count and whether it exceeds the threshold for an “active user.”
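The join, rollup, and audience-selection steps described above can be sketched end to end. This uses SQLite to stand in for the warehouse; the table names, columns, and threshold are illustrative assumptions rather than a real dbt project.

```python
import sqlite3

# Stand-in for the warehouse: a users table joined to an activity log.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE users (user_id TEXT, email TEXT);
    CREATE TABLE activity (user_id TEXT, event TEXT, occurred_on TEXT);
    INSERT INTO users VALUES ('u1', 'ada@example.com'), ('u2', 'bob@example.com');
    INSERT INTO activity VALUES
        ('u1', 'login', '2024-05-01'), ('u1', 'login', '2024-05-02'),
        ('u1', 'login', '2024-05-03'), ('u2', 'login', '2024-05-01');
""")

ACTIVE_THRESHOLD = 3  # sessions in the window that qualify as "active"

# Roll up activity per user, then select the audience over the threshold.
rows = con.execute("""
    SELECT u.email, COUNT(*) AS sessions
    FROM users u JOIN activity a ON a.user_id = u.user_id
    GROUP BY u.email
    HAVING sessions >= ?
""", (ACTIVE_THRESHOLD,)).fetchall()

print(rows)  # only users meeting the agreed-upon threshold
```

The point is not the SQL itself but that the threshold lives in one place; change `ACTIVE_THRESHOLD` and every downstream audience changes with it.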
In this case, what’s an “active user”? It could be a user who logged in at least once this week, a user who completed a key action in the last 30 days, or a user whose rolled-up daily session count clears some threshold.
All of these definitions could be answers to the “active user question” and involve source tables and final models that might not be obvious to the GTM team but represent upstream decisions that need to be made for data management and transformation.
To answer this question, you need one place to go where everyone in the organization can agree on a definition for that metric. You don’t need a “metrics catalog” yet, but even a spreadsheet or a database table showing the name of a thing, its definition, how often it is captured, a high threshold value, a low threshold value, and instructions on what to do if either of those is breached is a great starting point.
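A single catalog entry with the fields just listed can be sketched as a small record type. The metric name, thresholds, and breach instructions here are hypothetical examples, not recommendations.

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    """One row of a lightweight metrics catalog (a spreadsheet works too)."""
    name: str
    definition: str
    capture_frequency: str
    high_threshold: float
    low_threshold: float
    on_breach: str

# Hypothetical entry the Data and GTM teams could agree on together.
active_user = MetricDefinition(
    name="active_user_sessions_7d",
    definition="Count of login sessions per user over the trailing 7 days",
    capture_frequency="daily rollup",
    high_threshold=10,  # unusually engaged: surface to the account owner
    low_threshold=1,    # at-risk: trigger a re-engagement campaign
    on_breach="Notify the owning rep via the CRM task queue",
)
```

Even this much structure forces the conversation that matters: who owns the definition, and who acts when a threshold is crossed.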
This amount of information is enough to give the GTM and Data teams a shared definition of, say, a marketing-qualified lead (MQL).
Practically speaking, the information is assembled from multiple back-end sources into a transformed view that reflects the agreed-upon metric. This is how the process looks in dbt, a popular tool for transforming and delivering data in the warehouse.
The end metrics (like our definition for MQL) end up in a view we can query from the cloud data platform (Snowflake, Postgres, or similar) but the definitions need to be applied upstream for the data team to be able to understand the metric as a query and to apply it to data sources, tables, and views.
This information gives teams the power to work together to address a problem using data as the measurement and justification for solving the problem. That’s much better than fighting over a definition.
But where do you find the actual answer to your query?
It turns out that another problem between the GTM and Data teams is staring you in the face: the data in the GTM tools is starting to smell.
Your Warehouse has fresher data than your GTM tools
Your GTM tools have a physics problem. They cannot magically update themselves with the latest product data because their APIs are not built to handle it. The best solution you have is to take the unique identifier for a person or an account and associate it with the corresponding records in your data warehouse. Here lies the tension: when an event happens in the product, the information is updated in near-real time in your warehouse, while the GTM tools struggle to keep up.
What’s a GTM operator to do? One thing to remember is that your warehouse does have fresh data - it just needs to know the conditions under which to take that data and hydrate your account and contact records with important information. Updating things every time someone takes an action would be noisy and hard to interpret, so one of the important tasks when you establish a metrics catalog is to identify the key events (or aggregated events) that require action by a human or a process.
Here’s an example of a roll-up field (a count of aggregated events) that looks simple but isn’t. Counting how many “hand raiser” events happened at a company during a day or a week means reviewing captured events and filtering for those that meet the hand-raiser criteria (demo request, new account, or similar).
Then, you need to match the ID of the user in your app with the ID of the user in the CRM, handling any cases where you need to find a missing user, create a new one, or merge existing users as needed to account for multiple emails from the same user. Another example of this challenge is when you roll up domains and workspaces that belong to the same account but might be associated with multiple opportunities. You need to be careful to alert the right rep.
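The filter-count-match pipeline above can be sketched in a few lines. The event names, ID formats, and mapping table here are illustrative assumptions; a real implementation would also handle the create and merge paths for unmatched users.

```python
from collections import Counter

HAND_RAISER_EVENTS = {"demo_request", "new_account"}  # illustrative criteria

# Hypothetical captured events: (app_user_id, event_type, day)
events = [
    ("app-1", "demo_request", "2024-05-01"),
    ("app-1", "page_view", "2024-05-01"),
    ("app-2", "new_account", "2024-05-01"),
    ("app-9", "demo_request", "2024-05-01"),  # no CRM match yet
]

# Mapping from app user IDs to CRM contact IDs; a gap means a contact
# must be found, created, or merged before the rollup is trustworthy.
app_to_crm = {"app-1": "crm-100", "app-2": "crm-200"}

daily_counts = Counter()
unmatched = []
for app_id, event_type, day in events:
    if event_type not in HAND_RAISER_EVENTS:
        continue  # filter to events meeting the hand-raiser criteria
    crm_id = app_to_crm.get(app_id)
    if crm_id is None:
        unmatched.append(app_id)  # queue for lookup / create / merge
    else:
        daily_counts[(crm_id, day)] += 1

print(dict(daily_counts))  # hand-raiser count per CRM contact per day
print(unmatched)           # app users the CRM doesn't know about yet
```

Notice that the unmatched queue is where most of the real work hides: until identity resolution is handled, the rollup silently undercounts.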
If you know that the data you need can be combined and recombined and transformed, it’s easier to think of it as a product to be delivered to the GTM teams as needed. Delivering the right product at the right time to the right place makes your data shine, and gives you a “one in a row” win to help the GTM team gain trust that you are working on the right problem.
What’s the takeaway? It’s easy to forget that Data teams and GTM teams are pursuing the same goal: delivering accurate information to sellers and prospects in a timely manner that doesn’t slow things down. The solution resides in defining where product data lives and how to hydrate it into other apps; agreeing on how to work together on change control; and documenting shared metrics from the data warehouse into information the rest of the business can use.