The New Analytics Aggregators (part 1)

In part one, I talk about shifting dynamics in the analytics landscape. In part two, I'll discuss why the data stack must be reimagined. In part three, I'll dive deeper into our approach at Breadcrumb to building the next-generation analytics platform.

Analysis is the new media

The first wave of the internet gave anyone the power to distribute content at no cost. Traditional business moats in media, built around integrated control over publishing and distributing content, were commoditized overnight. New entrants like Netflix, Spotify, and Airbnb took advantage of the technology shift by aggregating distributed content. Massive fortunes in news, music, and video shifted from controlling the means of distribution to aggregating distributors and controlling the means of demand.


Comparison of new and old integrated value in media. Source: Stratechery by Ben Thompson

With the advent of GenAI, a similar shift is happening in the world of analytics.


Analytics value is concentrated in the work required to build data pipelines.

Data pipelines are fragmented, specialized, and complex.


The past decade's winners in the analytics space got there by owning steps in the end-to-end analysis value chain - whether ingestion, transformation, or visualization. Each tool built a moat by integrating inputs that required both specialized software and humans trained on that software. Companies had no choice but to assemble their own data pipeline, picking a tool for each stage in the process: e.g., Fivetran for ingestion, dbt for transformation, Alteryx for pipeline building, Power BI for visualization. Business moats revolved around owning these slices and building technical communities on top of them.
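To make the fragmentation concrete, here's a minimal sketch in plain Python. The three functions are illustrative stand-ins for the separate tools named above, not any vendor's real API; the data is invented.

```python
# A sketch of the traditional fragmented data stack described above.
# Each function stands in for a separate specialized tool; the names
# and data are illustrative, not real vendor APIs.

def ingest(source: str) -> list[dict]:
    """Ingestion stage (the Fivetran slice): pull raw rows from a source."""
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

def transform(rows: list[dict]) -> list[dict]:
    """Transformation stage (the dbt slice): clean and reshape the data."""
    return [row for row in rows if row["amount"] > 0]

def visualize(rows: list[dict]) -> str:
    """Visualization stage (the Power BI slice): summarize for a dashboard."""
    total = sum(row["amount"] for row in rows)
    return f"Total revenue: ${total:,.2f} across {len(rows)} orders"

# Every boundary between stages is a tool boundary: separate software,
# separate specialists, separate contract - the integration cost that
# made each slice defensible.
print(visualize(transform(ingest("orders_db"))))
```

The point of the sketch is the boundaries, not the functions: each arrow in the chain is a place where a vendor could own the slice and the community trained on it.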

Yet decision makers were still blocked from acting on data

Data teams in medium and large companies were able to wrangle big data, but the complexity of the system meant smaller teams were left out. Even dedicated data teams failed to keep up with business-user demand, so data requests took weeks or months. By the time decision makers got insights, those insights were often no longer relevant.

Decision makers are blocked by data teams


For the first time in decades, AI agents hold the keys to empowering decision makers.


AI agents will disrupt the value chain and commoditize data pipelines

In the same way the internet brought the marginal cost of distributing content to zero, the advent of AI agents is bringing the marginal cost of analysis to zero. Though still early, data engineering and analysis tasks - once done solely by humans - can increasingly be handled by AI agents in real time. In mid-market and enterprise companies, it takes weeks or months to ingest and transform data into personalized reports; shifting that human work to code automates analytics and reporting in one step. What used to take months can now take minutes.
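For a sense of what zero-marginal-cost analysis looks like in practice, here's a minimal, hypothetical sketch: a plain-language question goes in, an agent produces a query, and a result comes back at ask time. The generate_sql stub stands in for an LLM call, and the in-memory database stands in for a warehouse; this is an illustration under those assumptions, not any product's actual pipeline.

```python
# A hypothetical sketch of agent-driven analysis: a plain-language
# question becomes a query and a result with no analyst in the loop.
# generate_sql is a stub standing in for an LLM call.
import sqlite3

def generate_sql(question: str) -> str:
    """Stand-in for an AI agent: a real one would prompt a model with
    the warehouse schema plus the user's question."""
    return ("SELECT region, SUM(amount) AS revenue "
            "FROM orders GROUP BY region ORDER BY revenue DESC")

def answer(question: str) -> list[tuple]:
    # Toy in-memory warehouse so the example runs end to end.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("EMEA", 120.0), ("AMER", 75.5), ("AMER", 200.0)])
    return conn.execute(generate_sql(question)).fetchall()

# The question is answered at ask time - minutes (here, milliseconds)
# instead of a weeks-long request queue.
print(answer("Which region drives the most revenue?"))
```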

The analysis moat is shifting to data stack aggregators

As AI takes over human tasks in the analysis stack, value and defensibility shift from owning components of data pipelines to integrating them. The components themselves become commodities: the 'hard and valuable thing' moves from building parts of the data stack to building the experience for interacting with it.

The new standard in the data experience will be analytics aggregators

The new experience for non-technical users to work with data is… new. Instead of building for someone who needs to wrangle data, we're building for someone who wants to understand how to increase revenue, discover efficiencies, and cut costs - while abstracting away traditional data tooling.

Shifting the user adds a new dimension to the value chain

This also makes the decision maker a first-class participant in analysis tools. The prior generation of analysis tools succeeded by selling to data teams and making it easier to perform data work. Moving data tasks to AI, however, creates a new opportunity: building data experiences for managers and decision makers, for whom traditional tools were too complex.

Data aggregators are the future of analytics

I'm calling companies that digitize data workflows - tooling and services - Data Aggregators.

Breadcrumb is our bet on what this new data aggregation experience will look like. Our goal is to build the simplest and most intelligent analysis platform.

Since GenAI first emerged, the Breadcrumb team has been focused on exploring what this new experience could be. We're constantly iterating with customers to make sure we get there.

If you have thoughts or opinions, I'd love to hear them.


Marcus Ellison

Founder, Breadcrumb.ai - the simplest, most intelligent analytics platform

We've reimagined the entire analysis process - from ingestion and transformation to visualization and delivery - empowering teams to reduce manual work, accelerate decision making, and deliver customized insights to customers and accounts.
