The Gravy Analytics Breach – A data-harvesting grey area?

At the turn of 2025, the location data broker Gravy Analytics revealed a breach by Russian cybercriminals that is thought to have compromised tens of millions of mobile device coordinates and at least 300,000 personal email addresses.

Why does this story stand out to me from other data breaches?

The hackers’ methods interest me less than the methods used to collect the data in the first place.

Gravy Analytics appears to have collected none of its 17 billion daily location data signals directly from data subjects – instead, its database was fed through a handful of legal grey areas in data collection. From a consumer’s point of view, there is little difference between this and an activity-logging attack.

What are these legal grey areas in data collection?

The short answer is that we still don’t know for sure exactly how Gravy Analytics collected this data. However, analysis by the FTC and the cybersecurity expert community has identified three likely methods:

1. Real-Time Bidding (RTB)

When a webpage or an ad-supported app is opened, an automated auction takes place in milliseconds, with advertisers competing to win the ad space the user sees. A consequence of this auction is that data brokers can ‘listen in’ on the process, collecting device data such as demographic information, browsing history, location, and IP address.

A company can collect this data whether or not it wins the auction – or even places a bid – and the app user is completely unaware that it is happening.
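To make this concrete, the kind of device information exposed in a bid request can be sketched with a simplified, OpenRTB-style structure. All field values, the app bundle name, and the `harvest` helper below are illustrative assumptions, not Gravy Analytics’ actual data feed:

```python
# Simplified, OpenRTB-style bid request. Every auction participant
# receives a payload like this, whether or not it ever wins or bids.
bid_request = {
    "id": "auction-1234",
    "device": {
        "ip": "203.0.113.7",                       # IP address
        "geo": {"lat": 51.5074, "lon": -0.1278},   # precise location
        "ua": "Mozilla/5.0 (Android 14; Mobile)",  # user agent
    },
    "user": {"yob": 1990, "gender": "F"},          # demographics
    "app": {"bundle": "com.example.weather"},      # which app was opened
}

def harvest(request: dict) -> dict:
    """What a passive 'listener' can log without ever winning the auction."""
    device = request.get("device", {})
    return {
        "ip": device.get("ip"),
        "location": device.get("geo"),
        "app": request.get("app", {}).get("bundle"),
    }

print(harvest(bid_request))
```

Note that nothing in the auction protocol itself distinguishes a genuine advertiser from a broker that is only there to log the request.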

2. Software Development Kits (SDKs)

SDKs are bundles of tools used to build apps with complex functionality, such as serving ads. Some SDKs collect and transmit location data in the ‘background’ of the apps they were used to build, without the developers’ awareness.

There is evidence that a Google SDK delivering ads was silently collecting data from a huge number of apps, and that this functionality was used to extract the data.

3. Data Sharing

There is every possibility that the data was simply bought: from the apps themselves, or on the vast and largely unregulated data market. Some popular apps have been accused in the past of selling user data to marketers and brokers. Though no specific company has yet been named in this incident, the fact remains that most data subjects have no idea what third-party sharing of their information is taking place.

What is the impact of the hack?

The location data collected is pseudonymous – that is, it can’t inherently be used to identify an individual. However, in aggregate, it is easy to infer who owns the device sending location signals and what that person is doing on a daily basis.

This information had previously been available to data brokers at a price. Following the hack, there is not even a financial barrier to accessing it. If a person uses an app implicated in the breach, data indicating where they live, work, and play is now exposed. Sensitive data is also involved: usage data from dating apps, for example, is collected, as are visits to hospitals and places of worship.

All this data was collected from unwitting, unconsenting data subjects. It can be used to identify individuals, and infringement of civil liberties and threats to physical safety are direct consequences of the hack.

The breach represents a violation of privacy that almost no country’s legislation accounts for directly, but that still puts data controllers at risk of enforcement action.

What should my organisation do about this?

This evolving situation makes two key protective actions increasingly important:

  • Action 1: it is vital for organisations to understand their full software supply chains. If SDKs are used to develop your apps, governance must be in place that supports your engineers in assuring that external packages are safe and secure (for example, knowing whether location data sharing is enabled by default and whether data is silently leaking to other organisations). If your app serves advertising on its platform, you must know who is accessing the data involved. Configure bid response filters so that only buyers who comply with your privacy policies can participate in RTB auctions. This doesn’t remove the risk entirely – passive participants in auctions can still collect data – but it is a meaningful mitigation.


  • Action 2: the Gravy Analytics incident highlights the need to conduct business ethically. Brokers whose datasets seem ‘too good to be true’ should raise alarm bells. Perform due diligence on every third party you engage with, because in breaches like this one the risk and consequences fall on the buyer of personal data as well as the collector.
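The bid response filtering mentioned in Action 1 can be sketched as a simple allowlist check before a buyer is admitted to an auction. The buyer IDs, policy fields, and registry structure here are illustrative assumptions, not a real ad-exchange API:

```python
# Sketch of a bid-eligibility filter: only buyers that have signed our
# data-processing agreement (DPA) and do not resell location data may
# participate in the auction. The registry below is hypothetical.
APPROVED_BUYERS = {
    "buyer-001": {"dpa_signed": True,  "resells_location": False},
    "buyer-002": {"dpa_signed": True,  "resells_location": True},
    "buyer-003": {"dpa_signed": False, "resells_location": False},
}

def eligible(buyer_id: str) -> bool:
    """A buyer may bid only if it is registered, has a signed DPA,
    and does not resell location data under our privacy policy."""
    policy = APPROVED_BUYERS.get(buyer_id)
    return bool(policy
                and policy["dpa_signed"]
                and not policy["resells_location"])

# Unknown buyers (e.g. "buyer-999") are rejected by default.
bidders = ["buyer-001", "buyer-002", "buyer-003", "buyer-999"]
allowed = [b for b in bidders if eligible(b)]
print(allowed)
```

The important design choice is the default-deny stance: a buyer absent from the registry is excluded, rather than admitted until someone objects.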


Please get in touch if you’d like to talk more about how to implement governance and ethical frameworks that are easy for your teams to work with and that can help protect you and your customers.

About the Authors

John Michaelides is a Data Privacy, Security and Ethics Senior Principal with Slalom UK, a progressive consulting firm pioneering Modern Culture of Data and AI for All.

Bronwyn Burns-Tilney is a Privacy and Ethics Consultant with Slalom UK.
