Sensitive Locations

Do you work in a sensitive location?

On January 9th, the US Federal Trade Commission settled a case with data broker X-Mode Social.

Founded in 2013, X-Mode Social develops software that is bundled into more than 300 smartphone apps, according to Wikipedia. The company’s software collects geolocation data from smartphone GPS and Bluetooth devices and reports back to the company every few minutes with each user’s location and a unique identifier. For this information, Wikipedia states, the company pays app makers roughly 3 cents per US user per month. Data from international users fetches just half a cent per user per month.

Geolocation data can be incredibly sensitive. Consider a woman who visits an abortion clinic. That doesn’t tell you for sure that the woman had an abortion—she might have been accompanying a friend, or she might just have gotten counseling, or she might have accessed other reproductive services. But in any event, this is information that the woman in question would probably want kept quiet. And because X-Mode Social’s software is so widely deployed—it’s in more than 300 apps, including dating apps, says Wikipedia—there’s a good chance that the company’s computers have records of many, many women who have sought reproductive health services.

The FTC’s settlement with X-Mode puts significant limits on X-Mode’s ability to share or sell what the FTC has termed “sensitive location information.” Sensitive locations, according to the FTC, include medical and reproductive health clinics, places of religious worship and domestic abuse shelters.

According to the FTC’s complaint, X-Mode made it possible to tie so-called Mobile Advertiser IDs to locations that would tell you a lot about an individual, such as visits to a “Size Inclusive Clothing Store,” “Firehouses,” “Military Bases” and “Veterans of Foreign Wars.” They don’t know who you are, but they know where you go.
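To see why a “pseudonymous” Mobile Advertiser ID offers so little protection, here is a minimal sketch in Python. The identifiers, field names and place categories are all made up for illustration—this is not X-Mode’s actual data format—but the point stands: once location pings are grouped by advertiser ID, visits to sensitive places fall out of a few lines of code.

```python
# Illustrative sketch with hypothetical data: a stream of
# (advertiser_id, place) records reveals sensitive visits even
# when no name is attached to the ID.
SENSITIVE_CATEGORIES = {
    "reproductive health clinic",
    "place of worship",
    "domestic abuse shelter",
}

# Hypothetical location "pings" as a data broker might receive them.
pings = [
    {"maid": "a1b2-c3d4", "place": "Size Inclusive Clothing Store"},
    {"maid": "a1b2-c3d4", "place": "reproductive health clinic"},
    {"maid": "e5f6-g7h8", "place": "Military Bases"},
]

def flag_sensitive_visits(pings):
    """Group pings by advertiser ID, keeping only sensitive places."""
    visits = {}
    for p in pings:
        if p["place"] in SENSITIVE_CATEGORIES:
            visits.setdefault(p["maid"], []).append(p["place"])
    return visits

print(flag_sensitive_visits(pings))
# {'a1b2-c3d4': ['reproductive health clinic']}
```

Anyone who can later link the ID `a1b2-c3d4` to a person—through an ad click, an app login, or another data purchase—learns everything in that history at once.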

(X-Mode was acquired by Digital Envoy in 2021, which now operates under the name Outlogic.)

The Federal Trade Commission has become the nation’s de facto privacy regulator, arguing that its authority under the Federal Trade Commission Act to prohibit “unfair or deceptive acts or practices in or affecting commerce” gives it the power to enforce companies’ privacy policies.

If you think nobody reads these privacy policies, I’ve got news for you: they aren’t for consumers—they are for the lawyers at the FTC. So, it’s important that they be detailed and correct. If they aren’t detailed, companies can get in trouble for not disclosing something that is typically disclosed — that would be unfair. And if they are detailed but wrong, that’s deceptive.

The settlement doesn’t prohibit X-Mode from collecting sensitive information—in part because it’s hard to stop collecting data once you have data-collecting software distributed to millions of smartphones. But it does stop X-Mode from distributing or sharing these data.

This is a big deal, and because it is a settlement, there’s no appeal. That’s good for everybody involved except for Kochava, the largest seller of this kind of information. The FTC sued Kochava back in August 2022. That first suit was dismissed by a federal judge in May 2023. The FTC filed an amended complaint under seal in June 2023, and the complaint was just released last November.

Location information is truly some of the most sensitive information our smartphones collect about us. It’s also incredibly useful to let some apps collect these kinds of data: I worked at a company that used a program called OpenPath to activate the prox card readers in the elevators. By letting the OpenPath app track my location over Bluetooth, I could have the elevator buzz me up to my floor without taking my smartphone out of my pocket. By letting my Google Home app know my location, it can automatically crank up the thermostat in my apartment when I get home and lower it when I leave. But these kinds of services are just too dangerous for many people without strong privacy protections—protections that go far beyond the FTC settlement, unfortunately.
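The home-automation use case above boils down to a simple geofence check: is the phone within some radius of a stored home location? Here is a minimal sketch of that check. The coordinates and the 100-meter radius are made-up values for illustration, not taken from any real product.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

HOME = (42.3601, -71.0589)  # hypothetical home coordinates
GEOFENCE_M = 100            # hypothetical trigger radius in meters

def at_home(lat, lon):
    """True when the reported fix falls inside the home geofence."""
    return haversine_m(lat, lon, *HOME) <= GEOFENCE_M

print(at_home(42.3601, -71.0589))  # True: fix is at the home point
print(at_home(42.3700, -71.0589))  # False: roughly 1.1 km away
```

The privacy problem is that the same stream of fixes that makes this check possible also tells anyone who obtains it exactly when you are, and aren’t, at home.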

The problem is that the FTC’s definition of “sensitive location information” is too limited. All locations are potentially sensitive. However, in the absence of stronger privacy legislation at the national level, this is the best we’re likely to get in the US for many years to come. This is why location information for people inside the US appears to be worth six times more than for people outside the US—there’s just so much more that companies can do with it.

Simson Garfinkel is the Chief Scientist at BasisTech, LLC. His most recent book, Law and Policy for the Quantum Age — which follows the history of computing, quantum computing, and the outlook for both business and national security — is available as an audiobook from Audible.

Shannon Dempsey

SEO, Social Media and Content Marketer at Prose Piranha

10 months

This is a great article, Simson. I didn't know many of these things. I tell all apps that they can only use my location info when I'm using the app & that's only, if they need to know it to function.

Andrew Oram

Editorial Board Member at Linux Professional Institute

10 months

Very timely, Simson. Massachusetts is considering a broad "Location Shield Act" (H.357|S.148): https://www.aclum.org/en/ban-sale-location-data#:~:text=commonsense%20privacy%20reform.-,THE%20LOCATION%20SHIELD%20ACT,cell%20phone%20location%20information%20(H