FTC Bans Mobile App Software Developer from Selling Sensitive Data, Requires Data Deletion
The Federal Trade Commission recently announced the acceptance, pending final approval, of an Agreement and Consent Order with X-Mode Social, which was rebranded as Outlogic in 2021 following a joint venture and transfer agreement. Below, we look at the background of the company’s business, the alleged violations of the FTC Act, and the proposed obligations under the Agreement and Order, along with takeaways for other organizations to consider in their privacy risk management programs as the FTC’s active enforcement streak continues into 2024.
Background
X-Mode’s business includes providing a Software Development Kit (SDK) that mobile application developers at other companies could easily incorporate into their products. X-Mode’s SDK (branded as “XDK”) would collect precise geolocation information from users of the apps in which it was installed and, in turn, provide the companies publishing those apps with revenue from X-Mode based on the number of daily active users. The geolocation accuracy was described as “70% accurate within 20 meters or less”. X-Mode would enrich the device user and geolocation data with other sources and classifications it developed or purchased separately, including matching businesses to latitude and longitude in order to segment users based on the types of places they might have visited. X-Mode would then license (sell) the raw or enriched data to companies for uses such as product development and marketing.
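To make the enrichment step concrete, here is a minimal sketch of how a single location ping might be matched to nearby venues and tagged with venue categories. The places table, field names, and 20-meter matching radius are assumptions invented for the example, not a description of X-Mode’s actual pipeline.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical reference data: venue name, category, latitude, longitude.
PLACES = [
    ("Example Coffee Shop", "coffee_shop", 39.9612, -82.9988),
    ("Example Gym", "fitness", 39.9650, -83.0010),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def enrich_ping(device_id, lat, lon, radius_m=20):
    """Tag a single location ping with the categories of venues within radius_m."""
    categories = sorted(
        category
        for _, category, plat, plon in PLACES
        if haversine_m(lat, lon, plat, plon) <= radius_m
    )
    return {"device_id": device_id, "lat": lat, "lon": lon, "categories": categories}

# A ping roughly 15 meters from the first venue is tagged with its category.
print(enrich_ping("maid-example-1234", 39.9613, -82.9989))
```

Repeated across millions of daily pings, this kind of matching is what turns raw coordinates into the audience segments described in the allegations below.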
As of January 2021, X-Mode advertised a reach of 60MM monthly active users, representing 25% of the US adult population, and up to 10% of users from countries including Canada, Mexico, Brazil, Japan, Australia, Singapore, the U.K., Spain, Italy, and France. In addition to the SDK and data licensing, X-Mode also developed its own apps, which were the company’s original business before it discovered the value of its location data. These included Drunk Mode, advertised as a tool for inebriated individuals to manage their desired or undesirable activities, and Walk Against Humanity, which prompted users with snarky motivational messages about their daily fitness activities.
Specific Allegations
The complaint alleged X-Mode unfairly engaged in the following violations under the FTC Act:
(1) Selling sensitive data: X-Mode’s location data could easily be plotted to sensitive locations using publicly available map programs. This could include identifying specific healthcare facilities, places of worship, welfare organizations, or other locations from which sexuality, health conditions, or religious beliefs could be inferred. Though X-Mode included in its customer terms that the data could not be used “to associate any user, device or individual with any venue that is related to healthcare, addiction, pregnancy or pregnancy termination, or sexual orientation…”, the contractual terms were considered insufficient given the availability of additional data and the ease of matching that data.
(2) Failing to honor consumers’ privacy choices: Mobile devices include a unique identifier called a Mobile Advertiser ID, or MAID. On Android devices from 2013 to 2021, when consumers enabled an option to “Opt-out of Ad Personalization”, the MAID would still be sent to the mobile application but would include a flag indicating that the user had opted out of its use for advertising. From June 2018 until July 2020, X-Mode allegedly did not honor this flag and provided the data to marketers (a minimal sketch of what honoring the flag would involve appears after this list).
(3) Inadequate notice regarding collection and use of location data from apps: For certain periods until at least August 2020, X-Mode failed to disclose to users of its own apps (Drunk Mode and Walk Against Humanity) how location data would be used. The privacy notices indicated data would be shared with X-Mode customers for advertising but did not disclose that the information was also sold to government contractors for national security purposes.
(4) Failing to validate whether app publishers gathered informed consent: X-Mode’s primary mechanism or control for ensuring that third-party app publishers gathered consent was contractual terms. The FTC considered this insufficient, as additional methods for auditing could reasonably have been applied.
(5) Categorizing consumers based on sensitive characteristics for marketing purposes: X-Mode created custom audience segments based on sensitive data, including health information. In one example, X-Mode was contracted by a clinical research company to develop custom audiences of consumers who visited cardiology, endocrinology, or gastroenterology offices in Columbus, OH, or specific infusion centers, and spent 30 minutes to one hour or more at those locations.
(6) Deceptive failure to disclose use of location data: The omission of the fact that location data was shared with and used by government contractors for national security purposes was deceptive to users, who might otherwise have reconsidered their use of the application or their provision of consent.
(7) Providing the means and instrumentalities to engage in deceptive acts or practices: X-Mode primarily collects location data from third-party app publishers that incorporate the XDK into their applications. This means X-Mode relies on those customers to provide notice and gather consent where required. X-Mode provided suggested consent language to those app publishers; however, that template language did not properly disclose how the geolocation data would be used, including use by government contractors for national security purposes.
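Regarding allegation (2) above, honoring the opt-out signal is, at a technical level, a matter of filtering out records whose devices carry the opt-out flag before the data is used or sold for advertising. The sketch below illustrates the general idea only; the field names and record structure are assumptions invented for the example, not X-Mode’s schema.

```python
# Illustrative location records as they might arrive from an SDK; all field
# names and values are assumptions invented for this example.
pings = [
    {"maid": "aaaa-1111", "lat": 39.9612, "lon": -82.9988, "limit_ad_tracking": False},
    {"maid": "bbbb-2222", "lat": 40.7128, "lon": -74.0060, "limit_ad_tracking": True},
]

def eligible_for_ad_use(records):
    """Drop records from devices whose users opted out of ad personalization."""
    return [r for r in records if not r["limit_ad_tracking"]]

# Only the first record survives; the opted-out device is excluded from any
# advertising-related sharing or sale.
for record in eligible_for_ad_use(pings):
    print(record["maid"])
```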
Agreement and Order
The proposed Order, pending approval, includes obligations that range from restrictions on certain activities, to requirements to develop new internal monitoring programs and processes, to the deletion of data that is core to the company’s business. The Order is effective for 20 years and includes the following requirements:
Takeaways
There is a lot to unpack and take away from this case for privacy, legal, compliance, security, product, marketing and, frankly, the C-suite. A few key items that can be applied to most organizations include the following at a minimum:
About the Author
Brian Segobiano, CIPP/E, leads the data, privacy, and cybersecurity practice for Epsilon Economics and Life Sciences, a consulting firm focused on expert witness, litigation, and compliance. He and his team focus on supporting organizations and outside counsel with complex internal or regulatory investigations (e.g., FTC, OCR), class action privacy litigations, and independent monitorships related to data protection orders. With a background in database coding and analytics, Brian also serves as a Data Protection Officer for global organizations looking to develop, maintain, or mature their data protection programs.