

Documents in an FTC lawsuit detail the stunning amount that data brokers know about you and everyone else.

By Anne Toomey McKenna, J.D.
Visiting Professor of Law
University of Richmond
Introduction
Kochava, the self-proclaimed industry leader in mobile app data analytics, is locked in a legal battle with the Federal Trade Commission in a case that could lead to big changes in the global data marketplace and in Congress' approach to artificial intelligence and data privacy.
The stakes are high because Kochava's secretive data acquisition and AI-aided analytics practices are commonplace in the global location data market. In addition to numerous lesser-known data brokers, the mobile data market includes larger players like Foursquare and data market exchanges like Amazon's AWS Data Exchange. The FTC's recently unsealed amended complaint against Kochava makes clear that there's truth to what Kochava advertises: it can provide data for "Any Channel, Any Device, Any Audience," and buyers can "Measure Everything with Kochava."
Separately, the FTC is touting a settlement it just reached with data broker Outlogic, in what it calls the "first-ever ban on the use and sale of sensitive location data." Outlogic has to destroy the location data it has and is barred from collecting or using such information to determine who comes and goes from sensitive locations, like health care centers, homeless and domestic abuse shelters, and religious places.
According to the FTC and proposed class-action lawsuits against Kochava on behalf of adults and children, the company secretly collects, without notice or consent, vast amounts of consumer location and personal data. It then analyzes that data using AI, which allows it to predict and influence consumer behavior in ways that are impressively varied and alarmingly invasive, and serves it up for sale.
Kochava has denied the FTC's allegations.
The FTC says Kochava sells a "360-degree perspective" on individuals and advertises that it can "connect precise geolocation data with email, demographics, devices, households, and channels." In other words, Kochava takes location data, aggregates it with other data and links it to consumer identities. The data it sells reveals precise information about a person, such as visits to hospitals, "reproductive health clinics, places of worship, homeless and domestic violence shelters, and addiction recovery facilities." Moreover, by selling such detailed data about people, the FTC says "Kochava is enabling others to identify individuals and exposing them to threats of stigma, stalking, discrimination, job loss, and even physical violence."
I'm a lawyer and law professor practicing, teaching and researching AI, data privacy and evidence. These complaints underscore for me that U.S. law has not kept pace with regulation of commercially available data or governance of AI.
Most data privacy regulations in the U.S. were conceived in the pre-generative AI era, and there is no overarching federal law that addresses AI-driven data processing. There are congressional efforts to regulate the use of AI in decision-making, like hiring and sentencing. There are also efforts to provide public transparency around AI's use. But Congress has yet to pass legislation.
What Litigation Documents Reveal
According to the FTC, Kochava secretly collects and then sells its "Kochava Collective" data, which includes precise geolocation data, comprehensive profiles of individual consumers, consumers' mobile app use details and Kochava's "audience segments."
The FTC says Kochava's audience segments can be based on "behaviors" and sensitive information such as gender identity, political and religious affiliation, race, visits to hospitals and abortion clinics, and people's medical information, like menstruation and ovulation, and even cancer treatments. By selecting certain audience segments, Kochava customers can identify and target extremely specific groups. For example, this could include people who identify their gender as "other," or all pregnant women who are African American and Muslim. The FTC says selected audience segments can be narrowed to a specific geographical area or, conceivably, even down to a specific building.
By "identify," the FTC explains, it means that Kochava customers can obtain the name, home address, email address, economic status and stability, and much more data about people within selected groups. This data is purchased by organizations like advertisers, insurers and political campaigns that seek to narrowly classify and target people. The FTC also says it can be purchased by people who want to harm others.
How Kochava Acquires Such Sensitive Data
The FTC says Kochava acquires consumer data in two ways: through Kochava's software development kits that it provides to app developers, and directly from other data brokers. The FTC says those Kochava-supplied software development kits are installed in over 10,000 apps globally. Kochava's kits, embedded with Kochava's coding, collect troves of data and send it back to Kochava without consumers being told of, or consenting to, the data collection.
Another lawsuit against Kochava in California alleges similar charges of surreptitious data collection and analysis, and that Kochava sells customized data feeds, based on extremely sensitive and private information, precisely tailored to its clients' needs.
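To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of record an analytics SDK embedded in a mobile app could assemble before transmitting it to a broker. The endpoint, field names and function are illustrative assumptions for this article, not Kochava's actual code; the point is how a stable device identifier paired with precise coordinates lets pings from many apps be stitched into one movement profile.

```python
import json
import uuid

# Hypothetical example only: the endpoint, field names and function
# below are assumptions for illustration, not any real SDK's code.
BROKER_ENDPOINT = "https://collector.example-broker.com/v1/events"  # placeholder

def build_location_event(device_id: str, lat: float, lon: float, app_id: str) -> str:
    """Bundle a persistent device identifier with precise coordinates.

    Tying a stable ID to GPS fixes is what allows a data broker to link
    one app's pings to another's and reconstruct a person's movements.
    """
    event = {
        "device_id": device_id,  # stable advertising identifier
        "app_id": app_id,        # which host app produced the ping
        "lat": round(lat, 6),    # six decimal places is roughly 0.1 m
        "lon": round(lon, 6),
        "event": "location_ping",
    }
    return json.dumps(event)

# Usage: one simulated ping; a real SDK would POST this to the collector.
payload = build_location_event(str(uuid.uuid4()), 43.4917, -112.0339, "com.example.weather")
print(payload)
```

Nothing in this sketch requires the consumer's awareness: the host app calls the kit, and the kit reports home, which is the dynamic the FTC's complaint describes.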
AI Pierces Your Privacy
The FTC's complaint also illustrates how advancing AI tools are enabling a new phase in data analysis. Generative AI's ability to process vast amounts of data is reshaping what can be done with and learned from mobile data in ways that invade privacy. This includes inferring and disclosing sensitive or otherwise legally protected information, like medical records and images.
AI provides the ability both to know and to predict just about anything about individuals and groups, even very sensitive behavior. It also makes it possible to manipulate individual and group behavior, inducing decisions in favor of the specific users of the AI tool.
This type of "AI coordinated manipulation" can supplant your decision-making ability without your knowledge.
Privacy in the Balance
The FTC enforces laws against unfair and deceptive business practices, and it informed Kochava in 2022 that the company was in violation. Both sides have had some wins and losses in the ongoing case. Senior U.S. District Judge B. Lynn Winmill, who is overseeing the case, dismissed the FTC's first complaint and required more facts from the FTC. The commission filed an amended complaint that provided much more specific allegations.
Winmill has not yet ruled on another Kochava motion to dismiss the FTC's case, but as of a Jan. 3, 2024, filing in the case, the parties are proceeding with discovery. A 2025 trial is expected, but a date has not yet been set.
For now, companies, privacy advocates and policymakers are likely keeping an eye on this case. Its outcome, combined with proposed legislation and the FTC's focus on generative AI, data and privacy, could spell big changes for how companies acquire data, the ways that AI tools can be used to analyze data, and what data can lawfully be used in machine- and human-based data analytics.
Originally published by The Conversation, 01.12.2024, under the terms of a Creative Commons Attribution/No derivatives license.


