A Choice We Make in the Age of Technology: Even GoodRx (Allegedly) Makes Money Off Your Data
11:20 AM, Jan. 31, 2026
Perhaps Google, Meta, and other mega-cap technology companies can already distinguish between those who understand that personal data is a commodity and those who do not. Unless you have lived under a rock for the past two decades, you have probably heard of companies using, and selling, your personal data. Nearly everything can be reduced to a data point, transmitted to a remote data warehouse, and stored indefinitely for future use, replication, or dissemination. Although state legislatures are increasingly considering privacy and algorithmic-protection bills, these practices remain a recurring subject of litigation. As personal data becomes the most valuable commodity in the digital economy, the gap between consumer autonomy and corporate incentive continues to widen.
It is also easy to identify the players with the strongest profit incentives. The next frontier of the digital age lies in algorithm development and AI, and many of these tech “fat-cats” are unsurprisingly devoting enormous resources to staying competitive with the rest of their pride. Still, uncertainties remain. How are we to weigh autonomy and privacy against capitalism and technological advancement when the latter two remain largely indifferent to the moral costs they impose? Even for health data, among the most sensitive categories of personal information, the line appears blurred.

For example, consider GoodRx, an online platform offering free prescription coupons (with optional paid subscriptions) that often save users hundreds, if not thousands, of dollars per pickup. Sounds like a good deal, right? For those relying on lower costs, the savings are real. But these benefits come at a price: the exposure of highly personal information.
Right now, GoodRx, Criteo Corp., Meta, and Google are being sued for allegedly sharing and using GoodRx users' medical data unlawfully. Most recently, a district court judge in the Northern District of California denied a motion for preliminary approval of a class settlement. The medical information was allegedly used for a variety of purposes, in particular targeted advertising. And because health data is so deeply personal and sensitive, sharing it raises sharper ethical concerns than selling ordinary consumer data does.
This was the second time in this case that the court denied a motion for preliminary approval of a class settlement. Judge Martínez-Olguín observed that “the parties have sweetened the pot in this most recent proposed settlement by adding [Criteo as a] defendant, [but] largely failed to address . . . previously identified concerns.” The newest settlement agreement provided for a payment of $32 million, up from the $25 million proposed previously. While the Court appeared to appreciate the increase, the rationale presented in the latest motion did not sufficiently justify the settlement amount or account for the number of class members who could be expected to file a claim. In its ruling, the Court noted three key deficiencies on this point:
- (1) the lack of a claim-by-claim analysis to assess the fairness and sufficiency of the settlement;
- (2) the failure of Plaintiffs to identify any conducted discovery substantiating the validity of their claims; and
- (3) that, under a standard assumption about the claim-filing rate, the average recovery would not be commensurate with the allegations underlying the 16 causes of action.
While the merits of this case have not been decided in full, it is yet another example of what is increasingly becoming contemporary business practice. Large corporations are amassing lawsuits alleging that their algorithm-driven products, which are fed by consumer data, cause harm, and the companies are settling. Even if Google and Meta are holding out on settlement talks to see how Criteo fares first, the common denominator is that such gargantuan companies are willing to suffer a slap on the wrist if it means they can train their algorithms to gain a competitive advantage. In other words, the lawsuits are simply a cost of doing business. It is time to consider alternative punitive measures or new statutory frameworks, weighing whether such measures would be feasible without stifling technological progress. So long as settlements are treated as a cost of doing business, however, courts will continue to police harms that legislatures and regulators struggle to constrain, and consumers, not companies, will continue to foot the bill for innovation.
P. Andrew Kinneberg
Andrew received his undergraduate degree from Appalachian State University, where he majored in Mathematics and minored in Computer Information Systems. After graduation, he worked as an IT Project Manager for almost three years. Now a 2L at the University of North Carolina School of Law, Andrew is a JOLT staff member and the Vice President of the student Tax Law Association.