
OkCupid's AI Data Deal: A Lesson in Privacy Oversight
- OkCupid transferred 3 million user photos to AI firm Clarifai in 2014 without consent or written agreement
- The transfer violated OkCupid's privacy policy, which limited photo use to matchmaking purposes only
- OkCupid executives held investment stakes in Clarifai at the time of the transfer, creating a conflict of interest
- The case took 11 years to resolve from the 2014 transfer, with no financial penalties imposed on either party
Match Group's OkCupid handed over three million user photos to AI company Clarifai in 2014 for facial recognition training, without telling anyone, without a written agreement, and without any restriction on how those images could be used. The Federal Trade Commission announced this week that Clarifai has now deleted those photos and any models trained on them, resolving an investigation that opened in 2019 but took until 2025 to conclude.
The transfer violated OkCupid's own privacy policy, which promised user photos would only be used for matchmaking purposes. According to the FTC's complaint, Clarifai received the dataset to train facial recognition technology that it could then sell commercially. The deal was brokered informally—no contracts, no data-sharing agreements, no usage restrictions.
What makes this particularly uncomfortable: OkCupid executives at the time held investment stakes in Clarifai, according to the FTC filing.
The agency doesn't specify which executives or how much they invested, but the conflict of interest is hard to miss. Dating app users don't expect their photos to become commercial AI training data. They certainly don't expect their dating app to be quietly seeding the portfolio companies of its own leadership.
This settlement isn't just about OkCupid. It's a window into what was happening across dating apps in the mid-2010s, when AI was hungry for faces and nobody was watching the supply chain. The real question isn't whether Clarifai will delete these models—it's what other datasets from that era are still circulating, and whether today's dating operators learned the lesson or just learned to hide the transfers better.
The FTC got a deletion order but couldn't impose fines, which means the deterrent here is reputational damage from public disclosure. That only works if operators care about their reputation more than they care about AI partnerships.
What the Settlement Actually Says
Clarifai did not admit wrongdoing as part of the settlement, which is standard for FTC consent decrees. The company certified to the agency that it had deleted all OkCupid photos, related facial recognition data, and any models trained on that dataset. Crucially, Clarifai also certified that it "had not shared the information with other parties," according to the FTC's statement.
That claim cannot be independently verified, and it's worth noting that facial recognition models from 2014 onwards were often benchmarked, shared in academic collaborations, or used to train subsequent models. The FTC appears to have taken Clarifai at its word.
The timeline here is telling. OkCupid transferred the photos in 2014. The FTC opened its investigation in 2019, five years later. The settlement was announced in January 2025, another six years after that. That's an eleven-year gap between the data transfer and resolution.
The FTC's authority to act was also limited. Because the case rested on the FTC Act's prohibition on deceptive practices, which carries no civil penalty authority for first-time violations, the agency could only secure injunctive relief. Translation: Clarifai had to delete the data, but didn't have to pay anything.
The 2014 Data Gold Rush
What's striking about this case isn't that it happened—it's how casually it happened. No formal agreement. No legal review, apparently. No technical restrictions preventing Clarifai from doing whatever it wanted with the images once it had them.
This was the AI training data market in 2014: informal, relationship-driven, and entirely unregulated.
Dating apps had millions of faces, helpfully tagged with age, gender, and other attributes. AI companies needed exactly that. The match was obvious.
OkCupid was owned by IAC at the time; Match Group went public in 2015 but wasn't fully spun off from IAC until 2020. The executives with Clarifai investments aren't named in the FTC filing, but the dynamic is clear enough: individuals with decision-making authority at a dating app had a financial interest in an AI company that needed training data. The dating app then provided that data, in apparent violation of its own user-facing promises.
This wasn't an isolated practice. Facial recognition systems developed in the 2010s were trained on datasets that often included scraped social media photos, images pulled from academic research sets, or—evidently—dating app archives. The difference here is that OkCupid's privacy policy explicitly limited photo usage to matchmaking, creating a clear legal hook for the FTC.
The question for dating operators watching this case is whether similar transfers from that era are still lurking. If you ran a dating app in 2013-2016 and had technical partnerships with AI vendors, machine learning researchers, or facial recognition startups, what data changed hands? Was it formally documented? Did it comply with your privacy policy at the time?
What This Means for Trust and Safety Teams
The compliance implications here cut two ways. First, the historical audit question: dating companies with data-sharing relationships predating GDPR and modern privacy frameworks should be reviewing what was transferred, to whom, and under what terms. The FTC's investigation was triggered in 2019, likely by a whistleblower or privacy researcher rather than proactive disclosure.
Second, the current policy question: what does your privacy policy actually allow, and does your operational practice match it? Dating apps today routinely use AI for moderation, fraud detection, recommendation algorithms, and photo verification. Some of those systems are built in-house; others are licensed from vendors. Either way, if member photos feed model training beyond the purposes the policy names, the gap between promise and practice is the same one that caught OkCupid.
The regulatory environment has shifted significantly since 2014. The EU's General Data Protection Regulation established clear rules about consent, purpose limitation, and data transfers. The UK Online Safety Act imposes duties on user-to-user platforms that include dating apps, with Ofcom issuing codes of practice, while data protection itself sits with the ICO under the UK GDPR. The FTC has become more aggressive about privacy enforcement under Chair Lina Khan.
But the fundamental tension hasn't changed. Dating apps collect extraordinarily personal data—photos, location, relationship preferences, private messages. That data is valuable not just for matchmaking but for AI training, advertising targeting, and third-party analytics. Members upload it expecting one use case. The business model often assumes others.
OkCupid users in 2014 had no way of knowing their photos would end up in a commercial facial recognition training set. Members today uploading to Tinder, Hinge, Bumble, or Grindr likely have no clear sense of where their data might go either, despite privacy policies that have grown longer and more detailed since 2014. The policies exist. Whether they communicate actual practice in plain language is a different question.
The FTC settlement with Clarifai offers a tidy resolution on paper: data deleted, models destroyed, case closed. But the penalty-free deal has drawn criticism, since the images were in commercial use for at least five years, and possibly eleven, before deletion. That's a long time for faces to circulate. And this is just the one deal the FTC found.
- Dating platforms should audit historical data-sharing arrangements from 2013-2016 before regulators discover them through whistleblowers or researchers
- The absence of financial penalties means the only deterrent is reputational damage—operators must decide whether they value brand trust over AI partnership opportunities
- Current privacy policies may not adequately communicate to users how their photos and data are used in AI training, vendor relationships, and commercial applications beyond matchmaking
