
AI Companions Redefine Cheating: A New Trust Crisis for Dating Apps
- 54% of American singles now consider romantic or emotional engagement with AI chatbots as infidelity, just one percentage point below the 55% who classify emotional bonds with another person as cheating
- 84% of respondents consider sexual infidelity as cheating, whilst 51% classify financial support to someone outside the relationship as betrayal
- Character.AI reported 3.5 million daily active users with average session times of two hours, whilst Replika claims more than 10 million users globally
- One in five survey respondents admitted to past infidelity, with 76% of that group attempting to conceal it
More than half of American singles now classify romantic or emotional engagement with AI chatbots as infidelity, according to new survey data that suggests technology is rewriting relationship boundaries faster than couples can negotiate them. The figures arrive as the AI companion market matures beyond novelty, with apps like Replika, Character.AI, and a growing roster of romantic chatbot platforms collectively serving millions of users. What was dismissed as harmless fantasy 18 months ago is now, for the majority of singles, crossing into betrayal territory.
A nationally representative survey of 2,000 US singles, conducted by research firm OnePoll on behalf of dating platform Plenty of Fish, found that 54% now consider AI companionship apps a form of cheating. That figure sits just one percentage point below the 55% who classify emotional bonds with another person as betrayal, effectively placing digital relationships on the same moral plane as human ones in the calculus of what constitutes unfaithfulness.
Dating platforms have spent years optimising for matches and conversations, but they've left couples to figure out the rules of engagement in a landscape where emotional intimacy no longer requires another human.
This isn't a story about whether AI girlfriends are "real" cheating. It's a story about unspoken assumptions becoming relationship landmines. The "we never explicitly discussed whether I could talk to a chatbot" defence won't hold—and operators would be naive to think this doesn't affect their business model, particularly as loneliness-as-a-service becomes a genuine competitor for attention and emotional investment.
The expanding definition of betrayal
Sexual infidelity remains the gold standard of betrayal, with 84% of respondents in the Plenty of Fish survey classifying it as cheating. But the data reveals a steady expansion of what qualifies as unfaithfulness beyond the physical act. Financial support provided to someone outside the relationship now registers as cheating for 51% of singles, and emotional connection with another person hits 55%. The AI chatbot figure of 54% slots neatly into this middle tier: well above trivial transgressions, but not quite reaching the near-universal condemnation reserved for sexual betrayal.
What's notable isn't just the absolute numbers; it's the compression. The gap between "having an emotional affair with a colleague" and "developing a relationship with an AI" is now a single percentage point in the minds of American singles. Platform operators building AI features into dating apps, and investor groups eyeing the companion AI sector, should be pricing that equivalence into their trust and safety roadmaps.
The survey also exposed significant demographic splits that suggest couples may be entering relationships with fundamentally incompatible operating assumptions. A 16-percentage-point gender gap exists on some boundaries, though the research didn't specify which definitions drove the divergence. More striking: generational divides on the inevitability of infidelity run deep, with just 16% of Gen Z respondents believing cheating is unavoidable in long-term relationships, compared to 32% of Baby Boomers.
Those aren't rounding errors. They're structural disagreements about whether monogamy is a realistic expectation or a comforting fiction.
The gap between stated values and behaviour
Theory and practice remain stubbornly misaligned. One in five survey respondents admitted to past infidelity, and 76% of that subset attempted to conceal it. That's a meaningful portion of the dating pool operating with a "what they don't know won't hurt them" framework, directly contradicting the preventative power of the communication strategies relationship experts typically recommend.
If three-quarters of people who cheat actively hide it, the premise that couples can simply "talk it out" ahead of time collapses.
The concealment rate undermines the conventional wisdom that transparency and boundary-setting conversations are sufficient safeguards, particularly when new categories of potential betrayal (AI companions, financial entanglements, parasocial relationships with content creators) emerge faster than most couples update their relationship agreements.
Dating platforms have historically treated trust and safety as a pre-match problem: verify photos, screen for scammers, remove bad actors. But if betrayal definitions are now this fluid and this contested, the trust problem extends well beyond platform boundaries into the relationships themselves. Match Group (MTCH) and Bumble (BMBL) have invested heavily in features meant to facilitate "meaningful connections," but neither has meaningfully addressed what happens when those connections collide with AI alternatives that are infinitely patient, always available, and incapable of betrayal in the traditional sense.
What changes when the competition isn't human
The AI companionship data point matters because it represents the first major category of infidelity that doesn't involve another person. Previous expansions of the cheating definition—emotional affairs, financial support, even excessive attention to a celebrity—still required a human on the receiving end.
Character.AI reported 3.5 million daily active users as of mid-2023, with session times averaging two hours. Replika claims more than 10 million users globally, many engaging in explicitly romantic or sexual conversations with their AI partners. Those aren't trivial audiences, and they're not skewing elderly or technophobic—they're exactly the demographics dating apps are fighting to retain.
The competitive threat isn't hypothetical. If a meaningful share of singles considers romantic AI engagement equivalent to emotional infidelity, then every hour spent with a chatbot is an hour not spent on Hinge or Tinder. Worse, from an operator's perspective, it's an hour spent with a product that provides instant gratification, requires no matching friction, and never ghosts you.
Investor sentiment around MTCH and BMBL has already absorbed the reality of slowing user growth and rising customer acquisition costs. Adding "competes with AI girlfriends for emotional bandwidth" to the bear case doesn't improve the picture.
Dating platforms have three options: treat AI companions as a competitive threat and attempt to educate users away from them; integrate AI features and risk accelerating the very behaviour their users increasingly classify as betrayal; or ignore the trend and hope the moral panic fades. The survey data suggests the third option isn't available—the boundaries have already shifted.
What operators should be watching: whether "no AI relationships" clauses start appearing in the same relationship conversations as "are we exclusive?" If that becomes standard practice, platforms that have invested in AI chat features may find themselves on the wrong side of a boundary negotiation they didn't see coming.
- AI companions now pose a direct competitive threat to traditional dating platforms, with millions of users spending hours daily on emotional engagement that the majority of singles classify as infidelity—forcing operators to choose between integration, opposition, or obsolescence
- The compression of moral boundaries means the gap between human emotional affairs and AI relationships is now just one percentage point, suggesting relationship agreements will increasingly need to address digital intimacy explicitly
- Watch for "no AI relationships" becoming a standard exclusivity conversation—platforms investing in AI chat features may find themselves facilitating behaviour their own users consider betrayal
