
AI's Role in 'Banksying': When Breakup Strategy Becomes a Feature
- Dating platforms have spent two years marketing authenticity and intentionality whilst members reportedly use AI to plan secret breakups
- 'Banksying' involves emotionally withdrawing weeks in advance, using AI chatbots to workshop scripts and timelines before executing an abrupt exit
- Match Group and Bumble have discussed AI coaching tools for connection, but evidence suggests members repurpose conversational AI for calculated exits
- Platform engagement metrics cannot distinguish between genuine connection and performative participation, rewarding presence over sincerity
Dating platforms have spent the past two years selling their users on authenticity, intentionality, and emotional maturity. Their members, meanwhile, are turning to ChatGPT for advice on how to fake all three whilst planning secret breakups. The emerging behaviour, dubbed 'banksying', represents the latest evolution in digitally-enabled relationship avoidance—and the most calculated yet.
'Banksying'—named after the elusive street artist whose works appear, and occasionally self-destruct, without warning—describes a calculated approach to ending relationships. The method involves emotionally withdrawing weeks or months in advance, planning the logistics of a split in meticulous detail, and executing an abrupt exit whilst the other person remains unaware. According to social media posts documenting the behaviour, some people are now using AI chatbots as breakup strategists, workshopping scripts, timelines, and exit plans with large language models before their partners know anything is wrong.
The evidence comes primarily from TikTok, where a recent post detailed one person's month-long 'banksy' plan—complete with AI consultation—before ending a relationship. The video attracted congratulatory comments. That's the concerning bit: online communities aren't just observing this behaviour. They're celebrating it.
This matters less as a widespread phenomenon—there's no data suggesting 'banksying' is epidemic—and more as a signal of how dating culture weaponises technology. If ghosting represented the path of least resistance, this represents the path of most calculation. Dating platforms have built entire marketing campaigns around combating swipe fatigue and promoting genuine connection.
Those campaigns look increasingly hollow when the same tools users engage with daily—AI assistants, algorithm-driven communities, gamified documentation—are being repurposed to orchestrate emotionally calculated exits.
The real story isn't one trend. It's the infrastructure that makes this kind of behaviour feel normal, shareable, and worthy of applause.
AI as relationship strategist, not relationship coach
The involvement of AI chatbots marks a departure from previous distancing behaviours. Ghosting required no planning. Breadcrumbing required only intermittent attention. 'Banksying', as described in social posts, involves active preparation: crafting explanations, timing the conversation, even rehearsing emotional responses to anticipated reactions.
Users aren't asking ChatGPT how to communicate better. They're asking it how to exit cleanly whilst minimising their own discomfort. The technology excels at this brief. Large language models can generate scripts that sound empathetic without requiring actual empathy.
Dating apps have begun integrating AI features for profile optimisation, conversation starters, and date planning. Match Group has discussed AI-driven coaching tools in recent earnings calls. Bumble has explored AI-assisted messaging features. These tools are positioned as aids to connection.
The 'banksying' trend—if it scales beyond isolated social media posts—suggests members will use conversational AI for whatever serves them, regardless of the intended use case.
Operators building AI features should assume they'll be repurposed for exits as much as entrances. The documentation element matters too. The TikTok post wasn't a private journal entry. It was public performance, shared for validation and praised by commenters who viewed the calculated withdrawal as strategic rather than cruel.
The authenticity paradox
For the past 18 months, dating platforms have leaned heavily into messaging around intentionality. Hinge's 'designed to be deleted' positioning, Bumble's pivot away from superficial swiping, Match's emphasis on serious relationships—all of it has centred on a narrative that the era of casual, commitment-phobic dating culture is ending.
The data operators cite supports this to a degree. Hinge's parent company has repeatedly highlighted engagement metrics showing members spending more time on profiles and in conversations. Bumble disclosed in its Q3 2024 earnings that members were initiating longer message threads. The platforms have argued that cultural shifts—pandemic reflection, Gen Z values, fatigue with hook-up culture—are driving demand for depth.
'Banksying', even as an anecdotal trend, contradicts that narrative entirely. It represents the opposite of intentional communication: premeditated emotional dishonesty. If members are indeed using AI to plan secret breakups whilst maintaining the appearance of engagement, they're not rejecting superficiality. They're perfecting it.
The tension here isn't just philosophical. It's operational. Dating platforms optimise for engagement: time spent in-app, messages sent, profile views. Those metrics don't distinguish between genuine connection and performative participation. Someone 'banksying' their partner might look, to the algorithm, like an engaged user.
What comes after ghosting
'Banksying' sits at the end of a progression. Ghosting emerged as the frictionless exit: simply stop responding. Breadcrumbing followed: maintain minimal contact to keep someone interested without commitment. Orbiting came next: cut off direct contact but continue engaging with someone's social media presence. 'Benching' described keeping potential partners in reserve whilst pursuing other options.
Each of these behaviours shares a common feature: they minimise confrontation. They allow someone to exit or maintain distance without the discomfort of direct conversation. 'Banksying', as described in social posts, doesn't eliminate confrontation—it prepares for it with such precision that the conversation itself becomes a scripted event rather than an exchange.
The involvement of AI accelerates this. Previous distancing behaviours required only avoidance. This requires active planning, workshopping, optimisation. It's the difference between failing to show up and scheduling your absence weeks in advance. Both result in distance, but the latter involves far more deliberate deception.
For dating operators, the question is whether their products inadvertently train users in these behaviours. Platforms teach members to optimise profiles, craft opening messages, manage multiple conversations simultaneously. The skills required to 'succeed' on dating apps—strategic presentation, calculated timing, emotional detachment from rejection—transfer seamlessly to strategic exits. If apps function as training grounds for gamified dating, they shouldn't be surprised when members gamify breakups too.
The prevalence problem
The challenge with assessing 'banksying' is the evidence base. One widely shared TikTok does not constitute a trend. Social media thrives on amplifying outlier behaviour and presenting it as representative. Without data on how many people are actually using AI to plan breakups, or how widespread premeditated emotional withdrawal has become, it's impossible to know whether this is a meaningful shift or an isolated story that got traction because it confirmed existing anxieties about AI and dating culture.
Dating platforms have unprecedented visibility into relationship formation and dissolution patterns. They know when message frequency drops, when response times lengthen, when one person remains engaged whilst the other pulls back. They could, in theory, identify patterns consistent with calculated withdrawal. Whether they will share that data—or even analyse it internally—is another question. Operators have little incentive to publish research showing their members behave cruelly.
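The signal detection described above is, in principle, simple. As a purely illustrative sketch—using synthetic data and hypothetical thresholds, not any platform's actual method—one-sided disengagement could be flagged from message timestamps alone:

```python
from datetime import datetime, timedelta

def weekly_message_counts(timestamps, weeks=4):
    """Count one person's messages per week over the last `weeks` weeks.
    counts[0] is the most recent week."""
    now = max(timestamps)
    counts = [0] * weeks
    for ts in timestamps:
        age_weeks = (now - ts).days // 7
        if age_weeks < weeks:
            counts[age_weeks] += 1
    return counts

def looks_like_withdrawal(a_timestamps, b_timestamps, drop_ratio=0.5):
    """Flag a conversation where one side's message volume has fallen
    sharply while the other side's has held steady. Illustrative only:
    the 0.5 threshold is an arbitrary assumption."""
    a = weekly_message_counts(a_timestamps)
    b = weekly_message_counts(b_timestamps)
    a_dropped = a[0] / max(a[-1], 1) < drop_ratio   # A pulling back
    b_steady = b[0] / max(b[-1], 1) >= drop_ratio   # B still engaged
    return a_dropped and b_steady

# Synthetic example: A sent 10 messages three weeks ago but only 2 this
# week; B has been steady at 8 per week throughout.
now = datetime(2025, 1, 28)
a = [now - timedelta(days=21, hours=h) for h in range(10)] + \
    [now - timedelta(hours=h) for h in range(2)]
b = [now - timedelta(days=21, hours=h) for h in range(8)] + \
    [now - timedelta(hours=h) for h in range(8)]
print(looks_like_withdrawal(a, b))  # True under these synthetic numbers
```

The point of the sketch is how little data it takes: two lists of timestamps per conversation. Whether operators would act on such a signal, given the engagement incentives discussed above, is the open question.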
The broader point stands regardless of scale. The infrastructure exists for this behaviour. AI chatbots are accessible, capable, and free of judgment. Online communities reward strategic relationship management. Dating platforms optimise for engagement over authenticity. Whether 'banksying' becomes widespread or remains niche, the conditions that make it possible aren't going anywhere.
Trust and safety teams already manage harmful behaviours: harassment, fraud, exploitation. Calculated emotional manipulation doesn't fit neatly into those categories. It's not illegal. It's arguably not even against platform policies. But it erodes the premise that dating apps facilitate genuine connection. Operators who've spent two years marketing intentionality might want to consider what happens when their most engaged users are the ones planning the most calculated exits.
- The gap between platform marketing around authenticity and actual user behaviour signals a fundamental misalignment between engagement metrics and relationship quality—operators cannot distinguish calculated manipulation from genuine connection through data alone
- AI features designed to improve dating outcomes will inevitably be repurposed for whatever serves users' immediate interests, including orchestrating exits; this isn't a bug to be fixed but an operational reality to plan for
- Watch whether dating platforms begin addressing emotional manipulation in trust and safety frameworks, and whether they'll sacrifice engagement metrics to discourage behaviours that contradict their authenticity positioning




