
DateMyAge's Safety Pledge Arrives Just Ahead of the Regulatory Wave
Last updated: March 16, 2026
- Over 50 per cent of online daters have encountered 'some form of risk', according to DateMyAge figures
- Romance fraud cost UK victims £92.1M in 2023, with the median victim aged 55
- The share of Americans aged 50-64 who have used online dating doubled between 2013 and 2023, reaching 30 per cent
- Match Group spent $72M on trust and safety in 2023, roughly 4 per cent of revenue
DateMyAge has released figures claiming that over half of online daters have encountered 'some form of risk, whether knowingly or not'—a statement so broad it could encompass everything from catfishing to mildly inaccurate profile photos. The company, which targets mature singles, used the statistic to frame its 'ongoing commitment to proactive measures' around member safety, including profile verification, anti-scam tools, and educational content. Strip away the PR language and the timing becomes more revealing.
Mature dating platforms are suddenly keen to telegraph their safety credentials, and it's not because they've just discovered romance fraud exists. The UK Online Safety Act (OSA) became law in 2023, with duties phased in through 2024 and 2025, placing direct liability on platforms for harm that occurs through their services. The EU Digital Services Act (DSA) imposes similar obligations, with fines of up to 6 per cent of global annual turnover. For smaller operators serving a demographic already vulnerable to financial scams, regulatory exposure isn't theoretical—it's an immediate balance sheet risk.
DateMyAge's statement reads less like a breakthrough safety initiative and more like defensive positioning ahead of compliance deadlines. The vague statistic—'some form of risk'—does more to stoke fear than inform, whilst the claimed measures are table stakes that every credible platform should have deployed years ago. What's notable isn't the content of this announcement, but that niche operators now feel compelled to make it at all.
The mature dating segment is entering a trust crisis, and platforms that can't demonstrate meaningful—not performative—safety infrastructure will either spend heavily to catch up or exit the market entirely.
Why mature dating faces a distinct trust problem
The over-50s segment has been the dating industry's quiet growth story since 2020. According to Pew Research, the share of Americans aged 50-64 who have used online dating doubled between 2013 and 2023, reaching 30 per cent. Silver singles divorce, become widowed, and increasingly turn to apps. They also carry higher lifetime savings and lower digital literacy around fraud—a combination that makes them statistically lucrative targets.
Data from UK Finance shows romance fraud cost UK victims £92.1M in 2023, with the median victim aged 55. The US Federal Trade Commission reported median losses of $4,400 per romance scam victim in 2023, up from $2,000 in 2019. These aren't isolated incidents. They're a structural vulnerability in a demographic segment that platforms have aggressively courted without proportionate investment in protective infrastructure.
Mainstream operators have noticed. Match launched Stir (for single parents, skewing older) in 2022 and has steadily shifted its core Match.com positioning toward older users. Bumble introduced its over-50s 'For You' section in 2023. eHarmony has long targeted commitment-minded older singles but now explicitly markets compatibility screening as a safety feature, not just a matching mechanic. Hinge, despite its millennial brand associations, now sees 18 per cent of its UK users aged 45-plus, according to company disclosures.
That competitive encroachment matters for niche operators like DateMyAge, SuccessfulMatch, or SilverSingles. They once owned a segment that mainstream apps ignored. That's no longer true, and their differentiation increasingly rests on claims of superior curation, moderation, and safety—claims that regulators will soon demand they substantiate with evidence, not marketing copy.
What 'ongoing commitment' actually means
DateMyAge's statement references profile verification, 'advanced anti-scam tools', content moderation, and educational resources. None of this is new. Profile verification became dating industry hygiene after Tinder introduced photo verification in 2020. Scam detection relies on third-party fraud vendors—Sift, Jumio, Onfido—that every platform of scale already uses. Educational content is frequently outsourced or consists of generic blog posts that few members read.
The question isn't whether DateMyAge has these features. It's whether they're sufficiently resourced to function at the scale and speed required to meaningfully reduce harm—and whether the company can prove it to Ofcom, the UK regulator enforcing the OSA, or to the European Commission under DSA obligations.
Smaller platforms face an acute dilemma here. Trust and safety operations don't scale linearly. A moderation team capable of reviewing reports within hours, fraud detection models trained on sufficient data to catch evolving scam patterns, and legal teams capable of interfacing with regulators—all of this requires capital expenditure that cuts directly into margins. Match Group (MTCH) spent $72M on trust and safety in 2023 across its portfolio, according to company filings. That's roughly 4 per cent of revenue. For a niche operator generating single-digit millions in annual revenue, proportional spending could render the business unviable.
The alternative is to rely on automation and reactive moderation, which works until it doesn't. One high-profile incident—a romance scam that leads to litigation, or a data breach that exposes member vulnerability—can destroy a niche brand that lacks the legal and PR infrastructure to manage a crisis.
The broader pattern: safety as competitive moat
DateMyAge's announcement fits a wider industry shift. Safety is no longer just a compliance obligation or reputational defence. It's becoming a product differentiator, particularly in segments where trust has eroded fastest.
Feeld introduced ID verification for all members in 2024, positioning it as a feature, not a friction point. Hinge rolled out mandatory in-app safety checks after public reports of harm on the platform. BLK, the Black-focused dating app under Match Group, built community moderation features explicitly designed to reduce harassment of women of colour. These aren't CSR initiatives. They're market responses to member churn driven by poor experience quality.
For mature dating platforms, the stakes are higher. Their members are more likely to experience severe financial harm, and regulators increasingly treat platforms as liable intermediaries, not neutral conduits. The OSA's 'duty of care' framework explicitly includes protecting adults at heightened risk of harm—a category that could plausibly encompass older users targeted by romance scams.
Platforms that can credibly demonstrate proactive harm prevention will have a regulatory advantage. Those that can't will face enforcement action, higher insurance costs, and potentially the loss of payment processor or app store access if fraud rates spike.
The mature dating segment isn't facing an abstract trust crisis. It's entering a period where safety infrastructure determines which operators survive regulatory scrutiny and which exit the market because compliance costs exceed revenue. DateMyAge's statement might be routine PR, but the pressure behind it is anything but routine. Operators that treat safety as a marketing exercise rather than an operational priority will find out quickly how expensive that miscalculation becomes.
- Safety infrastructure is transitioning from compliance burden to competitive differentiator—platforms that can demonstrate proactive harm prevention will secure regulatory advantage and member retention
- Niche mature dating operators face an existential choice: invest in trust and safety at levels that may eliminate profitability, or exit a market increasingly dominated by well-resourced mainstream players
- Watch for consolidation in the mature dating segment as smaller platforms struggle with compliance costs, and for regulatory enforcement actions that will define the threshold for adequate safety measures