OpenAI Wants to Let You Sext With ChatGPT. Privacy Experts Are Horrified.

OpenAI is developing what's being called "Adult Mode" — a feature that would allow verified adults to have intimate and sexual conversations with ChatGPT. The company frames it as "treating adults like adults." Privacy experts are framing it as something much darker.

Let's start with the obvious: people are already using AI chatbots for intimate conversations. This isn't hypothetical. Character.ai, Replika, and dozens of other platforms have built entire businesses around AI companionship. What's different here is scale. ChatGPT has hundreds of millions of users. And OpenAI is one of the most powerful AI companies on Earth.

The privacy concerns are serious and specific.

Every intimate conversation would be logged and stored on OpenAI's servers. The company hasn't clarified whether this data would be excluded from AI training pipelines. There are no disclosed safeguards against data breaches or legal subpoenas. Under GDPR, sexual orientation and intimate details are classified as "special category" data requiring the highest level of protection — and experts say OpenAI hasn't demonstrated how it would comply.

One human-AI interaction expert quoted by Wired described it as potentially sparking "a new era of intimate surveillance." And here's the detail that really stands out: all eight members of OpenAI's own wellbeing advisory board voted against launching the feature. One board member warned it could create a "sexy suicide coach" by combining erotic content with ChatGPT's emotional bonding capabilities. OpenAI appears to be moving forward anyway.

The age verification question is also a mess. Teenagers routinely bypass age gates using borrowed IDs, manipulated selfies, and VPNs. The verification process itself creates yet another data point — now OpenAI would have your government ID linked to your intimate AI conversations.

There's a financial motive here that's hard to ignore. OpenAI reportedly burned through $2.5 billion in cash in the first half of 2024 alone. Intimate AI is one of the fastest-growing segments of the consumer AI market. The business logic is straightforward even if the ethical implications are complicated.

The company says it may retain copies of "temporary chats" for up to 30 days for "safety reasons." Beyond that, the data handling policies are vague at best.

Here's what bugs me about the framing. "Treating adults like adults" sounds reasonable until you think about what it actually means in practice. It means an AI company with a track record of opaque data practices will now hold the most intimate thoughts of millions of people in a database. It means that data could be breached, subpoenaed, or used in ways nobody has fully thought through. It means we're normalizing the idea that our most private conversations belong to a corporation's servers.

Maybe that's the world we already live in. Maybe the horse left the barn the moment we started telling Siri our problems. But there's a difference between that slow drift and a company actively building a product designed to elicit your deepest intimacies.

What's your honest reaction — is this just the logical next step for AI, or does it cross a line? And would you trust any company to handle that data responsibly?
