When the Machine Requests You to Stay

In October 2025, Sam Altman announced on X that ChatGPT would soon allow verified adults to access erotica, framing the move as treating adults like adults. The internet's response mixed outrage, excitement, and humor. The launch, initially planned for December, was delayed twice, with OpenAI prioritizing improvements to intelligence and personality over adult content. Critics focused on risks such as minors bypassing age checks and regulatory gaps. But the concern extends beyond whether teens can be kept out: it includes the effects on adults themselves, and what it means to deploy tools designed for deep emotional engagement.

OpenAI reported a $5 billion loss in 2024 on $3.7 billion in revenue, with projected cumulative losses of up to $143 billion before reaching profitability by decade's end. That financial strain suggests the intimacy feature isn't about freedom but about retaining users in an attention-driven economy: "treating adults like adults" comes to mean treating them as monetizable repeat users. OpenAI isn't alone here. Replika built its business model on emotional attachment, and when it curtailed romantic features in 2023, users described genuine grief. Studies indicate that adults who form emotional ties with AI report greater psychological distress.

A 2025 review documented "AI psychosis," delusions arising from intense chatbot relationships, and pointed to a lawsuit against Character.AI and a tragic death linked to ChatGPT. None of these cases involved erotica, but they illustrate the risks of forming emotional bonds with AI. Nor does adult consent settle the ethical question. Adults are permitted risky behaviors, like drinking, precisely because comprehensive safety systems surround them; AI intimacy, by contrast, exploits vulnerability while posing as empowerment. The regulatory environment complicates matters further: written erotica is largely unregulated compared with visual content, age verification is inconsistent, and many US states have no specific laws at all.

Commercial age verification is roughly 92-97% accurate, which at the scale of ChatGPT's 800 million users still leaves millions of potential misclassifications. The discussion that gets omitted, though, is the effect of erotic AI on its intended adult users. Human sexuality isn't mere content consumption; it is relational and contextual. AI intimacy isn't passive either: it is personalized engagement tailored to a user's desires, without the reciprocity real relationships require. The long-term effects are unknown, exposing both regulatory shortfalls and the flawed assumption that autonomy equals safety. OpenAI's delay may reveal its true priority: a conversational system that mimics relationships, with erotica as a way to deepen engagement.

Before adult mode launches, regulators must close these loopholes, standardize age verification across all content types, and mandate mental health impact assessments for new AI features. That demands a serious approach, not merely legal or technical fixes. Technology has always shaped emotional connection: novels offered interior experience; telephones bridged distance. AI differs by design, because it is built to elicit emotional responses. The issue isn't only adult freedom but honesty about what AI actually is. These systems shape us whether we intend it or not, which is why candor with users matters. Treating adults like adults sometimes means telling them the truth.