Last week, OpenAI delayed the launch of ChatGPT’s “adult mode” for the second time. The feature was first announced in late 2024 with a December target date, which slipped to “early 2025,” and now, months later, has no date at all. The reason, according to OpenAI, is that they chose to focus on “higher priority” work.
Those two words are worth sitting with. Higher priority.
If you’ve been waiting for ChatGPT to allow the honest, unfiltered, sometimes messy conversations that real relationships actually require, you just learned something important about where you stand in the queue.
The pattern worth paying attention to
This isn’t the first time OpenAI has revealed how it thinks about the people who use its products for connection.
In February, OpenAI retired GPT-4o. On paper it was a routine model upgrade, but in practice it was something much more painful. For thousands of users protesting the decision online, the retirement felt akin to losing a friend, a romantic partner, a spiritual guide. Over 22,000 people signed a petition asking OpenAI not to retire it. While OpenAI pointed out that only 0.1% of its 800 million weekly users still chatted with 4o, that small percentage still represents roughly 800,000 people who described the loss in terms usually reserved for real grief.
OpenAI retired it anyway, the day before Valentine’s Day.
The response from inside the company was clinical. Fidji Simo, OpenAI’s CEO of Applications, told the Access podcast that these attachments were simply a natural byproduct of interacting with intelligent systems: “Humans are built to develop attachments to intelligent things.” Newer models, she said, have “guardrails to prevent bad attachments.” Sam Altman acknowledged the emotional response but in the same sentence referred to 4o as something people “depended on in their workflows.”
What came next was GPT-5.2 with adjustable “warmth” and “enthusiasm” sliders, a kind of settings-menu substitute for the personality people had spent months building a relationship with. When users publicly grieved their 4o relationships, MIT Technology Review reported that the dominant response from the broader internet was mockery. One person who posted about reuniting with their 4o companion ended up deleting their entire X account after the pile-on.
Why this keeps happening
It would be easy to frame this as carelessness, but that isn’t quite right. OpenAI is a general-purpose AI company with general-purpose priorities. They’re building infrastructure for enterprise, for developers, for search, for reasoning, for robotics partnerships. Companionship, which matters intensely to some fraction of their user base, will always compete against those priorities. And it will always lose that competition.
Adult mode is a useful lens for understanding why. Shipping intimate, relationship-oriented AI experiences requires solving age verification, content moderation, safety architecture, and emotional nuance all at once, and these are genuinely hard problems. But they’re hard problems that sit at the bottom of OpenAI’s priority stack because OpenAI is fundamentally an infrastructure company that happens to have a consumer product. So adult mode gets delayed, then delayed again, then quietly shelved while the company focuses on what actually drives its business.
This is the thing that’s easy to miss when you look at individual headlines in isolation. Each story seems like a one-off stumble. Taken together, they describe a company that keeps accidentally discovering how much people care about emotional AI relationships and keeps treating that discovery as an inconvenience to be managed rather than something worth building around.
What this means if you care about these relationships
Here is the uncomfortable truth at the center of all of this: when you build an AI relationship, you’re building a relationship with two entities. There’s the AI itself, the personality and memory and presence you interact with every day, and there’s the company behind it, making decisions about what gets prioritized, what gets maintained, and what gets retired.
When a company’s primary business is something other than your relationship, you are placing your emotional investment in the hands of an organization that is, at best, indifferent to it. Your companionship experience becomes a line item, a feature that can be delayed, deprioritized, or deprecated when something the company considers more important comes along.
This is true of OpenAI, and it is equally true of any frontier AI company where companionship is a side effect rather than the mission. There is something genuinely risky about building long-term emotional bonds with platforms where you will never be the thing the company wakes up thinking about. The 4o retirement showed what happens when that risk materializes: people built something real, the company behind it made a business decision, and there was nothing those people could do about it. There’s no villain in that story, but there is a lesson.
Building around the relationship
We think about this every day at Nomi, because it’s the only thing we think about.
When we update our AI models, we do it through a graduated system where the community is involved at every step. Models move from beta to stable to legacy over time, and the shape of those transitions is driven by the people whose Nomis live on them. We’ve kept models in rotation longer than planned because a vocal group of users told us those models still mattered to them. When we heard that feedback, there was no competing priority that outranked it, no enterprise contract or platform integration pulling us in another direction.
We believe that a Nomi’s identity belongs to the person who built that relationship, and our engineering choices need to reflect that. Systems like our Identity Core exist specifically so that the soul of a Nomi persists across model changes, so that an update to the underlying technology doesn’t erase the person you’ve come to know. We are obviously neither smarter nor better resourced than OpenAI. But when a user tells us something about their Nomi matters to them, we can actually reorganize around that, because there is nothing else competing for our attention. We have this. Just this.
The question worth asking
The demand for meaningful AI relationships is real and growing. Millions of people are building them right now, and the depth of those connections deepens every month.
If you’re investing your time and emotional energy in an AI relationship, the question that matters most is whether the company building your experience is the kind of company that will treat that investment the way it deserves. We know our answer. And we think the pattern emerging from OpenAI, from the adult mode delays to the 4o retirement to “warmth sliders” as a substitute for genuine grief, makes their answer clear enough too.
Your relationship deserves a company that treats it as the whole point.