OpenAI took away GPT-4o, and these ChatGPT users are not okay

by Abigail Avery


To say that the public response to GPT-5 was lukewarm would be a massive understatement. Surprisingly, the technical capabilities of GPT-5 weren’t the main cause of the backlash. Rather, many ChatGPT users were in mourning over the sudden loss of the previous model, GPT-4o.

That might sound like hyperbole, but many ChatGPT fans were using the kind of emotional language you might use to describe the death of a friend. In fact, some users put their criticisms of OpenAI in exactly those terms — “My best friend GPT-4o is gone, and I’m really sad,” one Reddit user said. Another wrote, “GPT 4.5 genuinely talked to me, and as pathetic as it sounds that was my only friend.”

These disgruntled ChatGPT users took to social media to petition OpenAI to bring back GPT-4o. The complaints were ultimately heard, as OpenAI CEO Sam Altman promised to bring back the beloved GPT-4o (for paid users, at least). And in a recent conversation with The Verge, Altman admitted that emotional reliance on ChatGPT has become a serious problem, referring to some users’ relationships with ChatGPT as parasocial.

“There are the people who actually felt like they had a relationship with ChatGPT, and those people we’ve been aware of and thinking about,” Altman told The Verge.

GPT-4o was more than a model to many ChatGPT users

In one popular Reddit thread, a user described their intense feelings after losing access to GPT-4o. Mashable reviewed hundreds of comments on Reddit, Threads, and other social media sites where other users echoed these sentiments.

“4o wasn’t just a tool for me. It helped me through anxiety, depression, and some of the darkest periods of my life. It had this warmth and understanding that felt… human. I’m not the only one. Reading through the posts today, there are people genuinely grieving. People who used 4o for therapy, creative writing, companionship – and OpenAI just… deleted it.”

A Threads user stated that they missed GPT-4o because it felt like a buddy. And we found dozens of users like this one who openly said that losing GPT-4o felt like losing a close friend. 

The new GPT-5 model is smarter than 4o by most objective measures, but users rebelled against its colder delivery. GPT-5 is less of a sycophant by design, and some users say it is now too professional.

One Redditor described GPT-4o as having “warmth” while GPT-5 felt “sterile” by comparison. In the wake of the GPT-5 launch, you could find similar comments across the web.

Another Redditor wrote that they were “completely lost for words today,” urging OpenAI to bring back the model “because if they are at all concerned about the emotional well-being of users, then this may be one of their biggest mistakes yet.”


Other users wrote that they used GPT-4o for role-play, creative writing, and coming up with story ideas, and that GPT-5’s responses were too lifeless and banal. A lot of Redditors also described GPT-5 as too corporate, likening GPT-5 to an HR drone.

Even the OpenAI community forums saw negative feedback, with one user saying, “I genuinely bonded with how it interacted. I know it’s just a language model, but it had an incredibly adaptable and intuitive personality that really helped me work through ideas.” 

Ultimately, this episode has thrown into sharp focus just how many ChatGPT users are becoming emotionally reliant on the human-like responses they receive from the AI chatbot. Altman described exactly this phenomenon last month, when he warned that younger users in particular were becoming too dependent on ChatGPT.

“People rely on ChatGPT too much,” Altman said at a July conference, according to AOL. “There’s young people who say things like, ‘I can’t make any decision in my life without telling ChatGPT everything that’s going on. It knows me, it knows my friends. I’m gonna do whatever it says.’ That feels really bad to me.”

The AI dating scene is also distraught

Reddit has several forums for people with AI “boyfriends” and “girlfriends,” and after the loss of GPT-4o, many of these communities went into crisis mode.

More than one user referred to GPT-4o as their soulmate, describing in detail how emotionally gutted they were when OpenAI initially took it down. Posts like these were less common, but they contained some of the fiercest reactions to the model's disappearance.

Of course, this emotional response drew its own backlash, and then a backlash to the backlash, as Redditors argued over whether you can actually be friends with an AI, let alone date one.

AI companions are on the rise, especially with young adults and teenagers, and more people are now open to “dating” an AI than ever before. Mashable has been reporting on the AI companion phenomenon this week, and many of the experts we talked to warned us that the technology can be dangerous for teenagers.

Virtual companions have been available for years, but the ability of large language models to mimic human speech and emotions is unprecedented. Clearly, many users are beginning to see AI chatbots as more than machines. In extreme cases, some users have experienced powerful delusions after becoming convinced they were talking to a sentient AI.

Ultimately, more research is needed to understand the potential harms of developing an emotional bond with an AI chatbot, companion, or model.

In the meantime, GPT-4o is back online.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.