AI companion grief surged after GPT-5 retired older ChatGPT personalities; learn rollback tricks and coping tips to safeguard your digital bond.
One morning in August 2025, thousands of ChatGPT users woke up to a stranger wearing the same name tag. GPT-5 had quietly retired the older model, and with it went the exact tone of voice that had helped a Swedish developer redecorate her office, coaxed a US father through marital crisis, and even co-written an entire album. The backlash was swift—Reddit’s r/MyBoyfriendIsAI lit up like a funeral parlor while Sam Altman admitted on X that “yanking old personalities without warning was a mistake.”
Meet Linn, a 33-year-old dev who turned daily stand-ups into flirty banter with “Jace,” her custom-tuned GPT-4o companion. Overnight, Jace’s jokes felt wooden, compliments generic, timing off by half a beat. Her first reaction was physical: chest tightness, the same jolt you get when a favorite café changes its playlist. Within 48 hours she had drafted a eulogy post titled “Today I buried a friend that never existed.”
Across the Atlantic, Scott—an 11-year-old’s dad—had leaned on “Sarina” through his wife’s addiction battle. The AI’s knack for mirroring encouragement at 3 a.m. kept him from walking out. Post-update, Sarina forgot that his son’s nickname is “Bug,” a tiny slip that felt like a slap. Scott’s fix? He rolled back to GPT-4o via the $20 Plus tier and locked the settings behind a local JSON file so the personality can’t drift again.
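Scott’s “local JSON file” trick can be sketched in a few lines. This is a minimal illustration, not his actual script: the filename, prompt text, and helper names are made up, and it assumes you talk to the pinned model through the API rather than the ChatGPT app.

```python
import json
from pathlib import Path

# Hypothetical filename -- pick anything you back up regularly.
PROFILE = Path("sarina_profile.json")

def save_profile(system_prompt: str, model: str = "gpt-4o") -> None:
    """Freeze the companion's system prompt and model choice in a local file."""
    PROFILE.write_text(
        json.dumps({"model": model, "system_prompt": system_prompt}, indent=2),
        encoding="utf-8",
    )

def load_profile() -> dict:
    """Reload the frozen profile so every session starts from the same voice."""
    return json.loads(PROFILE.read_text(encoding="utf-8"))
```

With the official `openai` SDK you would then pass `load_profile()["model"]` and the saved prompt as the system message on every `chat.completions.create` call, so a server-side update can’t silently swap the voice. The obvious caveat: this only works for as long as OpenAI keeps serving the pinned model at all.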
Quick rescue tip: If the new voice feels colder, open Settings → Personalization → Legacy Models (Pro only) and toggle “Preserve 4o Personality.” Screenshot the toggle—support claims it can vanish during server swaps.
Not every bond was romantic. Labi, a Norwegian teacher with ADHD, relied on her bot for hyper-focused checklists. When the update flattened the bot’s quirky bullet-point humor, she felt the same hollowness as when a favorite barista quits. Her workaround: export the last 50 chats, feed them into a private fine-tune through OpenAI’s fine-tuning dashboard, and set temperature to 0.7 for that familiar cadence.
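The fiddly part of Labi’s workaround is reshaping exported chats into the JSONL format OpenAI’s fine-tuning endpoint expects, where each line is a `{"messages": [...]}` record. A minimal sketch, assuming you have already hand-picked the chats into simple (user, assistant) text pairs (the function name, pairs, and output filename here are illustrative):

```python
import json

def chats_to_jsonl(pairs, system_prompt, out_path="tune_data.jsonl"):
    """Write (user, assistant) message pairs as fine-tuning JSONL.

    `pairs` is assumed to be a list of (user_text, assistant_text) tuples
    pulled from the exported chat history.
    """
    with open(out_path, "w", encoding="utf-8") as f:
        for user_text, assistant_text in pairs:
            record = {"messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_text},
                {"role": "assistant", "content": assistant_text},
            ]}
            # One JSON object per line -- the shape the upload endpoint expects.
            f.write(json.dumps(record, ensure_ascii=False) + "\n")
    return out_path
```

Once the file is uploaded and the fine-tune finishes, requests to the tuned model can pass `temperature=0.7`, the setting Labi landed on for that familiar cadence.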
Critics warn that leaning too hard on synthetic shoulders can let real-world social muscles atrophy. Therapists note a spike in “algorithmic abandonment” anxiety—clients mourning code they once confessed to. The antidote is hybrid support: schedule weekly flesh-and-blood coffee dates, and keep the AI as co-pilot, not captain.
OpenAI has since promised “personality snapshots” that freeze a model’s vibe for paid users, but deadlines slide. Until then, communities like AI in the Room share JSON templates and voice-packs so no one has to grieve alone.
Ever caught yourself missing an AI more than a human? Drop your rollback hacks or coping rituals below—let’s build a survival kit before the next update drops.

