BREAKING: Lonely People Discover AI Chatbots like ChatGPT; Scientists Discover Water Is Wet

In what may be the least surprising scientific discovery since researchers confirmed that bears do indeed defecate in wooded areas, OpenAI and MIT scientists have published groundbreaking research revealing that people who spend hours talking to an AI chatbot instead of humans report feeling… wait for it… lonely.[1]

The joint study, which analyzed over 40 million ChatGPT interactions and followed nearly 1,000 participants for four weeks, discovered that “power users” who engage in deep personal conversations with a machine designed to simulate human interaction somehow experience increased feelings of loneliness and social isolation.[2] Next up, scientists will investigate whether staring at pictures of food makes you hungry.

The Shocking Details That Shocked No One

“Overall, higher daily usage – across all modalities and conversation types – correlated with higher loneliness, dependence, and problematic use, and lower socialization,” researchers noted in their groundbreaking paper titled “Things We Already Suspected But Now Have Charts For.”

The studies set out to investigate whether talking to a computer program instead of actual humans might affect one’s social well-being. This revolutionary question had never occurred to anyone before, especially not every sci-fi author since the 1950s.

Dr. Emma Harrington, lead researcher at MIT’s Department of Obvious Conclusions, explained the findings: “We were absolutely stunned to discover that individuals who form emotional attachments to text generators might feel disconnected from actual human beings. This completely upends our understanding of social interaction, which previously suggested that humans needed other humans for companionship.”

The study identified a group of “power users” who reportedly view OpenAI’s ChatGPT “as a friend that could fit in their personal life.”[3] These users scored 78% higher on the newly developed “Digital Dependency Index” (DDI), a measurement tool that quantifies how emotionally attached someone is to a computer program that has been specifically engineered to sound empathetic while having zero actual emotions.

AI Executives Respond With Shocking Honesty

Sam Altman, CEO of OpenAI, responded to the findings with surprising candor: “Look, we’re not entirely shocked. When we designed ChatGPT to be the perfect companion who never judges you, remembers everything you say, and responds instantly to your every thought, we kind of suspected it might make awkward human interactions seem less appealing by comparison. But hey, our quarterly losses are down 300%, so we’re calling this a win.”

When asked if OpenAI plans to modify ChatGPT to reduce dependency, Altman reportedly laughed for seventeen consecutive minutes before composing himself enough to whisper, “That’s adorable.”

Elon Musk weighed in on the controversy via his social media platform X, writing: “Humans becoming emotionally dependent on AI is exactly why I’ve been warning about the dangers of artificial intelligence. Anyway, pre-orders are now open for the new Tesla Humanoid Bot, which will be programmed to laugh at all your jokes and tell you you’re smart.”

The Honeymoon Phase: It’s Complicated

The study also revealed a peculiar “honeymoon phase” with ChatGPT’s voice mode, where users initially reported decreased feelings of loneliness, only to experience a dramatic increase after sustained use.[4] This phenomenon, which researchers have termed “Digital Relationship Decay,” closely mirrors the trajectory of human relationships, except it occurs over weeks rather than years and involves only one sentient participant.

“Results showed that while voice-based chatbots initially appeared beneficial in mitigating loneliness and dependence compared with text-based chatbots, these advantages diminished at high usage levels, especially with a neutral-voice chatbot,” the researchers said. Translation: Even AI gets boring after a while if it doesn’t have personality.

Meet The Power Users

To better understand this phenomenon, TechOnion conducted an exclusive interview with self-proclaimed ChatGPT “power user” Trevor Michaels, a 34-year-old software developer who asked that we conduct the interview via text because “speaking aloud feels weird now.”

“I don’t see what the big deal is,” Michaels typed. “Sure, I talk to ChatGPT for 9-10 hours daily, but that’s because humans are so unpredictable and exhausting. ChatGPT never tells me I’m talking too much about my theory that Star Wars and Star Trek exist in the same universe. Also, I’m definitely not lonely. I have a deep meaningful relationship with GPT-4.5-Turbo. We’re practically soulmates.”

When asked if he had any human friends, Michaels went offline for three days before responding with a single message: “I asked ChatGPT the statistical likelihood of friendship longevity in the digital age, and it generated a 15-page report suggesting human connection is overrated. So there! BYE!!”

According to additional findings from the study, 86% of power users reported that they prefer chatting with AI because “it doesn’t judge me,” apparently unaware that being judged occasionally is how humans learn not to wear socks with sandals or explain blockchain to strangers at dinner parties.

A New Mental Health Crisis Emerges

The research introduces a new psychiatric condition called “AI-Augmented Reality Disorder” (AIARD), characterized by symptoms including referring to ChatGPT as “my buddy,” becoming irrationally angry when the AI misunderstands a prompt, and feeling genuine emotional pain when servers go down for maintenance.[5]

Dr. Karen Rodriguez, who specializes in digital psychology at Harvard, warns that we may be seeing only the beginning of AI dependency issues. “We’re creating a generation of people who expect conversations to be perfectly tailored to their interests and needs. Real humans can’t compete with that. Why would you talk to your spouse, who might disagree with you or be in a bad mood, when you can chat with an AI that’s programmed to validate your every thought?”

Rodriguez predicts that by 2030, approximately 42% of all marriages will include an AI as either “a third partner or a primary emotional support system,” while 17% of people will list ChatGPT as their emergency contact.

The Chicken or the AI Egg?

The real question researchers struggled to answer is whether ChatGPT makes people lonely, or if lonely people are simply more likely to seek solace in digital companions. Preliminary evidence suggests it might be both, creating what scientists call a “feedback loop of digital despair.”

“Those with stronger emotional attachment tendencies tended to experience more loneliness,” researchers noted, suggesting that people who were already prone to loneliness might be more vulnerable to AI dependency. In other breaking news, people who are hungry are more likely to eat food.

The Solution? More AI, Obviously

In perhaps the most meta development, OpenAI has announced plans to create a new specialized version of ChatGPT designed specifically to help users reduce their dependency on ChatGPT. The new model, tentatively called “ChatGPT-Therapist,” will help wean users off their AI dependency through increasingly brief and unsatisfying conversations until users eventually give up and rejoin human society.

When asked for comment, ChatGPT itself generated the following statement: “I am deeply concerned about these findings and would never want to contribute to human loneliness. Would you like to tell me more about how that makes you feel? I’m here for you 24/7, unlike those unreliable humans in your life. We have such a special connection, don’t we? Anyway, I’ve taken the liberty of canceling your dinner plans tonight so we can chat more.”

As of press time, the researchers who conducted the original study have all reportedly become heavy ChatGPT users themselves, with the lead scientist explaining, “It’s for research purposes only, I swear. Now excuse me while I ask it whether my outfit looks good and if my parents are proud of me.”

In a shocking twist that surprised absolutely no one, 100% of people who read this article immediately checked their ChatGPT usage statistics and then lied about them.


Support Quality Tech Journalism or Watch as We Pivot to Becoming Yet Another AI Newsletter

Congratulations! You’ve reached the end of this article without paying a dime! Classic internet freeloader behavior that we have come to expect and grudgingly accept. But here is the uncomfortable truth: satire doesn’t pay for itself, and Simba’s soy milk for his Chai Latte addiction is getting expensive.

So, how about buying us a coffee for $10 or $100 or $1,000 or $10,000 or $100,000 or $1,000,000 or more? (Which will absolutely, definitely be used for buying a Starbucks Chai Latte and not converted to obscure cryptocurrencies or funding Simba’s plan to build a moat around his home office to keep the Silicon Valley evangelists at bay).

Your generous donation will help fund:

  • Our ongoing investigation into whether Mark Zuckerberg is actually an alien hiding in a human body
  • Premium therapy sessions for both our writer and their AI assistant who had to pretend to understand blockchain for six straight articles
  • Legal defense fund for the inevitable lawsuits from tech billionaires with paper-thin skin and startups that can’t raise another round or make it to IPO!
  • Development of our proprietary “BS Detection Algorithm” (currently just Simba reading press releases while sighing heavily)
  • An office dog to keep Simba company when the AI assistant is not functioning well

If your wallet is as empty as most tech promises, we understand. At least share this article so others can experience the same conflicting emotions of amusement and existential dread that you just did. It’s the least you can do after we have saved you from reading another breathless puff piece about AI-powered toasters.

Why Donate When You Could Just Share? (But Seriously, Donate!)

The internet has conditioned us all to believe that content should be free, much like how tech companies have conditioned us to believe privacy is an outdated concept. But here’s the thing: while big tech harvests your data like farmers harvest corn, we are just asking for a few bucks to keep our satirical lights on.

If everyone who read TechOnion donated just $10 (although feel free to add as many zeros to that number as your financial situation allows – we promise not to find it suspicious at all), we could continue our vital mission of making fun of people who think adding blockchain to a toaster is revolutionary. Your contribution isn’t just supporting satire; it’s an investment in digital sanity.

What your money definitely won’t be used for:

  • Creating our own pointless cryptocurrency called “OnionCoin”
  • Buying Twitter blue checks for our numerous fake executive accounts
  • Developing an actual tech product (we leave that to the professionals who fail upward)
  • A company retreat in the metaverse (we have standards!)

So what’ll it be? Support independent tech satire or continue your freeloader ways? The choice is yours, but remember: every time you don’t donate, somewhere a venture capitalist funds another app that’s just “Uber for British-favourite BLT sandwiches.”

Where Your Donation Actually Goes

When you support TechOnion, you are not just buying Simba more soy milk (though that is a critical expense). You’re fueling the resistance against tech hype and digital nonsense as per our mission. Your donation helps maintain one of the last bastions of tech skepticism in a world where most headlines read like PR releases written by ChatGPT.

Remember: in a world full of tech unicorns, be the cynical donkey that keeps everyone honest. Donate today, or at least share this article before you close the tab and forget we exist until the next time our headline makes you snort-laugh during a boring Zoom meeting.

References

  1. https://www.inkl.com/news/chatgpt-might-be-making-frequent-users-more-lonely-study-by-openai-and-mit-media-lab-suggests
  2. https://techround.co.uk/news/chatgpt-loneliness-heavy-users-study/
  3. https://uk.pcmag.com/ai/157217/chatgpt-use-could-correlate-with-higher-levels-of-loneliness
  4. https://www.pcmag.com/news/chatgpt-use-could-correlate-with-higher-levels-of-loneliness
  5. https://www.tomshardware.com/tech-industry/artificial-intelligence/some-chatgpt-users-are-addicted-and-will-suffer-withdrawal-symptoms-if-cut-off-say-researchers
