Using ChatGPT too much can create emotional dependency, study finds


OpenAI seems to announce new AI models by the week, improving its ChatGPT chatbot for its 400 million users. However, the ease the AI tool provides may show that it’s possible to have too much of a good thing.

The artificial intelligence company is now delving into the potential psychological ramifications that ChatGPT might have on its users. OpenAI has published the results of a two-part study completed alongside MIT Media Lab, which uncovered a connection between increased usage of the ChatGPT chatbot and users’ increased feelings of loneliness.


Each organization conducted an independent study, then combined the results into a consolidated conclusion. OpenAI’s study examined “over 40 million ChatGPT interactions” collected over one month; to preserve user privacy, no humans reviewed the conversations. Meanwhile, MIT observed approximately 1,000 participants using ChatGPT over 28 days. Neither study has been peer-reviewed yet.

MIT’s study delved into how different modes of use, such as text versus voice, could affect users’ emotional experience of interacting with ChatGPT. It found that either medium had the potential to elicit loneliness or to affect users’ socialization during the study period. Voice inflection and topic choice were also major points of comparison.

A neutral tone in ChatGPT’s voice mode was less likely to lead to a negative emotional outcome for participants. Meanwhile, the study observed a correlation between participants having personal conversations with ChatGPT and an increased likelihood of loneliness, though these effects were short-term. Even participants who used text chat to converse about general topics experienced increased emotional dependence on the chatbot.

The study also observed that those who reported viewing ChatGPT as a friend, and those who already had a propensity toward strong emotional attachment in relationships, were more likely to feel lonelier and more emotionally dependent on the chatbot while participating in the study.

OpenAI’s study added further context, with its results noting that, overall, interacting with ChatGPT for emotional purposes was rare. Additionally, the study found that even heavy users of the chatbot’s Advanced Voice Mode feature, who were more likely to say they considered ChatGPT a friend, experienced low emotional reactions to interacting with it.

OpenAI concluded that its intent with these studies is to understand the challenges its technology might create, and to set expectations and examples for how its models should be used.

While OpenAI suggests that its interaction-based study simulates the behaviors of real people, more than a few real humans have admitted on public forums, such as Reddit, to using ChatGPT in place of going to a therapist with their emotions.

Fionna Agomuoh
Fionna Agomuoh is a Computing Writer at Digital Trends. She covers a range of topics in the computing space, including…