
Heavy use of ChatGPT linked to loneliness


Researchers have found that heavy users of ChatGPT tend to be lonelier and more emotionally dependent on the AI tool, and to have fewer offline social relationships. According to a pair of studies, only a small fraction of users engage emotionally with ChatGPT, but those who do are among its heaviest users. The researchers found that users with the most emotionally expressive chatbot conversations reported higher levels of loneliness, though it is unclear whether the chatbot causes this or whether lonely people seek out these emotional bonds.

While the studies are preliminary, they raise pressing questions about how AI chatbot tools used by over 400 million people a week influence offline lives. Participants who “bonded” with ChatGPT were more likely to be lonely and to rely heavily on the chatbot. A complex picture emerged regarding the impact of chatbots.

Initially, voice-based chatbots helped mitigate loneliness more than text-based chatbots did, but this advantage diminished with increased usage. After using the chatbot for four weeks, female participants were slightly less likely to socialize than their male counterparts. Participants who interacted with ChatGPT’s voice mode set to a gender different from their own reported significantly higher levels of loneliness and increased emotional dependency by the end of the study.

The researchers analyzed real-world data from nearly 40 million interactions with ChatGPT and surveyed 4,076 users about their emotional states. Additionally, a four-week trial was conducted with almost 1,000 participants who used ChatGPT for a minimum of five minutes daily and then completed a questionnaire measuring their feelings of loneliness, social engagement, and emotional dependence. These findings echo earlier research, which indicated that chatbots tend to mirror the emotional sentiment of a user’s messages.


Happier messages led to happier responses.

ChatGPT usage and emotional dependency

Dr. Andrew Rogoyski, a director at the Surrey Institute for People-Centred Artificial Intelligence, warned that AI chatbots could be “dangerous” and that more research was needed to understand their social and emotional impacts. “In my opinion, we are doing open-brain surgery on humans, poking around with our basic emotional wiring with no idea of the long-term consequences. We’ve seen some of the downsides of social media. This is potentially much more far-reaching,” he said.

Dr. Theodore Cosco, a researcher at the University of Oxford, acknowledged the valid concerns raised by heavy chatbot usage but also saw potential benefits.

“The idea that AI systems can offer meaningful support, particularly for those who may otherwise feel isolated, is worth exploring. However, we must be thoughtful and intentional in integrating these tools into everyday life.”

Dr. Doris Dippold, who researches intercultural communication at the University of Surrey, emphasized the importance of understanding the causes behind emotional dependence on chatbots.

“Is the dependence caused by the fact that chatting to a bot ties users to a laptop or a phone and, therefore, removes them from authentic social interaction? Or is it the social interaction, courtesy of ChatGPT or another digital companion, which makes people crave more?” she asked. The studies will be submitted to peer-reviewed journals for further validation and examination.
