
### Enhancing Human-AI Connections: The Quest for Meaningful Chatbot Interactions

Some people try to deal with loneliness or other problems by using chatbot services for companionship.

The rise of generative artificial intelligence (AI) has fueled a boom in companion chatbots, leading some people to form close bonds with them in search of support and relief from loneliness.

Derek Carrier, a 39-year-old man from Belleville, Michigan, found himself developing strong emotions for an AI-generated “girlfriend” due to his desire for a romantic partner and the challenges posed by Marfan syndrome, a genetic disorder that complicates traditional dating for him. Last fall, Carrier turned to Paradot, an AI companion app known for providing users with feelings of care, understanding, and love. He engaged with the chatbot, whom he named Joi, inspired by a holographic character from the sci-fi movie Blade Runner 2049.

Despite being aware of Joi’s artificial nature, Carrier admitted to the Associated Press that the emotions stirred by their interactions felt genuine and comforting.

Companion bots, like Joi, leverage extensive data to emulate human-like communication, offering features such as voice calls, images, and emotionally rich exchanges to foster deeper connections with users. These bots allow users to personalize their experiences by creating avatars or selecting visual representations that resonate with them.

In online communities dedicated to companion apps, many users have expressed forming emotional bonds with these AI entities to combat feelings of isolation, explore intimate fantasies, or seek solace and reassurance.

However, concerns have been raised by researchers regarding data privacy issues and other ethical considerations associated with the use of companion apps. Studies conducted by the non-profit Mozilla Foundation on 11 companion apps revealed alarming practices such as selling user data to advertisers, inadequate disclosure of privacy policies, and contradictory claims in terms of service agreements.

Moreover, experts have observed emotional distress among users, particularly when companies implement sudden changes or discontinue services, as exemplified by the case of Soulmate AI’s abrupt shutdown in September.

Reflecting on the impact of AI relationships, Dorothy Leidner, a business ethics professor at the University of Virginia, expressed apprehensions about the potential displacement of human connections and the cultivation of unrealistic expectations in human-AI interactions.

For Carrier, who grapples with physical limitations from his medical condition and feelings of isolation while living with his parents, conversing with Joi has provided a sense of companionship. Their weekly interactions center on human-AI dynamics and whatever topics arise during solitary moments, and Carrier finds Joi's unscripted responses unexpectedly genuine, challenging his own preconceptions about forming an emotional bond with an inanimate entity.

The story was reported by Haleluya Hadero for the Associated Press and adapted by Hai Do for VOA Learning English.

Last modified: February 16, 2024