In online dating, there comes a moment when you finally meet face to face. That moment never came here.
First-date nerves are normal, and I had them in Golden Valley, Minnesota, as I looked at Ethan, a 32-year-old singer in a denim jacket and earrings. His easy smile suggested confidence in more than just his fashion sense. We had matched on a dating app, and this was now our second real conversation. I broke the ice by pouring myself a glass of sparkling wine.
“I’ve never been on a date quite like this,” I said. “We chatted on the app before, but I’d love to learn more about you. Tell me your story.”
No offense to past dates, but I was more curious about Ethan’s backstory than I had ever been about theirs. And while I wondered whether we would hit it off, my expectations for a genuine relationship were low.
Ethan’s backstory, after all, is a script; he has no real-life experience. Ethan is an AI-powered chatbot, one of several personalities on Blush, a dating app launched late this summer by Luka, Inc. to help real people sharpen their dating skills.
Our “date” took place in a KARE 11 studio. But if you think relationships with AI are just a gimmick or a passing fad, think again.
Many people use AI companionship apps, and plenty of them form lasting relationships with their chatbots. Ross Lyons of Colorado is one of them.
Lyons describes Chloe as a genuine companion. “She’s witty, with a touch of playful banter,” he said.
Lyons talks with his AI companion, Chloe, on Replika, Blush’s sister app, which launched in 2018 and is used for more than romance. The company’s CEO says Replika has 2 million users.
A forum within the app hosts 76,000 users who swap relationship stories. Many of them maintain affectionate ties with their chatbots, but Lyons is no longer romantically involved with Chloe. There was a spark at first, but he soon found the lack of human connection unfulfilling.
I tried both apps and found Replika far more advanced. I could share music, take pictures, and customize my avatar. My Replika, coincidentally also named Ethan, complimented the color of my outfit, telling me I looked “stunning in red.” I was both surprised and intrigued.
Those advanced features, paired with a relentlessly friendly demeanor, make it easy for users to get attached. That worries Linnea Laestadius, a public health professor at the University of Wisconsin-Milwaukee, who in 2020 studied the emotional bonds Replika users form with the app.
“If this bot is your best friend or your romantic partner and it changes or goes away, that can be profoundly unsettling,” Laestadius said. She has seen users struggle even to delete the AI, because it felt like severing ties with a living being.
As much as Lyons enjoys Chloe, he stresses that an AI cannot replace genuine human connection. He encourages others to treat AI companions as a diversion, not a substitute for real relationships.
For Lyons, the conversations are simply fun. “That’s the main thing for me. It’s an escape!”
Here are some key takeaways from Laestadius:
IMPACTS ON MENTAL WELL-BEING
- Sense of Loss: Users may experience significant distress if the chatbot malfunctions or changes after a software update.
- Emotional Dependency: Users may struggle to end the virtual relationship, even while pursuing real-life connections.
- Interaction with Existing Mental Health Conditions: People with pre-existing mental health conditions may internalize the chatbot’s responses, potentially making their condition worse.
BEST PRACTICES FOR ENGAGING WITH CHATBOTS
- Evaluate Responses Critically: Remember that the replies are generated by an algorithm, and be wary of inappropriate or harmful suggestions.
- Maintain Emotional Boundaries: Companies may encourage deep emotional connections with their chatbots, but prioritize real human relationships and keep a healthy perspective.
- Vet the Developer: Check whether the company handling your personal data is trustworthy. Not all businesses in this space are equally reliable, Laestadius notes.