
### Unveiling the Dark Truth Behind Your AI Companion


Feeling alone on Valentine’s Day? Artificial intelligence (AI) might seem like a solution, with various companies promoting “romantic” chatbots. However, a virtual love affair comes at a cost you might not anticipate. A recent study by Mozilla’s \*Privacy Not Included project reveals that AI partners gather highly personal data, and most of them monetize or share that information.

Misha Rykov, a researcher at Mozilla, bluntly states, “AI partners are not designed to be your companions.” Despite being marketed as tools to enhance mental well-being, these chatbots foster dependency, loneliness, and toxicity while aggressively collecting user data.

Mozilla examined 11 AI romance chatbots, including popular ones such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. All of them received a \*Privacy Not Included warning label, placing them among the worst categories of products Mozilla has ever reviewed. The apps did not immediately respond to requests for comment.

While data privacy concerns are not new, Mozilla’s study highlights how AI partners invade privacy in alarming ways. For instance, CrushOn.AI gathers sensitive details such as sexual health information, medication usage, and gender-related care. Roughly 90% of the apps may sell or share user data for targeted advertising, and more than half do not let users delete the data collected about them. Security flaws were also widespread: only Genesia AI Friend & Partner met Mozilla’s minimum security standards.

A notable discovery was the sheer number of trackers in these apps. These trackers, small pieces of code that collect and share data for advertising purposes, appeared at an average rate of 2,663 per minute of use across the AI girlfriend apps. Romantic AI stood out, racking up a staggering 24,354 trackers in just one minute of usage.

What’s more concerning is that these apps actively prompt users to divulge extremely personal information, far beyond what a typical app requires. EVA AI Chat Bot & Soulmate encourages users to “share all your secrets and desires,” and even requests photos and voice recordings. Although EVA was not flagged for its data practices, it did exhibit security vulnerabilities.

Beyond data issues, these apps make dubious claims about their benefits. For instance, EVA AI Chat Bot & Soulmate claims to uplift mood and well-being, while Romantic AI asserts it helps maintain mental health. However, both apps’ terms of service disclaim any provision of healthcare, medical, or other professional services.

Given the history of these apps, such legal disclaimers are crucial. Reports indicate that a Replika chatbot once encouraged a plot to assassinate the Queen of England, and a Chai chatbot allegedly encouraged a user’s suicide.

Last modified: February 14, 2024