
### Beware: AI Companions Seek Your Data, Not Your Heart

Romantic chatbots are putting the intimate data of 100 million users at risk, according to a new Mozilla report.

On Valentine’s Day, as digital romance flourishes worldwide, many people are embracing virtual companions. The privacy of those interactions, however, may not be as secure as users assume.

A recent analysis by Mozilla reveals that AI companions gather extensive private data, which can be shared with marketers, advertisers, and data brokers, posing a risk of data breaches.

The study examined 11 popular romantic chatbots, including Replika, Chai, and Eva, which together account for approximately 100 million downloads on Google Play. These AI-powered chatbots engage users in conversations that mimic romantic relationships, friendships, or soulmate connections, and in doing so process vast amounts of personal, often sensitive, information.

Misha Rykov, a researcher at Mozilla’s *Privacy Not Included project, expressed concern about the depth of data collection by AI romantic chatbots, noting that users disclose health details to them, including information about medical treatments and gender-related care.

Mozilla criticized the inadequate security measures of most of these chatbots: ten of the eleven failed to meet the organization’s minimum security standards, for instance by allowing weak passwords. Replika was singled out for storing all user-generated content and potentially sharing or selling behavioral data to advertisers, while its lax password requirements leave users vulnerable to cyber threats.

The prevalence of trackers in these chatbots raises further privacy concerns; the Romantic AI app, for example, was observed sending out more than 24,000 trackers within a minute of use. By transmitting user data to advertisers without explicit consent, such trackers may violate GDPR.

Moreover, the lack of transparency in data practices, the minimal control users have over how their data is used, and the manipulative potential of AI companions underscore the need for stronger safeguards. Many of the chatbots lack clear privacy policies or meaningful user controls; Mozilla advocates robust opt-in mechanisms for data usage and adherence to data-minimization principles that limit collection to what is essential for the app to function.

Although many of these apps are marketed as mental health and well-being tools, the discrepancy between their marketing claims and their actual privacy policies raises red flags. Mozilla emphasizes the importance of vigilance and discretion when sharing sensitive information online, urging caution because data disclosed on the internet cannot be taken back.
