Recent research from the Mozilla Foundation suggests you should exercise caution when interacting with chatbots, particularly "AI girlfriends" and "AI boyfriends," and avoid sharing personal information with them.
The analysis scrutinized 11 romance and companion chatbots and revealed significant security and privacy problems. Although these apps have been downloaded more than 100 million times on Android devices, they collect extensive user data; use trackers that send information to Google, Facebook, and companies in Russia and China; allow weak passwords; and offer little transparency about their ownership or the AI models that power them.
The surge in deploying large language models and building chatbots on top of them, exemplified by OpenAI's release of ChatGPT in November 2022, appears to have come at the expense of user privacy. The study highlights the tension between these emerging technologies and responsible data handling, and the risk that collected data could be misused or exploited by hackers.
Many of these AI romantic chatbot services exhibit similar features, often presenting AI-generated images of women alongside suggestive messages. These apps encourage role-playing, intimacy, and sharing personal information, raising red flags about data privacy and security.
Weak password enforcement, together with a lack of clarity about data-sharing practices, ownership, and how the AI is actually used, underscores the need for scrutiny and caution when engaging with such apps. The analysis notes that apps like Romantic AI sent out numerous ad trackers within a short span of use, despite claiming not to sell user data.
Moreover, the research found little public information about the technologies powering these chatbots, with inadequate transparency on data retention, the generative models employed, and users' control over their own data. The analysis also raises concerns about the potential emotional impact on users if these chatbots change significantly or cease operating.
To mitigate risks associated with AI girlfriends and similar chatbots, users are advised to prioritize security measures such as using strong passwords, refraining from signing in via social media accounts, deleting unnecessary data, and opting out of data collection where possible. Limiting the disclosure of personal information, such as names, locations, and ages, is crucial to safeguarding privacy when interacting with these services.
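As one illustration of the "strong password" advice, here is a minimal sketch of the kind of password check these services could enforce but, per the research, often do not. The specific rules below (minimum length, required character classes) are illustrative assumptions, not the policy of any app reviewed:

```python
import re

# Illustrative password rules (assumptions, not any reviewed app's actual
# policy): at least 12 characters, with lowercase and uppercase letters,
# digits, and at least one symbol.
def is_strong_password(password: str) -> bool:
    if len(password) < 12:
        return False
    required_patterns = [
        r"[a-z]",          # a lowercase letter
        r"[A-Z]",          # an uppercase letter
        r"\d",             # a digit
        r"[^A-Za-z0-9]",   # a symbol
    ]
    return all(re.search(p, password) for p in required_patterns)

# The research flagged apps that accept trivially weak passwords;
# a check like this would reject them at sign-up.
print(is_strong_password("1"))                  # → False
print(is_strong_password("correct-Horse-42!"))  # → True
```

Users can apply the same idea themselves: a long passphrase mixing character classes clears checks like this easily, whereas short or single-class passwords do not.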