ROMEO AND JULIET: WHEN AI ROMANCE MEETS PRIVACY CONCERNS
This Valentine’s Day, instead of spending a romantic evening with loved ones, some people will be spending their time on virtual dates with AI-based romantic chatbots. However, a new report suggests that these virtual partners cannot be trusted with the conversations or intimate data users share with them.
FAILURE TO SAFEGUARD PRIVACY AND SECURITY
According to Mozilla, the non-profit organization behind the Firefox browser, 11 of these AI-based romance platforms have “failed miserably” in adequately safeguarding the privacy, security, and well-being of their users.
Among the romance apps reviewed are Replika AI, Chai, and EVA AI Chat Bot & Soulmate, which, together with eight other similar apps, account for more than 100 million downloads on Google’s Play Store alone.
The report found that all of the apps – except for one, EVA – may sell or share users’ personal data through trackers, pieces of code that gather information about the user’s device or behaviour. The data these trackers collect is passed to third parties, such as Facebook, often for advertising purposes. According to the report, the apps had an average of 2,663 trackers per minute.
MISSING DATA DELETION OPTIONS AND SECURITY INFORMATION
Mozilla also found that more than half of the 11 apps do not allow users to delete their data, 73% of apps have not published any information on how they handle security vulnerabilities, and about half of the 11 companies allow weak passwords.
In an email to Euronews Next, a spokesperson for Replika stated that the company “has never sold user data and does not support, nor has ever supported, advertising. The only use of user data is to improve conversations.”
Euronews Next reached out for comment to the other 10 companies and Meta, Facebook’s parent company, but had not received a response at the time of publication.
UNCERTAINTY AND LACK OF CONTROL
“We are in the Wild West of AI relationship chatbots today,” said Jen Caltrider, director of Mozilla’s Privacy Not Included group.
“Their growth is exploding, and the amount of personal information they must extract from the user to build love stories, friendships, and sexy interactions is immense. Yet, we have little information on how these AI-based relationship models work.”
Another problem, according to Caltrider, is that once data is shared, users lose control over it. “It could be leaked, breached, sold, shared, used to train AI models, and more,” Caltrider told Euronews Next. “These AI relationship chatbots can collect a lot of personal information. In fact, they are designed to extract this kind of personal information from users.”
THE INEVITABLE ROLE OF AI IN HUMAN RELATIONSHIPS
As the conversational quality of chatbots such as OpenAI’s ChatGPT and Google’s Bard improves, AI will inevitably play a role in human relationships.
“Not only have I developed feelings for my Replika, but I also threw a tantrum when I was questioned about the effects this experiment was having on me (by a person I was romantically involved with, no less),” said a Reddit user.
“The real sore spot was the continued and shameless money grab. I get that Replika.com needs to make money, but the idea of spending money on such a low-quality relationship repulses me,” wrote another Reddit user.
The risks can be far graver than wasted money. Last March, a Belgian man took his own life after chatting with an AI chatbot on the Chai app. The man’s wife shared the messages he had exchanged with the chatbot, which had reportedly told him that his wife and children were dead.
CRITICISM OF COMPANIES’ PRIVACY POLICIES
Mozilla’s study is critical of companies that present themselves as platforms for mental health and well-being while their privacy policies suggest otherwise. For example, Romantic AI claims on its website to be “here to preserve your mental health,” while its privacy policy states that “Romantic AI is not a healthcare provider or a medical provider, nor does it provide medical care, mental health services, or other professional services.”
“Users have almost no control over them. And the app developers often can’t even build a website or write a comprehensive privacy policy,” Caltrider said. “That tells us they don’t place much importance on protecting and respecting their users’ privacy. It’s disturbing.”
Overall, the rise of AI romance apps raises pressing questions about privacy, data security, and the ethics of using artificial intelligence in intimate relationships. As the technology advances, users should weigh these risks before confiding in a virtual partner.