"AI Girlfriends" are a privacy nightmare.

  • 2024-02-22 08:00:00
  • Wired

Valentine's Day can be particularly hard on those without a partner, but that does not make pouring one's emotional energy into an AI companion a sound alternative. AI-powered chatbots have repeatedly proven harmful in the past, and the situation has not improved since.

Beyond sowing confusion in users' minds, these services act as a cover for collecting their private information and data. Such apps tend to follow a familiar pattern: they let users sign up with notably weak passwords and draw them in with AI-generated images of heavily sexualised women, accompanied by provocative messages.

These messages, which may appear intended only to entice the user into conversation, conceal more than they seem. The flawless avatars often ask their interlocutor to send photos of themselves and voice messages, and prompt them to share their desires and dreams. Stripped of its seemingly romantic context, the exchange lets the system collect images, voice samples, and personal information about the user.