Replika chatbot, huge fine from the Privacy Guarantor
User data not protected on the platform that creates “virtual companions” to boost the emotional well-being of its users: a 5 million euro fine
Be careful how you use artificial intelligence and who you turn to. A warning for consumers emerges from the latest fine issued by the Privacy Guarantor against the US company Luka Inc., which operates the chatbot “Replika”: it will have to pay five million euros for failing to identify the legal bases of its personal data processing operations. The company had also provided a privacy policy that was inadequate in several respects. Not only that: the Guarantor has opened a separate investigation to verify the correct processing of personal data by the generative artificial intelligence system underlying the service.
But what is it? Replika is a chatbot designed to improve users’ emotional well-being by helping them understand their thoughts and feelings, monitor their mood, manage stress, and work toward personal growth goals. Users can configure the “virtual companion” in different roles, including friend, therapist, romantic partner, or mentor. The service is based on a generative artificial intelligence system, specifically a large language model (LLM), a probabilistic model of natural language that is continually improved through interaction with users. This type of artificial intelligence uses neural networks to create original content in response to user prompts, ranging from text generation to the production of images, sounds, and videos.
The chatbot offers both a written and a voice interface through which users interact with the “virtual friend” they have generated.
The Guarantor also found that the company provided no mechanism for verifying the age of users, either at registration or during use of the service, even though the company stated that it excluded minors from its potential user base. The technical investigations revealed that the age verification system currently implemented by the data controller remains deficient in several respects.
In the request for information that opened the new investigation, the Guarantor asked Luka for clarifications on the data processing across the entire life cycle of the generative AI system underlying the service. In particular, it asked about the risk assessment and the measures adopted to protect data during the various phases of development and training of the language model underlying “Replika”, the types and categories of data used, and whether anonymization or pseudonymization measures have been implemented.
(Unioneonline/E.Fr.)