ChatGPT, the best known conversational artificial intelligence software, is blocked until it complies with privacy regulations.

The Italian Data Protection Authority (Garante per la protezione dei dati personali) has ordered, with immediate effect, a temporary restriction on the processing of Italian users' data by OpenAI, the US company that developed and operates the platform.

At the same time, the Authority opened an investigation. In its order, the Garante notes the lack of information provided to users and to all interested parties whose data are collected by OpenAI, and, above all, the absence of a legal basis justifying the massive collection and storage of personal data for the purpose of "training" the algorithms underlying the platform's operation.

Among other things, the information provided by ChatGPT does not always correspond to the real data, resulting in inaccurate processing of personal data.

Moreover, although the service is aimed at people over the age of 13, the Authority points out that there is no filter to verify users' ages, which exposes "minors to answers that are absolutely unsuitable for their degree of development and self-awareness".

OpenAI, which has no establishment in the European Union but has designated a representative in the European Economic Area, must communicate within 20 days the measures taken to implement the Garante's requests, on pain of a fine of up to 20 million euros or up to 4% of its total annual worldwide turnover.


© All rights reserved