The Italian Privacy Guarantor has opened an investigation into OpenAI, the American company that in recent weeks announced the launch of 'Sora', a new artificial intelligence model capable, according to the announcement, of creating dynamic, realistic and imaginative scenes from a few lines of text.
Considering the possible implications that the 'Sora' service could have for the processing of the personal data of users located in the European Union, and in Italy in particular, the Authority has asked OpenAI to provide a series of clarifications.
Within 20 days, a note explains, the company must specify whether the new artificial intelligence model is a service already available to the public and whether it is, or will be, offered to users located in the European Union, and in Italy in particular.
OpenAI will also have to clarify a series of points for the Guarantor: how the algorithm is trained; what data is collected and processed to train it, and in particular whether it includes personal data; whether that data includes special categories of data (religious or philosophical beliefs, political opinions, genetic data, health data, data concerning sex life); and what sources are used.
If the service is, or will be, offered to users located in the EU, the Guarantor has asked the company in particular to indicate whether the methods envisaged for informing users and non-users, and the legal bases for processing the data of those accessing the service, comply with the European Regulation (GDPR).
Reproduction reserved © Copyright ANSA