The Organization of Consumers and Users (OCU) recommended this Friday deactivating the data-sharing option in ChatGPT, warning that the information shared, even in some paid versions, could be used to train the artificial intelligence (AI) models behind these tools, which belong to the company OpenAI.
The warning came in a statement in which the organization pointed out that the only ChatGPT versions that guarantee, by default, the privacy of user-shared data are the paid Team ($25/month) and Enterprise (negotiated price) plans.
The rest, namely the free version and the other two paid versions, Plus ($20/month) and Pro ($200/month), can use recorded conversations and even files shared by users to improve the accuracy and capabilities of these AI models. Although the OCU noted that OpenAI "clarifies that it does not use this information for commercial purposes," it recommends deactivating this option.
To deactivate this function, the OCU explained that four steps are required: access the personal ChatGPT account at https://chatgpt.com/, click on the profile photo and select Settings, go to the Data Controls options in the menu that appears, and then disable the 'Improve the model for everyone' toggle.
"However, although deactivating the training option in three of the ChatGPT versions helps protect privacy, much data will continue to be collected for operational, security and legal reasons," the organization added. Among these, it mentioned basic account data, usage data, technical data, session metadata and the data necessary to comply with regulations and respond to legal requirements.
New regulations
The OCU also recalled that, since February, the European Union's AI regulation prohibits several practices in these tools' interaction with users, such as manipulating people to exploit vulnerabilities related to age, disability or socio-economic status, or classifying people based on their social behavior or personal characteristics.
Given how new these measures are, the OCU urged public administrations to inform consumers, so that they can report any practice that could threaten their fundamental rights.
In parallel, it requested "sufficient resources" for the new Spanish Agency for the Supervision of Artificial Intelligence (AESIA), the body that will be responsible for ensuring compliance with the regulation and protecting users.