Concerns about data privacy do not stop at email, social networks, or web pages. Increasingly popular AI models also have a part to play. The question is simple: should we worry about what ChatGPT and similar tools do with our data? The short answer: yes.
In a nutshell, the information we give ChatGPT can be used to train the model, which means it could end up shaping an answer to another person's question. If this worries you, you can opt out of having the model trained on your data.
Javier F. Saavedra, Global General Manager of T2ó One, a company specializing in digital technology, explains: “Privacy when using AI tools such as ChatGPT is a very important, delicate, and unfortunately little-known topic today. It is critical to know when LLMs do or do not use our conversations and information to improve the quality of their answers.”
While disabling training may address your personal concerns, there are still privacy standards to meet, such as Europe's data protection rules (the GDPR). The question is: how does ChatGPT measure up against them?
The main privacy regulation is the GDPR, the EU's General Data Protection Regulation. It gives natural persons control over their own personal data and protects that data from misuse by organizations. One of the essential parts of the GDPR is the right to be forgotten, which means you can ask an organization to delete your personal data. And this is where ChatGPT tends to fall short.
Deleting a specific person's data would be a fairly complex task, especially for an intensively trained model like ChatGPT. It is very difficult to identify which data to delete and where to find it. Moreover, AI systems like ChatGPT cannot forget the way humans do, but they can adjust the weight a piece of data carries in their knowledge. In short, they cannot delete your data, but they can choose not to use it.
Complying with the GDPR therefore looks like a real challenge for OpenAI. The company has already been accused of data protection violations by a Polish citizen, and Italy banned the chatbot for almost a month over data concerns. There is still work to be done.
We return to the initial question: is what you ask ChatGPT public? “On the one hand, none of our data is openly shared with other users; that is, if I ask ChatGPT for another person's opinion, the model will respect our privacy,” confirms Javier F. Saavedra. “However, depending on the version used, our data may or may not be used as training data. The recommendation is to use the API (interface) versions, or the ChatGPT Team and Enterprise plans, in which privacy is guaranteed. For individual versions, using Temporary Chats is recommended to prevent the model from keeping our information. It is each user's responsibility to understand the privacy and transparency policies before deciding which tool to use.”
These settings can be configured as follows:

1. Log in to ChatGPT.
2. Click your name in the lower left corner.
3. Click “Settings & Beta”.
4. Click “Data Controls”.
5. Toggle off “Chat History & Training”.
6. ChatGPT will automatically open a new chat.

You can now chat without training the model, and your prompts will not be stored in the history sidebar on the left.
This will undoubtedly go some way toward protecting your data.
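For developers, the API route Saavedra recommends can be sketched in Python. This is a minimal illustration, not OpenAI's official client: it only builds the JSON payload for the chat-completions endpoint, and the model name is an example placeholder.

```python
import json

def build_chat_request(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Build the JSON payload for a chat-completions style request.

    The model name is only an example; per OpenAI's published policy,
    data sent through the API is not used for training by default.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("Summarize the GDPR right to erasure.")
print(json.dumps(payload, indent=2))

# The payload would then be POSTed, with your API key, to
# https://api.openai.com/v1/chat/completions — for instance via the
# official `openai` Python package:
#   from openai import OpenAI
#   client = OpenAI()
#   client.chat.completions.create(**payload)
```

The point of the sketch is the contrast with the consumer app: requests go through your own account and key, rather than a chat history that may feed training.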