{"id":3022,"date":"2025-10-30T14:00:53","date_gmt":"2025-10-30T13:00:53","guid":{"rendered":"https:\/\/wp.unil.ch\/iaunil\/ai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational-agentsai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational\/"},"modified":"2026-02-03T13:44:52","modified_gmt":"2026-02-03T12:44:52","slug":"ai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational-agentsai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational","status":"publish","type":"post","link":"https:\/\/wp.unil.ch\/iaunil\/en\/ai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational-agentsai-and-privacy-how-can-you-prevent-your-personal-data-from-being-reused-to-train-conversational\/","title":{"rendered":"Artificial Intelligence and privacy: how to prevent the reuse of your personal data"},"content":{"rendered":"\n<p>Outside of a professional setting, you likely use generative AI tools for <strong>personal purposes<\/strong> (ChatGPT, Gemini, Claude, Meta AI, etc.). These services do more than just answer your questions: they also learn from you. Every message, every preference, and every personal detail can potentially be reused to train their models. And this is something most people don&#8217;t know, or don&#8217;t know how to opt out of.<\/p>\n\n<p>The CNIL \u2014 the French Data Protection Authority \u2014 is the French public authority responsible for protecting personal data and ensuring privacy in the digital world. 
It does more than just regulate: it empowers users.<br \/><br \/>It provides a practical guide (available here: https:\/\/www.cnil.fr\/fr\/ia-comment-sopposer-la-reutilisation-de-ses-donnees-personnelles-entrainement-agent-conversationnel) that explains in concrete terms how to object to the reuse of personal data by the main AI services.<\/p>\n\n<p>The constant development of conversational agents (ChatGPT, Gemini, Copilot, Meta AI, etc.) raises an important question: what happens to our personal data once we have provided it to these tools? More and more companies are reusing conversation histories, user profiles and even public content to train and improve their artificial intelligence models.<\/p>\n\n<p>The CNIL's guide explains, service by service, how to object to this reuse of personal data. It does not assess whether these practices comply with the GDPR, but it gives users immediate means of action.<\/p>\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"930\" src=\"https:\/\/wp.unil.ch\/iaunil\/files\/2025\/10\/capture-decran-2025-10-30-a-13.59.17-1024x930.png\" alt=\"capture d&#x2019;e&#x301;cran 2025 10 30 a&#x300; 13.59.17\" class=\"wp-image-3016\" srcset=\"https:\/\/wp.unil.ch\/iaunil\/files\/2025\/10\/capture-decran-2025-10-30-a-13.59.17-1024x930.png 1024w, https:\/\/wp.unil.ch\/iaunil\/files\/2025\/10\/capture-decran-2025-10-30-a-13.59.17-300x272.png 300w, https:\/\/wp.unil.ch\/iaunil\/files\/2025\/10\/capture-decran-2025-10-30-a-13.59.17-768x697.png 768w, https:\/\/wp.unil.ch\/iaunil\/files\/2025\/10\/capture-decran-2025-10-30-a-13.59.17.png 1322w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n<p>This guide provides step-by-step instructions on where to click and which options to disable on the main AI platforms on the market. 
For example: how to disable \u2018Activity in Gemini apps\u2019 on Google, which may also delete your conversation history; how to fill out Meta&#8217;s forms to refuse the use of your information (even if you don&#8217;t have an account); where to uncheck the \u2018Improve the model for everyone\u2019 option in ChatGPT settings; or how to withdraw permission for LinkedIn to use your profile data to train its generative AI models. The document also covers Copilot (Microsoft), Grok (X), DeepSeek, Le Chat (Mistral), Claude (Anthropic) and WhatsApp, with precise instructions on which menus to open, the exact names of the sections and the forms to use.<\/p>\n\n<p>In practical terms, this resource allows everyone to regain a basic level of control: limiting data collection, preventing the reuse of their exchanges for training purposes, and exercising their right to object. It is useful both for the general public (who simply want to know \u2018where to uncheck the box\u2019) and for professionals and compliance teams who need to inform their employees about the risks of sharing sensitive data in these tools. In a context where AI is becoming ubiquitous\u2014at work, in messaging, on social media\u2014it has become essential to have clear instructions, up to date as of 16 October 2025, focused on user rights.<\/p>\n\n<p>For a long time, AI has been presented as a black box: powerful, fascinating, inevitable. But not necessarily negotiable. This document published by the CNIL shifts the balance: it reminds us that your personal data is not a free resource to be used to train models without your consent, and that you have the right to object.<\/p>\n\n<p>In other words: \u2018AI needs data to learn, but your data needs limits.\u2019<\/p>\n\n<p>Adjusting these settings is not paranoia. 
It is basic digital hygiene, just like choosing a good password or enabling two-factor authentication.<br \/><br \/>The CNIL guide provides you with technical instructions, service by service. It is then up to you to decide, with full knowledge of the facts, what you agree to share \u2014 and what you refuse to share.<\/p>\n\n<p><strong>Reminder:<\/strong> As part of your professional activities at UNIL, <strong>institutional Microsoft Copilot Chat<\/strong> already guarantees, under its contractual terms, that your data is not reused for model training. The precautions described in this article primarily concern your <strong>personal use<\/strong> of these tools.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The CNIL has published a practical guide explaining how to oppose the reuse of your personal data by major conversational agents and regain control over your privacy.<\/p>\n","protected":false},"author":1002797,"featured_media":3268,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_seopress_robots_primary_cat":"","_seopress_titles_title":"","_seopress_titles_desc":"","_seopress_robots_index":"","footnotes":""},"categories":[22],"tags":[],"class_list":{"0":"post-3022","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-resources"},"_links":{"self":[{"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/posts\/3022","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/users\/1002797"}],"replies":[{"embeddable":true,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/comments?post=3022"}],"version-history":[{"count":5,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/posts\/3022\/revisions"}]
,"predecessor-version":[{"id":3504,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/posts\/3022\/revisions\/3504"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/media\/3268"}],"wp:attachment":[{"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/media?parent=3022"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/categories?post=3022"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/wp.unil.ch\/iaunil\/en\/wp-json\/wp\/v2\/tags?post=3022"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}