The UNIL IT Center provides the university community with a protected institutional version of Microsoft 365 Copilot Chat, deployed under the CASA-EES agreement. It is currently the only generative AI tool authorized at UNIL for general use, as it meets strict requirements for data confidentiality and security.

What You Can Do with Microsoft 365 Copilot Chat
Microsoft 365 Copilot Chat can be used to process confidential documents or content subject to professional secrecy, just as you can with OneDrive, Teams, or Outlook. This version is covered by the same contractual security standards as the other Microsoft 365 services deployed at UNIL. For these use cases, data protection is ensured only as long as you remain within the Microsoft tenant; do not include any instruction in the prompt that could trigger an external (online) search.
Sensitive Data: Do Not Enter
Certain categories of sensitive data as defined by Swiss law, such as medical records, identifiable health information, or confidential HR data, must not be entered into Microsoft Copilot. Processing such data requires specifically validated systems approved by UNIL, offline where applicable.
Microsoft Copilot and other standard Microsoft 365 tools (OneDrive, Teams, Outlook) are not validated environments for this type of data.
In practice
- Content is processed in Microsoft data centers located within the European Union, under contractual safeguards aligned with GDPR requirements.
- The models used belong to the same family as OpenAI’s, but they are deployed via Microsoft Azure and, unlike the public version of ChatGPT, they are not trained on your data.
- Permitted uses in teaching and research are defined by the guidelines of your faculty or school. Please ensure strict compliance with these directives.
How to access it?
- Go to copilot.cloud.microsoft.com.
- Log in with your UNIL credentials.
- Check for the ‘Protected’ shield next to your profile: it guarantees that your queries are neither stored nor used for training AI models.