Microsoft 365

14 February 2024

Microsoft Copilot: the challenges for Data Security

Microsoft Copilot is the generative artificial intelligence that lets you easily access the information you need across the Microsoft 365 environment. It is a revolutionary productivity tool: no more time wasted searching for information or gathering documents for a project. Just ask Microsoft Copilot: it carries out the research, analyses the results and provides you with a summary or a ready-to-send email, based on your request.

This power, however, creates challenges for data security, particularly when it comes to access governance.

With more than 2 billion pieces of data created every day in the Microsoft 365 environment, Microsoft recommends ensuring that the appropriate users or groups have the right access to relevant content within your organisation.

Managing risks linked to access, rights and sharing is therefore essential before deploying Copilot AI. So, how should you prepare your information system, your data, before using Microsoft Copilot? What are the associated risks? What solutions should you implement for trouble-free and secure use?

3 security concerns with Microsoft Copilot AI

On the agenda:

  • Copilot and data security: the risk of overexposure
  • Microsoft Copilot and sensitive data
  • Copilot: preparing for data security

1. Copilot and data security: the risk of overexposure

Microsoft Copilot is a generative AI tool that makes it easy for users to find documents relevant to their work. However, it can also expose sensitive data if it is not properly protected (rights, storage, etc.).

Data overexposure with Microsoft Copilot is the main risk for the security and confidentiality of information assets.

The intelligent assistant draws on files, messages, calendars and contacts shared with other people, inside or outside the organisation. If sharing and permission settings are not properly configured or controlled, data may be exposed to unauthorised or unwanted access.

Some examples which could pose problems and lead to financial, legal or even reputational consequences:

  • An incorrectly stored document: a financial plan, pay slips or the strategic plan should not be accessible to everyone, but if they are saved in a space whose rights are open to the entire company, Copilot could surface them to any employee whose search matches.
  • A document thought to be well hidden in sub-directories: obscurity is not protection, and Copilot could still give access to it.
  • Incorrectly configured sharing: a link granted to too large a group, to the entire company, or to an external guest.

To avoid the risks associated with data over-sharing in Microsoft 365, adopt data governance policies and procedures that protect the organisation's information assets. It is also important to manage data access, sharing and volumes effectively, and to keep the environment and information assets under control. Finally, regularly check that important documents are properly secured and shared only with the appropriate recipients.
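Such a regular check can be sketched in a few lines. The following is an illustrative example only, assuming a simple in-house inventory of documents with a sensitivity level and a sharing scope; it is not a Microsoft 365 API, but shows the kind of rule a governance review applies.

```python
# Hypothetical sketch: flag documents whose sharing scope is broader than
# their sensitivity allows. The data model, sensitivity levels and scope
# names below are illustrative assumptions, not Microsoft 365 identifiers.

BROAD_SCOPES = {"everyone", "organisation-wide", "anonymous-link"}

def flag_overshared(documents):
    """Return names of sensitive documents exposed through a broad sharing scope."""
    findings = []
    for doc in documents:
        if doc["sensitivity"] in {"confidential", "restricted"} and doc["scope"] in BROAD_SCOPES:
            findings.append(doc["name"])
    return findings

docs = [
    {"name": "strategic-plan.docx", "sensitivity": "confidential", "scope": "organisation-wide"},
    {"name": "canteen-menu.pdf", "sensitivity": "public", "scope": "everyone"},
    {"name": "payroll-2024.xlsx", "sensitivity": "restricted", "scope": "hr-team"},
]

print(flag_overshared(docs))  # only the confidential, broadly shared file is reported
```

In a real environment, the inventory would come from your tenant's permission reports rather than a hard-coded list, but the review logic stays the same: cross sensitivity against exposure.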



2. Microsoft Copilot and sensitive data

One of Copilot's greatest strengths is that it does not override existing permissions. If a user does not have access to a specific document, Copilot will not suggest it, no matter how relevant the request.

However, you must be vigilant when configuring rights and authorisations, especially for SharePoint sites, Teams groups and strategic or confidential documents, so that data access is restricted to authorised persons.

TIP: appoint and involve at least one member responsible for managing the rights of each group, team or sensitive site.

According to the Just-Enough-Access principle, Microsoft also recommends using Microsoft Purview for sensitive data and implementing data loss prevention (DLP) and suspicious user activity detection systems.

Microsoft Purview enforces classification (sensitivity) labels when Copilot queries sensitive documents. For this to work, you must first define a classification strategy and train users to classify confidential data systematically.
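A classification strategy only protects data if it is actually applied. As a minimal sketch, assuming an illustrative label taxonomy and document model (not the Microsoft Purview API), a pre-flight check can list every document that is missing a valid label before it is exposed to search and AI tools:

```python
# Illustrative sketch only: find documents whose classification label is
# missing or outside the agreed taxonomy. Label names are assumptions.

VALID_LABELS = {"public", "internal", "confidential", "restricted"}

def unlabeled(documents):
    """Return names of documents with a missing or unrecognised label."""
    return [d["name"] for d in documents if d.get("label") not in VALID_LABELS]

inventory = [
    {"name": "roadmap.pptx", "label": "confidential"},
    {"name": "notes.txt"},                         # never classified
    {"name": "draft.docx", "label": "secretish"},  # label not in the taxonomy
]

print(unlabeled(inventory))  # documents to send back for classification
```

Running such a check regularly turns "train users to classify data" from a one-off instruction into a measurable control.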


3. Copilot: preparing for data security

Fine-grained management of data access is therefore essential to keep data security, confidentiality and exposure in the company's information system under control. Who has access? Are all permissions and rights compliant and legitimate?

Some prerequisites for deploying Copilot securely:

  • Apply the principle of least privilege
  • Analyse the risk of data overexposure
  • Map and classify sensitive data (need to know)
  • Review, correct, clean rights regularly
  • Empower Teams and SharePoint owners
  • Raise awareness, support internal use
  • Involve users in rights review mechanisms prior to mass adoption
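The first and fourth prerequisites, least privilege and regular rights review, can be sketched together: compare what each user has been granted against what their role actually needs, and list the excess to revoke. The role names and permission model below are illustrative assumptions, not output from any Microsoft tool.

```python
# Hypothetical least-privilege review sketch: for each user, report the
# permissions that exceed what their role requires. All names are invented.

REQUIRED = {
    "finance": {"finance-site"},
    "hr": {"hr-site"},
    "intern": set(),  # interns need no standing access in this example
}

GRANTED = {
    "alice (finance)": ("finance", {"finance-site", "hr-site"}),
    "bob (intern)": ("intern", {"finance-site"}),
}

def excess_rights(granted, required):
    """Map each user to the granted permissions their role does not justify."""
    report = {}
    for user, (role, perms) in granted.items():
        excess = perms - required.get(role, set())
        if excess:
            report[user] = sorted(excess)
    return report

print(excess_rights(GRANTED, REQUIRED))  # rights to review and revoke
```

The same comparison, fed with real group memberships and site permissions, is what a rights-review campaign automates at scale.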

IDECSI supports its customers in this revolution with a “DETOX for Microsoft 365” campaign to effectively manage and clean up risky permissions:

  • Risk audit & visualisation
  • Review of rights and large-scale remediation
  • Measuring and monitoring the performance of this DETOX action

DETOX for M365: check-up, correction and results

