The arrival of Microsoft Copilot, a generative AI assistant, increases productivity, research and information-access capabilities across the Microsoft 365 environment. It therefore reinforces the need to properly govern and control data access in all Microsoft 365 apps: Teams, SharePoint, OneDrive, Outlook, and more.
Careful management of access, rights and sharing is essential before deploying Copilot, to ensure that users only have access to appropriate data and to avoid content oversharing.
Here are 5 tips for making data access secure when deploying Copilot:
Before deploying Copilot, companies must carry out an analysis and take a snapshot of their data and information assets. The challenge is to understand how information stored in OneDrive or SharePoint is shared inside and outside the organization, where the data is located, who handles the most sensitive and strategic data, and who has access to what.
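The "who has access to what" snapshot can be sketched in a few lines of Python. This is a minimal illustration assuming permission records have already been exported (for example via a Microsoft Graph permissions audit); the field names and sample paths are hypothetical.

```python
from collections import defaultdict

# Hypothetical permission records exported from a permissions audit:
# one row per (item, principal) pair, with an external-user flag.
permissions = [
    {"item": "/sites/Finance/Budget2024.xlsx", "principal": "alice@corp.example",
     "role": "write", "external": False},
    {"item": "/sites/Finance/Budget2024.xlsx", "principal": "bob@partner.example",
     "role": "read", "external": True},
    {"item": "/personal/alice/Notes.docx", "principal": "alice@corp.example",
     "role": "owner", "external": False},
]

def build_access_snapshot(records):
    """Aggregate records into an item-centric view: which principals can
    reach each item, and whether any of them are external."""
    snapshot = defaultdict(lambda: {"principals": set(), "shared_externally": False})
    for rec in records:
        entry = snapshot[rec["item"]]
        entry["principals"].add(rec["principal"])
        entry["shared_externally"] |= rec["external"]
    return dict(snapshot)

snapshot = build_access_snapshot(permissions)
```

Even this simple aggregation answers the two questions that matter before a Copilot rollout: who can reach each item, and whether any item is visible outside the organization.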
Analysing risks when deploying a tool like Microsoft Copilot is essential in order to identify the potential risks and threats to which the information system may be exposed and to minimise the attack surface.
Microsoft Copilot relies on the permissions and access policies already in place: it will not surface any document or information to a person who does not have the right to access it. However, unauthorised or malicious access can still occur if rights and permissions are not configured correctly.
It is therefore necessary to pay particular attention to access and authorisations, for example by mapping data access, rights and permissions in order to identify sensitive and critical points and correct them.
Visualising these critical points across the information assets lets the company see clearly where data is overexposed or accessed in a non-compliant way, so that it can establish an appropriate action and remediation plan.
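One way to surface such critical points is to apply simple rules that compare an item's sensitivity against its actual audience. The sketch below is illustrative only: the rule set, sensitivity levels and sample paths are assumptions, not a real classification engine.

```python
def flag_critical_points(items, broad_threshold=50):
    """Flag items whose sensitivity does not match their exposure.

    Each item is a dict with 'path', 'sensitivity' ('public' | 'internal' |
    'confidential'), 'audience_size' (number of principals with access) and
    an 'external' flag. Rules are deliberately simple, for illustration.
    """
    findings = []
    for it in items:
        if it["sensitivity"] == "confidential" and it["external"]:
            findings.append((it["path"], "confidential item shared externally"))
        elif it["sensitivity"] != "public" and it["audience_size"] > broad_threshold:
            findings.append((it["path"], "non-public item exposed to a broad audience"))
    return findings

# Hypothetical inventory built from the earlier access snapshot.
items = [
    {"path": "/sites/HR/Salaries.xlsx", "sensitivity": "confidential",
     "audience_size": 4, "external": True},
    {"path": "/sites/Comms/Newsletter.docx", "sensitivity": "public",
     "audience_size": 900, "external": False},
    {"path": "/sites/Legal/Contract.docx", "sensitivity": "internal",
     "audience_size": 120, "external": False},
]
findings = flag_critical_points(items)
```

The output of such a pass is exactly the input a remediation plan needs: a list of items, each with the reason it was flagged.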
In a "secure by design" approach, once the risks have been analysed, it is important to be able to minimise the attack surface and remedy critical points.
When deploying Copilot, ensure you have processes to identify potentially overshared content and to notify data owners, so that they can remediate items individually or through automation.
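Routing findings to the right person is the part most often done by hand. A minimal sketch, assuming a findings list and an item-to-owner mapping (both hypothetical), groups flagged items by owner so each owner receives a single consolidated notification:

```python
def build_remediation_queue(findings, owners, fallback="it-security@corp.example"):
    """Group flagged items by data owner so each owner gets one notification.

    `findings`: list of (item_path, reason) tuples.
    `owners`: mapping of item_path -> owner email; unknown items go to the
    fallback address (here, a hypothetical security-team mailbox).
    """
    queue = {}
    for path, reason in findings:
        owner = owners.get(path, fallback)
        queue.setdefault(owner, []).append({"item": path, "reason": reason})
    return queue

# Hypothetical findings and ownership data.
findings = [
    ("/sites/HR/Salaries.xlsx", "confidential item shared externally"),
    ("/sites/HR/Reviews.docx", "non-public item exposed to a broad audience"),
    ("/sites/Ops/Runbook.docx", "non-public item exposed to a broad audience"),
]
owners = {
    "/sites/HR/Salaries.xlsx": "hr-lead@corp.example",
    "/sites/HR/Reviews.docx": "hr-lead@corp.example",
}
queue = build_remediation_queue(findings, owners)
```

The fallback route matters in practice: overshared content with no identifiable owner is itself a governance finding.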
One of the main difficulties of this type of management is its constantly evolving nature: every day, new files are created and shared, and new permissions are granted.
Are sharing controls in place (e.g. default sharing link type, link expiration, site-owner sharing approvals)? How are changes investigated and audited regularly?
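A recurring audit job can enforce link-expiration policy mechanically. The sketch below checks anonymous links against a policy window; the link records, URLs and 30-day window are assumptions for illustration, not a statement of Microsoft 365 defaults.

```python
from datetime import datetime, timedelta, timezone

def audit_sharing_links(links, max_age_days=30, now=None):
    """Flag anonymous links that never expire or outlive the policy window.

    Each link is a dict with 'url', 'scope' ('anonymous' | 'organization'),
    'created' and 'expires' (datetime or None). Organization-scoped links
    are left to the access-mapping pass and skipped here.
    """
    now = now or datetime.now(timezone.utc)
    violations = []
    for link in links:
        if link["scope"] != "anonymous":
            continue
        if link["expires"] is None:
            violations.append((link["url"], "anonymous link without expiration"))
        elif link["expires"] - link["created"] > timedelta(days=max_age_days):
            violations.append((link["url"], "expiration exceeds policy window"))
    return violations

# Hypothetical sharing-link export; a fixed 'now' keeps the check reproducible.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
links = [
    {"url": "https://contoso.example/s/abc", "scope": "anonymous",
     "created": datetime(2024, 5, 1, tzinfo=timezone.utc), "expires": None},
    {"url": "https://contoso.example/s/def", "scope": "anonymous",
     "created": datetime(2024, 5, 1, tzinfo=timezone.utc),
     "expires": datetime(2024, 5, 15, tzinfo=timezone.utc)},
    {"url": "https://contoso.example/s/ghi", "scope": "organization",
     "created": datetime(2024, 5, 1, tzinfo=timezone.utc), "expires": None},
]
violations = audit_sharing_links(links, now=now)
```

Running such a check on a schedule, rather than once at deployment, is what addresses the "perpetual motion" problem above.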
Generative AI is a powerful and innovative tool that enables considerable productivity gains. To get maximum benefits, these new uses must be supported with practical cases and training. It is also essential to regulate these uses by adding limits and making users responsible for the risks, particularly around data security and the consequences of poor configuration/sharing, for example.
Deploying Copilot is a mark of trust in end users, but it requires a secure approach. To protect against data leaks and malicious intent, it is useful to implement an effective Data Access Governance strategy in which the user is a stakeholder. This process includes data inventory, clean-up, stakeholder engagement and user training.
Read more: Microsoft Copilot: the challenges for Data Security
The new DETOX dynamic audit solution for Microsoft 365 gives our customers the resources to effectively prepare for the mass adoption of the Microsoft Copilot generative AI tool.
It is an “all-in-one” solution that aims to prevent content oversharing by eliminating dangerous, risky or obsolete access, through a dynamic audit and automated remediation.
The solution includes an audit phase (collecting and analysing metadata), with the results presented in simple, clear dashboards (tenant status, risks, problem areas to be corrected).
The great strength of the DETOX solution is that it enables mass remediation by data owners. Users who have critical points to correct are targeted with a revalidation and verification campaign: they receive a personal dashboard, MyDataSecurity, where they check and correct points that require action (validation or correction). First audit, then remediate, then track changes to keep the environment compliant and secure.