Microsoft Copilot has been called one of the most powerful productivity tools on the planet.
Copilot is an AI assistant that lives inside each of your Microsoft 365 apps — Word, Excel, PowerPoint, Teams, Outlook, and so on. Microsoft's dream is to take the drudgery out of daily work and let humans focus on being creative problem-solvers.
What makes Copilot a different beast than ChatGPT and other AI tools is that it has access to everything you've ever worked on in 365. Copilot can instantly search and compile data from across your documents, presentations, email, calendar, notes, and contacts.
And therein lies the problem for information security teams. Copilot can access all the sensitive data that a user can access, which is often far too much. On average, 10% of a company's M365 data is open to all employees.
Copilot can also rapidly generate net new sensitive data that must be protected. Prior to the AI revolution, humans' ability to create and share data far outpaced the capacity to protect it. Just look at data breach trends. Generative AI pours kerosene on this fire.
There's a lot to unpack when it comes to generative AI as a whole: model poisoning, hallucination, deepfakes, etc. In this post, however, I'll focus specifically on data security and how your team can ensure a safe Copilot rollout.
Microsoft 365 Copilot use cases
The use cases of generative AI with a collaboration suite like M365 are limitless. It's easy to see why so many IT and security teams are clamoring to get early access and preparing their rollout plans. The productivity boosts will be enormous.
For example, you can open a blank Word document and ask Copilot to draft a proposal for a client based on a target data set, which can include OneNote pages, PowerPoint decks, and other Office docs. In a matter of seconds, you have a full-blown proposal.

Here are a few more examples Microsoft gave during their launch event:
- Copilot can join your Teams meetings and summarize in real time what's being discussed, capture action items, and tell you which questions went unresolved in the meeting.
- Copilot in Outlook can help you triage your inbox, prioritize emails, summarize threads, and generate replies for you.
- Copilot in Excel can analyze raw data and give you insights, trends, and suggestions.
How Microsoft 365 Copilot works
Here's a simple overview of how a Copilot prompt is processed (a rough code sketch follows the list):
- A user inputs a prompt in an app like Word, Outlook, or PowerPoint.
- Microsoft gathers the user's business context based on their M365 permissions.
- The prompt is sent to the LLM (like GPT-4) to generate a response.
- Microsoft performs post-processing responsible AI checks.
- Microsoft returns the response and commands back to the M365 app.
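To make those steps concrete, here's a minimal Python sketch of the flow. Everything in it is a hypothetical stand-in (Microsoft doesn't expose this pipeline as an API); the point it illustrates is that grounding data is scoped to the user's existing M365 permissions before the prompt ever reaches the model.

```python
# Illustrative sketch of the Copilot prompt flow described above.
# Every function here is a hypothetical stand-in, not a real Microsoft API.

def gather_grounding_data(user: str, prompt: str) -> list[str]:
    # Stand-in for grounding: in reality, Microsoft Graph retrieves only
    # content the user can already view (docs, mail, calendar, chats).
    permitted_content = {"alice": ["Q3 proposal.docx", "budget.xlsx"]}
    return permitted_content.get(user, [])

def call_llm(prompt: str, context: list[str]) -> str:
    # Stand-in for the call to the foundation LLM (e.g., GPT-4).
    return f"Draft grounded in {len(context)} permitted document(s): {prompt}"

def passes_responsible_ai_checks(response: str) -> bool:
    # Stand-in for Microsoft's post-processing responsible AI checks.
    return "blocked-term" not in response

def handle_copilot_prompt(user: str, prompt: str) -> str:
    context = gather_grounding_data(user, prompt)   # gather business context
    response = call_llm(prompt, context)            # send prompt to the LLM
    if not passes_responsible_ai_checks(response):  # post-process the output
        return "Response withheld by responsible AI checks."
    return response                                 # return to the M365 app

print(handle_copilot_prompt("alice", "Draft a client proposal"))
```

Note the security implication baked into the first step: if a user has view access to a file, so does their Copilot.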
Microsoft 365 Copilot security model
With Microsoft, there's always an extreme tension between productivity and security.
This was on display during the coronavirus pandemic, when IT teams hastily deployed Microsoft Teams without first fully understanding how the underlying security model worked or how in-shape their organization's M365 permissions, groups, and link policies were.
The good news:
- Tenant isolation. Copilot only uses data from the current user's M365 tenant. The AI tool will not surface data from other tenants in which the user may be a guest, nor from any tenants that might be set up with cross-tenant sync.
- Training boundaries. Copilot doesn't use any of your business data to train the foundational LLMs that Copilot uses for all tenants. You shouldn't have to worry about your proprietary data showing up in responses to other users in other tenants.
The bad news:
- Permissions. Copilot surfaces all organizational data to which individual users have at least view permissions.
- Labels. Copilot-generated content will not inherit the MPIP labels of the files Copilot sourced its response from.
- Humans. Copilot's responses aren't guaranteed to be 100% factual or safe; humans must take responsibility for reviewing AI-generated content.
Let's take the bad news one at a time.
Permissions
Granting Copilot access to only what a user can access would be a good idea if companies could easily enforce least privilege in Microsoft 365.
Microsoft states in its Copilot data security documentation:
“It's important that you're using the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content within your organization.”
Source: Data, Privacy, and Security for Microsoft 365 Copilot
We know empirically, however, that most organizations are about as far from least privilege as they can be. Just take a look at some of the stats from Microsoft's own State of Cloud Permissions Risk report.

This picture matches what Varonis sees when we perform thousands of Data Risk Assessments for companies using Microsoft 365 each year. In our report, The Great SaaS Data Exposure, we found that the average M365 tenant has:
- 40+ million unique permissions
- 113K+ sensitive records shared publicly
- 27K+ sharing links
Why does this happen? Microsoft 365 permissions are extremely complex. Just think about all the ways a user can gain access to data:
- Direct user permissions
- Microsoft 365 group permissions
- SharePoint local permissions (with custom levels)
- Guest access
- External access
- Public access
- Link access (anyone, org-wide, direct, guest)
To make matters worse, permissions are largely in the hands of end users, not IT or security teams.
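If you want a rough sense of your own link exposure, one starting point is enumerating sharing links through the Microsoft Graph API. Below is a simplified sketch, assuming you've already acquired an access token with Files.Read.All (e.g., via MSAL); it only walks the root folder of a single drive and omits pagination, recursion, and error handling.

```python
# Sketch: flag "anyone" and org-wide sharing links on a drive's root
# folder via Microsoft Graph. Simplified for illustration only.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # placeholder; acquire via MSAL in practice
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def audit_root_links(drive_id: str) -> None:
    # List items in the drive's root folder (no pagination handled here).
    items = requests.get(
        f"{GRAPH}/drives/{drive_id}/root/children", headers=HEADERS
    ).json().get("value", [])

    for item in items:
        # Fetch the permissions on each item, including sharing links.
        perms = requests.get(
            f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
            headers=HEADERS,
        ).json().get("value", [])

        for perm in perms:
            link = perm.get("link")
            # "anonymous" = anyone with the link; "organization" = org-wide.
            if link and link.get("scope") in ("anonymous", "organization"):
                print(f"{item['name']}: {link['scope']} {link.get('type')} link")

audit_root_links("<drive-id>")  # placeholder drive ID
```

A real audit would need to recurse through every site and folder, which is exactly why this problem is so hard to solve by hand at tenant scale.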
Labels
Microsoft relies heavily on sensitivity labels to enforce DLP policies, apply encryption, and broadly prevent data leaks. In practice, however, getting labels to work is difficult, especially if you rely on humans to apply sensitivity labels.
Microsoft paints a rosy picture of labeling and blocking as the ultimate safety net for your data. Reality reveals a bleaker scenario. As humans create data, labeling frequently lags behind or becomes outdated.
Blocking or encrypting data can add friction to workflows, and labeling technologies are limited to specific file types. The more labels an organization has, the more confusing it can become for users. This is especially acute for larger organizations.
The efficacy of label-based data protection will surely degrade as AI generates orders of magnitude more data that requires accurate, automatically updated labels.
Are my labels okay?
Varonis can validate and improve an organization's Microsoft sensitivity labeling by scanning, discovering, and fixing (see the sketch after this list):
- Sensitive files without a label
- Sensitive files with an incorrect label
- Non-sensitive files with a sensitive label
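As a toy illustration of what that kind of validation checks, here's a short Python sketch. The file inventory, classifier verdicts, and label names are hypothetical stand-ins, not Varonis's implementation or a real M365 API.

```python
# Toy sketch of label-drift detection: compare each file's detected
# sensitivity (e.g., from a content classifier) with its applied label.
from dataclasses import dataclass
from typing import Optional

SENSITIVE_LABELS = {"Confidential", "Highly Confidential"}

@dataclass
class File:
    name: str
    is_sensitive: bool      # hypothetical classifier verdict
    label: Optional[str]    # applied sensitivity label, if any

def find_label_drift(files: list[File]) -> dict[str, list[str]]:
    drift = {"missing_label": [], "wrong_label": [], "over_labeled": []}
    for f in files:
        if f.is_sensitive and f.label is None:
            drift["missing_label"].append(f.name)    # sensitive, unlabeled
        elif f.is_sensitive and f.label not in SENSITIVE_LABELS:
            drift["wrong_label"].append(f.name)      # sensitive, mislabeled
        elif not f.is_sensitive and f.label in SENSITIVE_LABELS:
            drift["over_labeled"].append(f.name)     # over-restrictive label
    return drift

inventory = [
    File("payroll.xlsx", is_sensitive=True, label=None),
    File("ssn-list.docx", is_sensitive=True, label="General"),
    File("lunch-menu.docx", is_sensitive=False, label="Confidential"),
]
print(find_label_drift(inventory))
```

The hard part in practice isn't this comparison; it's classifying content accurately at scale, which is precisely what degrades when AI generates new files faster than labels can keep up.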
Humans
AI can make humans lazy. Content generated by LLMs like GPT-4 isn't just good, it's great. In many cases, the speed and the quality far surpass what a human can do. As a result, people start to blindly trust AI to produce safe and accurate responses.
We have already seen real-world scenarios in which Copilot drafts a proposal for a client and includes sensitive data belonging to a completely different client. The user hits “send” after a quick glance (or no glance at all), and now you have a privacy or data breach scenario on your hands.
Getting your tenant security-ready for Copilot
It's critical to have a sense of your data security posture before your Copilot rollout. Now that Copilot is generally available, it's a great time to get your security controls in place.
Varonis protects thousands of Microsoft 365 customers with our Data Security Platform, which provides a real-time view of risk and the ability to automatically enforce least privilege.
We can help you address the biggest security risks that come with Copilot with virtually no manual effort. With Varonis for Microsoft 365, you can:
- Automatically discover and classify all sensitive AI-generated content.
- Automatically ensure that MPIP labels are correctly applied.
- Automatically enforce least privilege permissions.
- Continuously monitor sensitive data in M365 and alert and respond to abnormal behavior.
The best way to start is with a free risk assessment. It takes minutes to set up, and within a day or two you'll have a real-time view of sensitive data risk.
This article originally appeared on the Varonis blog.