What Copilot actually does with your data (and why that matters)

Microsoft 365 Copilot isn't a standalone AI tool – it's an AI layer built directly on top of the Microsoft Graph. The Microsoft Graph is the API that connects everything in your Microsoft 365 environment: emails, calendar events, Teams messages, SharePoint documents, OneDrive files, meeting transcripts, and more. When an employee asks Copilot a question, it reasons across all of that data – everything they have permission to access – to generate an answer.

That's what makes it useful. It's also what makes it a governance problem if you haven't prepared.

Copilot doesn't create new permissions. It doesn't bypass security controls or do anything technically improper. It simply makes the data an employee already has access to far more accessible and far easier to query – in plain English, conversationally, in seconds. The issue is that in most organisations, employees have access to considerably more data than they actively use or are even aware of. Copilot changes that equation significantly.
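
To make that concrete: any application calling the Microsoft Graph on a user's behalf gets exactly the same security trimming Copilot relies on. Here's a minimal Python sketch against the Graph search API (the /search/query endpoint is a real v1.0 route; token acquisition is omitted and the helper name is ours, not a library function). Whatever this returns for a given user is content they could already open today – Copilot just removes the friction of finding it.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumes a delegated access token for the signed-in user
# (obtained via MSAL or similar - acquisition omitted here).
HEADERS = {"Authorization": "Bearer <delegated-access-token>"}

def search_as_user(query_string: str) -> list:
    """Search SharePoint/OneDrive via the Graph search API.

    Results are security-trimmed: the caller only sees items their
    existing permissions allow - the same trimming Copilot relies on.
    """
    body = {
        "requests": [{
            "entityTypes": ["driveItem"],
            "query": {"queryString": query_string},
            "from": 0,
            "size": 10,
        }]
    }
    resp = requests.post(f"{GRAPH}/search/query",
                         headers=HEADERS, json=body, timeout=30)
    resp.raise_for_status()
    hits = []
    for result in resp.json().get("value", []):
        for container in result.get("hitsContainers", []):
            hits.extend(container.get("hits", []))
    return hits

# Anything this returns is already visible to the signed-in user today.
for hit in search_as_user("salary review"):
    print(hit["resource"].get("name"), "-", hit["resource"].get("webUrl"))
```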

The SharePoint permissions problem most businesses don't know they have

Ask most IT managers whether their SharePoint permissions are in good shape and you'll usually get a confident answer. Dig into the actual permissions, and a different picture tends to emerge.

Files shared broadly to speed up a project that ended two years ago. Old intranet sites that were never locked down. HR documents sitting in a library that was accidentally left open to all staff. Former employees and contractors who still have active accounts. SharePoint sites created by teams who didn't have a clear governance policy to follow – so they just gave everyone access to make things easier.

This is normal. It happens in virtually every organisation that's been using SharePoint for more than a few years. It's not a sign of negligence – it's a natural consequence of a platform that makes sharing easy and makes auditing hard.

The practical consequence, once Copilot is deployed: a junior employee can ask "what does [colleague] earn?" and – if salary data happens to be in a SharePoint folder accessible to all staff, which happens more often than HR would like to admit – they may get an answer. Copilot doesn't create that problem. It just makes an existing problem conversational.

What happens when you deploy Copilot without preparing the data

Businesses that deploy Copilot without doing the permissions groundwork first typically experience a predictable set of problems in the first weeks.

Users discover documents they didn't know existed – sometimes documents that shouldn't be visible to them at all. Copilot surfaces content from across the organisation, and that content doesn't always respect informal understandings about what's confidential. The informal rule that "everyone knows not to look at the finance folder" doesn't mean much when Copilot is summarising it in response to a reasonable-sounding question.

Outputs become inconsistent or unreliable. If your SharePoint contains outdated, duplicated or poorly structured content – which is common – Copilot generates outputs that reflect that chaos. Outdated policy documents, superseded price lists, old project plans – all of it is fair game. Garbage in, garbage out applies as directly to AI-assisted tools as it does to any data system.

Adoption suffers. When early users encounter unexpected or inaccurate results, word spreads quickly. The productivity gains Copilot is supposed to deliver don't materialise because trust in the tool breaks down before it's properly established. Recovering from a bad first impression is significantly harder than getting the deployment right first time.

The pre-deployment checklist: what needs to be in order first

There's no single step that resolves data governance for a Copilot deployment. It's a combination of several things, most of which are good practice regardless of Copilot.

SharePoint permissions audit. Start with what's over-shared. The Microsoft 365 admin centre includes sharing reports and site usage reports that give you a starting point. Third-party tools – ShareGate is commonly used – provide more granular visibility. The goal is to identify sites, libraries and individual documents that are accessible to a broader group than they should be, and to tighten those permissions before Copilot goes live.
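
If you want to script part of that audit rather than rely on the admin centre reports alone, the Graph API exposes sharing links at the file level. A rough Python sketch, assuming an app registration with read access to sites and files – token acquisition and recursion into subfolders are left out, and flag_broad_sharing is an illustrative name, not a library call:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumes an app-only token with Sites.Read.All / Files.Read.All -
# acquisition omitted. <access-token> and the drive ID are placeholders.
HEADERS = {"Authorization": "Bearer <access-token>"}

def flag_broad_sharing(drive_id: str) -> None:
    """Flag top-level items in a library shared org-wide or anonymously.

    A real audit would also recurse into folders; this sketch keeps to
    one level for brevity.
    """
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        page = requests.get(url, headers=HEADERS, timeout=30).json()
        for item in page.get("value", []):
            perms = requests.get(
                f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
                headers=HEADERS, timeout=30,
            ).json().get("value", [])
            for perm in perms:
                scope = perm.get("link", {}).get("scope")
                if scope in ("organization", "anonymous"):
                    print(f"{item['name']}: '{scope}' sharing link "
                          f"({', '.join(perm.get('roles', []))})")
        url = page.get("@odata.nextLink")  # follow paging until exhausted
```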

Remove orphaned access. Former employees, past contractors, and project teams whose projects ended months or years ago represent a category of access that's almost always broader than it should be. These accounts should be identified and either disabled or scoped down to the access they actually need. This is straightforward but often overlooked, because it requires someone to actually do it rather than just know it needs doing.
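
Stale accounts can be surfaced the same way. The sketch below (again Python against Graph, token handling omitted) lists enabled accounts with no recorded sign-in for six months – note that reading signInActivity requires audit-log permissions, and the 180-day threshold is purely illustrative:

```python
from datetime import datetime, timedelta, timezone
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumes an app-only token with User.Read.All plus AuditLog.Read.All
# (required to read signInActivity) - acquisition omitted.
HEADERS = {"Authorization": "Bearer <access-token>"}

STALE_AFTER = timedelta(days=180)  # illustrative threshold, not a standard

def find_stale_accounts() -> None:
    """List enabled accounts with no recorded sign-in inside the window."""
    cutoff = datetime.now(timezone.utc) - STALE_AFTER
    url = (f"{GRAPH}/users?$select=id,displayName,userPrincipalName,"
           "accountEnabled,signInActivity")
    while url:
        page = requests.get(url, headers=HEADERS, timeout=30).json()
        for user in page.get("value", []):
            if not user.get("accountEnabled"):
                continue  # already disabled
            last = (user.get("signInActivity") or {}).get("lastSignInDateTime")
            if last is None or datetime.fromisoformat(
                    last.replace("Z", "+00:00")) < cutoff:
                print(f"Stale: {user['userPrincipalName']} (last: {last})")
                # After human review, disable with:
                # requests.patch(f"{GRAPH}/users/{user['id']}",
                #                headers=HEADERS,
                #                json={"accountEnabled": False})
        url = page.get("@odata.nextLink")
```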

Review Teams channel membership. Every team is backed by a SharePoint site where channel files are stored, and private and shared channels get sites of their own – so team and channel membership directly determines who can access those files. A channel set up as a shared channel, or a team with broad membership, gives everyone in it – and therefore Copilot, acting on any member's behalf – access to all the files posted there. Channel membership reviews are often missed in SharePoint-focused audits.
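
A channel review can also be scripted. The following sketch walks one team's channels via Graph and flags shared channels plus anything unusually broad – the 50-member threshold is an arbitrary illustration, not a recommendation:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumes an app-only token with Team.ReadBasic.All and
# ChannelMember.Read.All - acquisition omitted.
HEADERS = {"Authorization": "Bearer <access-token>"}

MEMBER_THRESHOLD = 50  # arbitrary illustration - set your own bar

def review_channels(team_id: str) -> None:
    """Flag shared channels and unusually broad memberships in one team."""
    channels = requests.get(
        f"{GRAPH}/teams/{team_id}/channels",
        headers=HEADERS, timeout=30,
    ).json().get("value", [])
    for channel in channels:
        members = requests.get(
            f"{GRAPH}/teams/{team_id}/channels/{channel['id']}/members",
            headers=HEADERS, timeout=30,
        ).json().get("value", [])
        if channel.get("membershipType") == "shared":
            print(f"Shared channel: {channel['displayName']} "
                  f"({len(members)} direct members)")
        elif len(members) > MEMBER_THRESHOLD:
            print(f"Broad membership: {channel['displayName']} "
                  f"({len(members)} members)")
```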

Establish data ownership. Before rollout, identify who owns what data across each business area – HR, finance, operations, sales. Data owners are responsible for access decisions and for ensuring their content is appropriately structured and labelled. Without named ownership, governance decisions don't get made.

Document your data processing activities. Copilot processes personal data – employee data, customer data in emails and Teams messages, meeting content. Under GDPR, this needs to be documented in your record of processing activities (Article 30). Your Copilot deployment should be assessed as part of your data protection compliance, not treated as a purely IT decision.

Microsoft Purview and sensitivity labels: do you need them?

Microsoft recommends deploying Microsoft Purview Information Protection – the successor to Azure Information Protection – alongside Copilot. Purview allows you to classify content using sensitivity labels (Confidential, Highly Confidential, Internal Only, and so on) and to define rules that control how Copilot can interact with labelled content. You can restrict Copilot from referencing documents labelled as Highly Confidential in responses, for example, or prevent it from summarising content in certain libraries.

The honest answer on whether you need it: it depends on how sensitive your data is and how mature your existing governance is.

For organisations handling genuinely sensitive content – HR records, board materials, client-confidential documents, financial data – sensitivity labels add a meaningful layer of control that complements permissions-based governance. If someone with access to a Highly Confidential document asks Copilot about it, you can restrict what Copilot will do with that content even if the permissions technically allow access.

For smaller organisations with simpler data environments, Purview adds cost and complexity that may not be justified. The permissions audit approach – getting your access controls right before deployment – is the more practical first step, and for many businesses it's sufficient.

The key point is that Purview doesn't replace permissions governance; it complements it. Deploying Purview without fixing the underlying permissions problem doesn't solve the issue – it just adds labels to a chaotic permission structure.
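
If you do go down the labelling route, it's worth knowing where the gaps are before go-live. Graph exposes an extractSensitivityLabels action on files; the Python sketch below uses it to report unlabelled documents in one library. Treat it as a starting point only – response shapes, supported file types and licensing requirements for the label APIs vary, so verify against the current Graph documentation before relying on it:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
# Assumes an app-only token with Files.Read.All - acquisition omitted.
HEADERS = {"Authorization": "Bearer <access-token>"}

def report_unlabelled(drive_id: str) -> None:
    """Report files in one library that carry no sensitivity label."""
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        page = requests.get(url, headers=HEADERS, timeout=30).json()
        for item in page.get("value", []):
            if "file" not in item:
                continue  # folders skipped in this shallow sketch
            result = requests.post(
                f"{GRAPH}/drives/{drive_id}/items/{item['id']}"
                "/extractSensitivityLabels",
                headers=HEADERS, timeout=30,
            ).json()
            if not result.get("labels"):
                print(f"Unlabelled: {item['name']}")
        url = page.get("@odata.nextLink")
```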

User training and adoption: the step that gets skipped

Technical preparation is necessary. It isn't sufficient.

Copilot behaves differently from any tool most employees have used before. It's not a search engine, it's not a chatbot in the conventional sense, and it doesn't always make clear when it's working from limited or uncertain information. Users who go into it without any guidance tend to either over-trust outputs – treating Copilot-generated summaries as authoritative without verifying them – or under-use it, sticking to familiar workflows because the new tool feels unpredictable.

Good Copilot deployment includes training before go-live: what the tool does, what it doesn't do, when to trust it and when to verify, and how to prompt it effectively for common tasks. It also includes clear guidance on what's appropriate to ask – and an acknowledgement that some outputs will need checking, particularly early on.

It's also worth setting expectations internally about what Copilot will and won't deliver. It accelerates many tasks meaningfully – drafting, summarising, finding information – but it's not a replacement for human judgement on anything that matters. Framing it accurately from the start produces better adoption outcomes than overselling it and managing disappointment later.

Route B helps businesses prepare their Microsoft 365 environment for Copilot – from data governance and permissions reviews to deployment and training.

Get in Touch