Microsoft Copilot is delivering on the promise of artificial intelligence (AI), giving small- and medium-sized businesses (SMBs) access to enterprise-like tools for productivity and efficiency. However, understanding exactly what data Copilot uses, how to get the most out of Copilot search, and how secure that data is can be another story. Read on to learn best practices around Copilot search and data for SMBs and the managed service providers (MSPs) serving them.
Answering your top questions about Copilot data, functionality, and security
While Copilot unlocks benefits for SMBs such as quick content creation, email drafting, and meeting summarization, implementing a tool like this highlights data security concerns that previously were not seen as a priority for small businesses. Now that Copilot can review your company’s data, an entirely new set of questions around data and security comes into play.
However, Microsoft’s commitments to privacy, security, compliance, and responsible AI practices should help alleviate any security concerns MSPs or SMBs may have about how their data is used. To ease your mind further, here are answers to some of the top questions we’ve received about how Copilot uses organizational data.
Question 1: Is my data training the large language model (LLM) behind Copilot?
The answer to this question is no. When you use Copilot with commercial data protection, chat data for users signed in with their Entra ID is not saved; chat prompts and responses are discarded. In addition, no one at Microsoft views your data, and chat data is not used to train the LLMs that power Copilot. In short, Copilot protects your personal and company data.
Question 2: If I search next to my coworker, will our answers be the same?
No, because the data that Copilot searches against will not be the same. Microsoft 365 Copilot only works with the organizational data each user is permitted to access, so two people can receive different answers to the same prompt. This is why Pax8 believes data security, including access management, is the first and most important step in AI readiness.
When prepping for AI, another important step to take is to restrict SharePoint search. This enables you to:
- Reduce risk of oversharing while keeping the same permissions for what users can access
- Maintain user access to OneDrive, files, and calendars
- Restrict SharePoint search for non-Copilot users
Essentially, this step preserves your users’ current access while keeping SharePoint sites they haven’t been granted access to out of search results, thereby improving your security posture.
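To make the idea concrete, here is a minimal Python sketch of the concept behind restricted search. It is purely illustrative: the site names, permission sets, and helper functions are hypothetical, and the real feature is configured through Microsoft’s admin tooling rather than code like this.

```python
# Conceptual sketch only: the names below are hypothetical, not Microsoft APIs.

def restricted_search(query, user_permitted_sites, search_allowed_sites, documents):
    """Return documents that match the query, live on a site the user can access,
    and sit on the admin-curated search allow-list."""
    searchable_sites = user_permitted_sites & search_allowed_sites
    return [
        doc for doc in documents
        if doc["site"] in searchable_sites and query.lower() in doc["text"].lower()
    ]

def can_open(doc, user_permitted_sites):
    """Direct access (e.g., opening a file from OneDrive) is unchanged:
    it only checks the user's own permissions, not the search allow-list."""
    return doc["site"] in user_permitted_sites

docs = [
    {"site": "HR-Payroll", "text": "salary bands draft"},
    {"site": "Marketing", "text": "salary survey summary"},
]
print(restricted_search("salary", {"HR-Payroll", "Marketing"}, {"Marketing"}, docs))
print(can_open(docs[0], {"HR-Payroll", "Marketing"}))  # still True
```

The point of the sketch: search scope narrows to the allow-list, but the user’s existing permissions and direct access to their own files, OneDrive, and calendar are untouched.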
Question 3: What is the semantic index?
The semantic index for Microsoft 365 Copilot is a powerful feature that enhances search capabilities by connecting with your personal and organizational data via Microsoft Graph.
The semantic index sits on top of Microsoft Graph and interprets user queries to produce contextually relevant responses, giving users more effective results. When you search, the semantic index connects you with relevant information based on the connections between content and people in your network. It uses mathematical representations (vectors) to understand relationships between data points, such as words or images.
Creating vectorized indices enables conceptual understanding, helping identify what users are looking for and surfacing relevant organizational content. Microsoft Graph indexes content and signals from most Microsoft 365 applications in your tenant.
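As a rough illustration of what vectors buy you, the toy Python sketch below ranks documents by cosine similarity to a query vector. The document names and three-dimensional vectors are made up for the example; Microsoft’s actual embeddings and index are far richer, but the principle of matching by conceptual closeness rather than exact keywords is the same.

```python
# Toy illustration of vector-based ("semantic") matching; the vectors are
# hand-made stand-ins, not the embeddings Microsoft actually computes.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical 3-dimensional embeddings for a few documents and a query.
documents = {
    "Q3 sales forecast.xlsx":   [0.90, 0.10, 0.20],
    "Team offsite photos.zip":  [0.10, 0.80, 0.10],
    "Revenue projections.docx": [0.85, 0.15, 0.30],
}
query = [0.88, 0.12, 0.25]  # e.g., "next quarter revenue outlook"

# Rank documents by conceptual closeness rather than exact keyword overlap.
for name, vec in sorted(documents.items(),
                        key=lambda kv: cosine_similarity(query, kv[1]),
                        reverse=True):
    print(f"{cosine_similarity(query, vec):.3f}  {name}")
```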
Question 4: What are the security implications of Copilot and the semantic index?
The semantic index follows the security and policies of Microsoft Graph. User queries, whether through search or in Microsoft Copilot, always operate within the user’s security context. Only content accessible to the user is shown, ensuring data privacy and compliance.
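Building on the toy ranking above, here is a hedged sketch of what operating within the user’s security context means in practice: results are trimmed against the user’s permissions before anything is returned. The access-control list and helper function here are hypothetical, not Microsoft Graph APIs.

```python
# Conceptual sketch only: security trimming of ranked results.
# ACLs and helper names are hypothetical, not Microsoft Graph APIs.

def security_trim(ranked_results, user_id, acl):
    """Drop any result the user is not permitted to see before returning it."""
    return [item for item in ranked_results if user_id in acl.get(item, set())]

acl = {
    "Revenue projections.docx": {"alice", "bob"},
    "Q3 sales forecast.xlsx":   {"alice"},
}
ranked = ["Revenue projections.docx", "Q3 sales forecast.xlsx"]

print(security_trim(ranked, "alice", acl))  # both documents
print(security_trim(ranked, "bob", acl))    # only the document Bob can access
```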
The semantic index respects all organizational boundaries within your tenant. It’s built on Microsoft’s comprehensive approach to security, compliance, and privacy. Microsoft Copilot for Microsoft 365 operates with multiple protections, including blocking harmful content and detecting protected material. And it maintains compliance commitments, including GDPR and EU Data Boundary.
In short, the semantic index ensures that only relevant, authorized information is surfaced, and the results it provides should continue to evolve as Microsoft expands Copilot’s capabilities.
What if my Copilot question was not answered here?
At Pax8, we have developed a number of resources to help MSPs and SMBs succeed in their Copilot journeys.
- Visit the Pax8 Copilot page
- Watch Copilot Academy sessions
- Check out the Copilot labs from Microsoft
- Attend our Monthly Microsoft Updates to learn about the latest products and features and ask questions
If you have other questions or concerns, contact your Pax8 representative. We’re here to help you and your clients succeed with Copilot.