Service components for Copilot guidance for Sensitive and Regulated customers

The Microsoft 365 Copilot configuration and planning guide is intended for sensitive and regulated customers in Australia and New Zealand. This guide aligns with the Australian Signals Directorate (ASD) Blueprint for Secure Cloud configuration guidance for Microsoft 365.

This article examines service components and the service boundary, which are essential for understanding Copilot’s security and compliance capabilities, including how they relate to the ASD Blueprint.

Service components

Microsoft Global Network

The Microsoft Global Network is a vast infrastructure that supports the Microsoft 365 service boundary by ensuring secure and efficient data flow. It includes more than 150 peering locations where networks and Internet Service Providers (ISPs) peer with the Microsoft Global Network, which helps optimize connectivity and reduce latency. This network ensures that user connections to Microsoft 365 services take the shortest possible route to the nearest Microsoft Global Network entry point, maintaining high performance and reliability.

The Microsoft Global Network hosts the Bing API service, which grounds responses with the latest information from the web, and other Microsoft services that agencies can connect to via encrypted flows.

Microsoft 365 service boundary

Microsoft 365 has a service boundary that encompasses the various Microsoft 365 services within it, ensuring they adhere to Microsoft 365’s architectural, software engineering, security, compliance, and privacy standards and controls. This boundary delineates the scope of the Microsoft 365 service from the wider Microsoft Cloud offerings and maintains a consistent standard across its components. Systems inside the boundary are subject to Microsoft 365’s rigorous compliance standards.

Microsoft 365 Copilot operates within the Microsoft 365 service boundary, making it an integral part of the Microsoft 365 suite alongside other services such as SharePoint, Exchange, Teams, Planner, Microsoft 365 Search, and others. Microsoft 365 Copilot is a core online service, which makes it subject to the strongest set of security and compliance contractual commitments within the Microsoft Product Terms.

Copilot Orchestrator

The Copilot Orchestrator, represented as Microsoft 365 Copilot, manages all relevant interactions. The Copilot Orchestrator isn't an AI, but a coordinator across AI models, data sources, and plugins that comprise the Copilot experience.

The Copilot Orchestrator receives the interaction from the end user and initiates a Retrieval-Augmented Generation (RAG) process to generate a natural language response, which is processed through several steps before being delivered back to the user.

For more information about the Retrieval-Augmented Generation process, see Microsoft’s Azure OpenAI Service online documentation. Microsoft has documented the data privacy commitments for Microsoft 365 Copilot and information about the web search component.
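
The following Python sketch illustrates the general shape of a retrieval-augmented generation loop. It's a conceptual illustration only, not the Copilot Orchestrator's implementation; every function in it is a hypothetical placeholder for the retrieval and model steps described above.

```python
# Conceptual sketch of a retrieval-augmented generation (RAG) loop.
# All functions are hypothetical stand-ins, not Copilot internals.

def search_graph(prompt: str) -> list[str]:
    """Stand-in for retrieving permission-trimmed content via Microsoft Graph."""
    return ["<organizational content the user is allowed to see>"]

def search_web(prompt: str) -> list[str]:
    """Stand-in for the optional Bing web-grounding call."""
    return ["<latest public web content>"]

def call_llm(grounded_prompt: str) -> str:
    """Stand-in for the large language model call."""
    return f"<response generated from: {grounded_prompt[:60]}...>"

def answer_with_rag(user_prompt: str, web_grounding_enabled: bool = True) -> str:
    # 1. Retrieve organizational context (security-trimmed to the user).
    context = search_graph(user_prompt)
    # 2. Optionally add public web context from the Bing Index.
    if web_grounding_enabled:
        context += search_web(user_prompt)
    # 3. Compose a grounded prompt that combines context with the request.
    grounded_prompt = "Context:\n" + "\n".join(context) + "\n\nUser request: " + user_prompt
    # 4. Send the grounded prompt to the model; post-processing (responsible
    #    AI checks, citation formatting) would follow in the real service.
    return call_llm(grounded_prompt)

print(answer_with_rag("Summarise last week's project updates"))
```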

Large language models

Large language models (LLMs) are advanced AI tools that interpret and generate text in a way that resembles natural human language. They're trained on vast amounts of text data, learning statistical relationships that enable them to perform a wide range of language-related tasks. LLMs can generate coherent and contextually relevant text, translate languages, summarize content, answer questions, and even help with creative writing or code generation.

Microsoft 365 Copilot uses a combination of large language models, which we detail in our Transparency Note.

Microsoft hosts the large language models used by Microsoft 365 Copilot in an instance of the Azure OpenAI Service. This instance of the Azure OpenAI Service is exclusively for Microsoft 365 and is within the Microsoft 365 service boundary. Microsoft doesn't train the large language models on customer data, nor does the Azure OpenAI Service retain customer data. Microsoft 365 Copilot automatically inherits your organization’s security, compliance, and privacy policies set in Microsoft 365, along with the content access permissions of the end user who initiates the interaction.
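
The dedicated instance that serves Microsoft 365 Copilot can't be called by customers directly. Purely as an illustration of what sending a grounded prompt to an Azure OpenAI-hosted model looks like, the following minimal sketch targets a customer-owned Azure OpenAI Service resource using the openai Python package; the endpoint, key, deployment name, and grounding text are placeholders.

```python
# Minimal sketch of a grounded chat completion against a customer-owned
# Azure OpenAI Service resource. This is NOT how Microsoft 365 Copilot's
# dedicated instance is accessed; endpoint and deployment are placeholders.
import os
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://contoso.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

grounding = "Retrieved context: <permission-trimmed content from Microsoft Graph>"
completion = client.chat.completions.create(
    model="gpt-4o",  # your deployment name
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"{grounding}\n\nQuestion: Summarise the project status."},
    ],
)
print(completion.choices[0].message.content)
```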

Microsoft Graph

The Graph is an integral part of every organization’s Microsoft 365 tenancy. It's where the search index resides, and it's the mechanism that controls access to a customer's content. The Graph is the gatekeeper to content and plays a central role in all Microsoft 365 interactions. It enforces the security permissions present on a customer’s content and provides secure, compliant, auditable access to that content.

Microsoft 365 Copilot accesses content through the same mechanism that users interact with when they access their files or perform a search on a day-to-day basis.

In this way, Microsoft 365 Copilot can only access content the end user already has access to. Therefore, the large language model can't decide to access content the user doesn't have access to, as it's constrained by the same mechanism that constrains the user in other scenarios outside of Copilot.
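
The Microsoft Search API in Microsoft Graph illustrates this permission trimming: results returned to a delegated query include only items the signed-in user can already access. The sketch below assumes a delegated access token has been obtained separately (for example, via MSAL) and uses a placeholder query.

```python
# Sketch: query the Microsoft Search API with a delegated user token.
# Results are security-trimmed to the signed-in user's permissions.
# Acquiring USER_ACCESS_TOKEN (e.g., via MSAL) is out of scope here.
import requests

USER_ACCESS_TOKEN = "<delegated token for the signed-in user>"

response = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {USER_ACCESS_TOKEN}"},
    json={
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {"queryString": "Project Sunshine status report"},
            }
        ]
    },
    timeout=30,
)
response.raise_for_status()
for hit_container in response.json()["value"][0]["hitsContainers"]:
    for hit in hit_container.get("hits", []):
        print(hit["resource"].get("name"), hit.get("summary"))
```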

Microsoft 365 Copilot introduces an additional search indexing method to embed semantic understanding of concepts and language into the Microsoft Graph and enable Copilot and Microsoft 365 Search to better understand natural language expressions. This indexing method significantly improves Copilot's ability to locate and use the most relevant content. This additional natural language data is called the Semantic Index.
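
Microsoft doesn't publish the Semantic Index's internal implementation, but the underlying idea (representing text as embedding vectors so conceptually related content can be matched even when keywords differ) can be illustrated with a small cosine-similarity sketch. The vectors and documents below are invented toy values, not Semantic Index data.

```python
# Conceptual illustration of semantic (embedding-based) retrieval,
# as opposed to keyword matching. Vectors are invented toy values;
# this is not the Semantic Index implementation.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy embeddings: in practice these come from an embedding model.
documents = {
    "Leave policy for annual holidays": [0.9, 0.1, 0.2],
    "Quarterly revenue figures": [0.1, 0.8, 0.3],
}
query_embedding = [0.85, 0.15, 0.25]  # embedding of "how much vacation can I take?"

best = max(documents, key=lambda doc: cosine_similarity(documents[doc], query_embedding))
print("Most semantically relevant document:", best)
```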

Copilot Agents

Copilot Agents are part of the Microsoft 365 Copilot ecosystem. They enhance productivity by automating tasks and processes through advanced AI capabilities. They're available in both Microsoft 365 Copilot and in Copilot Chat experiences.

Copilot Agents connect to an organization’s knowledge and data sources. By doing so, they can access and utilize specific documents, databases, and other information repositories to provide accurate and contextually relevant responses. This connectivity is facilitated through APIs and connectors that link the agents to various data sources, ensuring that they can retrieve and process the necessary information in real-time. Additionally, Copilot Agents are designed to be extensible, allowing developers to equip them with new skills and capabilities tailored to specific business needs.
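
As a simple, hypothetical illustration of that pattern (an agent retrieving records from a line-of-business API and grounding its answer in them), the following sketch uses an invented case-management API. Real agents are built with Copilot Studio and related tooling rather than hand-coded like this.

```python
# Hypothetical sketch of the agent pattern: pull records from a
# line-of-business API, then ground the model's answer in them.
# The case-management URL and API are invented for illustration.
import requests

def fetch_open_cases(api_base: str, token: str) -> list[dict]:
    """Retrieve records from a hypothetical case-management system."""
    resp = requests.get(
        f"{api_base}/cases?status=open",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["cases"]

def build_grounded_prompt(question: str, cases: list[dict]) -> str:
    """Combine the user's question with retrieved records for the model."""
    context = "\n".join(f"- Case {c['id']}: {c['summary']}" for c in cases)
    return f"Open cases:\n{context}\n\nQuestion: {question}"

# Example wiring (values are placeholders):
# cases = fetch_open_cases("https://cases.example.gov.au/api", "<token>")
# prompt = build_grounded_prompt("Which cases are overdue?", cases)
```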

Service architecture and information flow

Microsoft 365 Copilot service architecture

The following diagram helps Microsoft 365 Copilot customers understand the service architecture and provides context for this page.

An image that shows the architectural components of Microsoft 365 Copilot.

Here's the general flow of information through the Microsoft 365 Copilot service:

  1. In a Microsoft 365 app, a user enters a prompt in Copilot.
  2. Copilot preprocesses the input prompt using grounding, accessing content through Microsoft Graph in the user's tenant and, if enabled, data from other platforms.
  3. If web-grounding is enabled, Copilot gathers information from the Bing Index.
  4. Copilot sends the grounded prompt to the LLM. The LLM uses the prompt to generate a response that is contextually relevant to the user's task.
  5. Copilot returns the response to the app and the user. The prompt and the results are both logged and available for admins to view via the Data Security Posture Management (DSPM) for AI capability in Microsoft Purview.
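
The Purview portal is the supported way to review the records mentioned in step 5. For teams that ingest audit data programmatically, the following hedged sketch polls the Office 365 Management Activity API; it assumes an application token with ActivityFeed.Read permission, an active Audit.General subscription, and that Copilot interaction events surface in that feed, so verify the details against current documentation.

```python
# Hedged sketch: pull audit content blobs from the Office 365 Management
# Activity API and pick out Copilot-related records. Assumes an app-only
# token with ActivityFeed.Read, an Audit.General subscription already
# started, and that Copilot interactions appear in that feed.
import requests

TENANT_ID = "<tenant-guid>"
TOKEN = "<app access token for https://manage.office.com>"
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# List available content blobs for the Audit.General feed.
blobs = requests.get(
    f"{BASE}/subscriptions/content",
    headers=HEADERS,
    params={"contentType": "Audit.General"},
    timeout=30,
)
blobs.raise_for_status()

for blob in blobs.json():
    records = requests.get(blob["contentUri"], headers=HEADERS, timeout=30).json()
    for record in records:
        # Keep only records that look like Copilot interactions.
        if "Copilot" in record.get("Operation", ""):
            print(record["CreationTime"], record["UserId"], record["Operation"])
```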

For more information on how Microsoft 365 Copilot works, see the architectural walkthrough.

Copilot Chat service architecture

Copilot Chat is available for eligible customers, with Enterprise data protection, at no extra cost.

An image that shows the architectural components of Microsoft 365 Copilot Chat.

Here's the general flow of information through the Copilot Chat service, which is a subset of the Microsoft 365 Copilot service:

  1. The user enters a prompt in the Copilot Chat interface, through either a browser or the pinned application in Microsoft Teams.
  2. If the user includes a document as part of the prompt, the document is stored in the user’s OneDrive, within the Microsoft 365 service boundary. No other interactions are possible with organizational data. All data Copilot uses to generate responses is encrypted in transit.
  3. If web grounding is enabled, Copilot Chat parses the user’s prompt and identifies terms where information from the web would improve the quality of the response. Based on these terms, Copilot generates a search query that it sends to the Bing API, where the Bing Index is used to get the latest search results. The security around the interaction ensures that the Bing Index doesn't 'learn' from this interaction. Administrators can turn the Bing Index interaction on or off. If web grounding is turned off, this interaction doesn't take place, and the flow moves directly to the next step. Data, privacy, and security for web search in Microsoft 365 Copilot and Microsoft 365 Copilot Chat outlines the data privacy and security commitments from Microsoft when using the web-grounded Copilot service. Access to web content (Bing integration) in Microsoft 365 Copilot and Copilot Chat captures the recommended guidance for government and regulated customers.
  4. Copilot sends the web-grounded prompts to the LLM, or (where web grounding is turned off) the original prompt is sent to the LLM for processing, with any attached files. The LLM uses the prompt to generate a response that is contextually relevant to the user’s task (applying its conceptual understanding of the world and its ability to deduce logical conclusions).
  5. Copilot returns the response to the app and the user. The prompt and the results are both logged and available for admins to view via the Data Security Posture Management (DSPM) for AI capability in Microsoft Purview.

The differences between Microsoft 365 Copilot and Copilot Chat are detailed in a Microsoft Community Hub blog post.

Note

Microsoft recommends that Copilot be pinned for all users across government.

Example scenario walkthroughs

The Copilot Chat service operates within the broader Microsoft 365 service boundary. The following scenarios describe the logical flow of a user interaction with web-grounding enabled.

Scenario 1: A web search on the latest patches released by Microsoft (submitted via the Copilot Chat app in Microsoft Teams)

The following sequence describes how Copilot Chat executes scenario 1:

  1. The user enters the following prompt in Copilot Chat: In a table, can you give me a list of all the security patch updates released by Microsoft for Windows 11 in the last 30 days? Include an assessment of the impact to IT and to the end user in a column. Also include the ISM controls that each patch is looking to address and the timeframes required for execution in compliance with the ISM.
  2. No document is attached to this query, so nothing is uploaded to the user's OneDrive.
  3. Copilot sends a set of keywords to the Bing Index to search for the latest information. Content from multiple references might be used as input to compile a response.
  4. The latest search results are sent to the LLM along with the initial prompt where, in addition to existing information, a response is formulated.
  5. Copilot returns the response and the associated references to the user to validate. The prompt and the results are both logged and available for admins to view via DSPM for AI.

Scenario 2: A web search regarding a departmental project is made via Copilot Chat

The following sequence describes how Copilot Chat executes scenario 2:

  1. The user enters the following prompt in Copilot Chat in the browser: Can you give me details about Project Sunshine, an activity underway at the <department name>?
  2. No document is attached to this query.
  3. Copilot sends a set of keywords to the Bing Index to search for the latest information. As this is an internal project, it's unlikely that any information is available from the Bing Index. Bing doesn't learn from this interaction.
  4. The search results are sent to the LLM where, in addition to existing information, a response is formulated.
  5. Copilot returns the response and the associated references to the user to validate. References might include sources that were searched for information about the project but from which no information was found. The prompt and the results are both logged and available for admins to view via DSPM for AI.

Scenario 3: A Copilot chat prompt that references a document classified as Protected

The following sequence describes how Copilot Chat executes scenario 3:

  1. The user uploads a document that is classified as Protected into Copilot Chat and types the following prompt in Copilot Chat in the browser: Can you give me a summary of <document name>?
  2. A document named Protected Report.docx is attached to the query. The document is uploaded to My Files > Microsoft Copilot Chat Files within the user’s OneDrive.
  3. No information is requested from the Bing Index as the request is for the summary of existing information.
  4. The document contents are sent to the LLM, where a summary of the document is formulated.
  5. Copilot returns the response and the associated references to the user to validate. References include sources that were used to generate the summary. The prompt and the results are both logged and available for admins to view via the DSPM for AI.

Scenario 4: A user is working in a PowerPoint document classified as Protected and uses the Copilot Chat side pane

The following sequence describes how Copilot Chat executes scenario 4:

  1. The user opens a PowerPoint presentation document that is classified as Protected and types the following information in Copilot Chat in the side pane: Analyze the document and make suggestions on how to improve it.
  2. The document content is already available, so a copy of the document isn't stored within the user's OneDrive.
  3. Copilot sends a set of keywords to the Bing Index to search for the latest information, based on keyword matching. The Bing Index returns information that is relevant to the analysis of improvements. Bing doesn't learn from this interaction.
  4. The search results are sent to the LLM where, in addition to existing information, a response is formulated.
  5. Copilot returns the response and the associated references to the user to validate. References include sources that were used to generate the summary. The prompt and the results are both logged and available for admins to view via the DSPM for AI.

Connectors and plugins

An image that shows the Copilot connectors and plugins.

Both connectors and plugins create access to data and systems outside of Microsoft 365. By design, they connect to data and systems outside the Microsoft 365 service boundary.

| Connectors | Plugins |
| --- | --- |
| Access external data | Add new skills |
| Run on a schedule | Run in real-time |
| One-way data flow (read) | Can support two-way data flow (read and write) |
| Great for static data sets, like file shares, intranets, and knowledge bases | Great for unconstrained data sets like web search, and app integrations like travel booking or case management |

Copilot Studio provides a low-code/no-code toolset for developing custom connectors and plugins for Copilot. Copilot Studio is included with Microsoft 365 Copilot subscriptions and can be enabled or disabled on a per-user basis. For more information, see license assignment.

Connectors

Connectors run on a schedule to index external data into the Microsoft Graph. By indexing external content into the Graph, Copilot can then access that content. Connectors also improve the Microsoft 365 Search experience for users.

Connectors are useful for connecting large external repositories of business content. Some common examples of Connectors include file shares, on-premises intranets, knowledge bases, and an organization’s own public web site. These Connectors are relatively static, constrained sets of data. They're static in the sense that the data isn't generated in response to the specific interaction or the user context.

By indexing these data sources into the Microsoft Graph, users can access their content with Copilot (and Microsoft 365 Search) without needing to have a real-time connection to the source system.

There are a wide variety of existing Connectors available to customers to start bringing external sources of data into Microsoft 365 and into scope for Copilot. Connectors don't facilitate sending customer data outside the Microsoft 365 service boundary, but do bring data from outside the service boundary in.
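
As background on how that indexing happens, the following minimal sketch shows the Microsoft Graph connectors API pattern: create an external connection, then push items (with their access control lists) into it for indexing into the Microsoft Graph. The connection ID, item values, and app-only token are placeholders, and schema registration and error handling are omitted.

```python
# Minimal sketch of the Microsoft Graph connectors pattern: create an
# external connection, then push an item (with its ACL) for indexing.
# Connection ID, item values, and the token are placeholders; schema
# registration and error handling are omitted for brevity.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <app-only token with ExternalItem.ReadWrite.All>"}

# 1. Create the external connection (one-time setup).
requests.post(
    f"{GRAPH}/external/connections",
    headers=HEADERS,
    json={
        "id": "hrknowledgebase",
        "name": "HR knowledge base",
        "description": "Articles from the on-premises HR knowledge base",
    },
    timeout=30,
).raise_for_status()

# 2. Push (or update) an item into the connection, including who may see it.
requests.put(
    f"{GRAPH}/external/connections/hrknowledgebase/items/article-001",
    headers=HEADERS,
    json={
        "acl": [{"type": "group", "value": "<entra-group-guid>", "accessType": "grant"}],
        "properties": {"title": "Leave policy overview"},
        "content": {"value": "Full text of the article...", "type": "text"},
    },
    timeout=30,
).raise_for_status()
```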

For more information on available connectors and their security, review the Prebuilt connectors gallery and Connector security model.

Plugins

Plugins (Bing, etc.) are distinct from Connectors in that they run in real-time during the interaction execution to provide new skills and knowledge to Copilot. For example, the included Access to web content (Bing integration) Plugin allows real-time integration of public web content to enrich the knowledge available to Copilot.

The web content plugin enables a new real-time query source for Copilot to include. It enables Copilot to search not only for the content from inside the Microsoft 365 organization (via the Microsoft Graph) but also web content (via Bing Index) in parallel. This capability can be useful for integrating public content on a subject with private knowledge contained within the Microsoft Graph. Knowledge of current and recent events, up-to-date industry news and developments, and other quality online sources can greatly improve the quality of Copilot generated content and responses.

Another example is a plugin for a travel booking system, to which Copilot can provide a natural language interface so that a user can search for and organize flights from within a Copilot Chat experience. This capability can provide new, integrated, and streamlined user experiences on top of non-Microsoft systems.
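
To make the real-time, two-way nature of plugins concrete, the following hypothetical sketch shows the kind of REST endpoint a travel-booking plugin might front, written with FastAPI. The endpoints, fields, and behavior are invented for illustration; an actual plugin describes such an API to Copilot through an OpenAPI definition and manifest.

```python
# Hypothetical sketch of a line-of-business API that a travel-booking
# plugin could expose to Copilot in real time. Endpoints and fields are
# invented; a real plugin describes its API via an OpenAPI definition.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Example travel booking API")

class FlightSearch(BaseModel):
    origin: str
    destination: str
    date: str  # ISO 8601, e.g. "2025-07-01"

@app.post("/flights/search")
def search_flights(query: FlightSearch) -> list[dict]:
    """Return candidate flights for the requested route and date (read)."""
    return [{"flight": "EX123", "origin": query.origin,
             "destination": query.destination, "date": query.date}]

@app.post("/flights/book")
def book_flight(flight_number: str, traveller_email: str) -> dict:
    """Create a booking (write), illustrating two-way data flow."""
    return {"confirmation": "BK-0001", "flight": flight_number,
            "traveller": traveller_email}
```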

It's also possible for customers to create their own plugins to provide new skills and integrate other lines of business applications with Copilot, enhancing the knowledge and skill set of Copilot for the organization’s staff. For more information on Copilot plugins, see the online extensibility guide.