The Microsoft 365 Copilot configuration and planning guide is intended for sensitive and regulated customers in Australia and New Zealand. This guide aligns with the Australian Signals Directorate (ASD) Blueprint for Secure Cloud configuration guidance for Microsoft 365.
This section refers to a recommended configuration. Recommended elements address configuration that is appropriate and fit for purpose for a sensitive environment.
Office release channel
Microsoft 365 Copilot requires the Microsoft 365 Apps version of Office and isn't supported on older versions. Microsoft 365 Copilot is available on the Current and Monthly Enterprise update channels of Microsoft 365 Apps. For guidance on how to deploy, follow the Microsoft 365 Apps setup guide.
Choose your update channel based on your user requirements and risk assessment, balancing access to the latest features (Current) and a more stable build version (Monthly Enterprise). Both are suitable for sensitive environments and can be applied to user groups or individual users.
The Office Customization Tool helps you build your Microsoft 365 Apps configuration, including the choice of Update Channel.
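The Office Customization Tool exports a configuration file consumed by the Office Deployment Tool, and the update channel is one of the settings it captures. The following sketch shows the general shape of such a file pinning the Monthly Enterprise Channel; the product and language IDs shown are illustrative, and you should generate your actual file with the Office Customization Tool rather than hand-author it:

```xml
<Configuration>
  <!-- Channel here selects the update channel for the initial install -->
  <Add OfficeClientEdition="64" Channel="MonthlyEnterprise">
    <Product ID="O365ProPlusRetail">
      <Language ID="en-us" />
    </Product>
  </Add>
  <!-- Updates element keeps installed clients on the chosen channel -->
  <Updates Enabled="TRUE" Channel="MonthlyEnterprise" />
</Configuration>
```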
Make sure the Office Feature Updates scheduled task is enabled so that Office applications continue to work correctly with Copilot. The Office Feature Updates task checks for updates to Connected experiences in Microsoft 365, such as Copilot.
For more information on how to configure your update channel, see how to change your channel for Copilot.
Feedback samples
While it's standard practice in sensitive environments to disable in-product feedback mechanisms, it's worth reiterating that advice for Microsoft 365 Copilot, because prompt and response interactions can be submitted to Microsoft if the Allow users to include log files and content samples when feedback is submitted to Microsoft policy is enabled.
For this reason, disable the Allow users to include log files and content samples when feedback is submitted to Microsoft policy.
In environments connected to Active Directory, you can use Group Policy to set this up. However, because web experiences and modern apps are managed only through Microsoft 365's Cloud Policy, you must configure it there at a minimum. Rich client Office applications (Win32 apps) can be managed by both Group Policy and Cloud Policy, with Group Policy taking precedence where both are in use. For organizations that use a combination of Group Policy and Cloud Policy, keep these policies aligned with one another to avoid confusion.
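The precedence rules described above can be sketched as a small illustrative model. The app-type labels and policy values below are simplified assumptions for illustration, not actual policy identifiers:

```python
def effective_policy(app_type, group_policy, cloud_policy):
    """Illustrative model of which policy source wins for a given app type.

    Web experiences and modern apps honour only Cloud Policy; Win32
    (rich client) Office apps honour both sources, with Group Policy
    taking precedence when both are configured.
    """
    if app_type == "web":
        # Web/modern apps: Cloud Policy is the only applicable source.
        return cloud_policy
    # Win32 apps: Group Policy wins when it is set at all.
    return group_policy if group_policy is not None else cloud_policy
```

The model makes the alignment advice concrete: if Group Policy disables feedback samples but Cloud Policy leaves them enabled, Win32 apps and web apps behave differently, which is exactly the confusion the guidance warns against.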
You configure Feedback Cloud Policy from the Microsoft 365 portal.
For more information about Feedback policies, see managing feedback policies.
Note
The Connected experiences policy is also managed by both Group Policy and Cloud Policy.
Microsoft Teams
One of the most popular features of Microsoft 365 Copilot is the integration with Microsoft Teams. Copilot integrates into Teams in two distinct ways:
- To help with everyday tasks such as summarizing and recapping meetings and helping answer questions in chats and channels.
- As an entry point to Copilot conversational experience.
The Copilot chat experience surfaced in Microsoft Teams is referred to by its full name, Microsoft Copilot with Graph-grounded chat, which is reflected in the license element that controls the feature's availability more broadly.
When a user selects the Copilot button in the left navigation pane in Teams, they can access the same Microsoft 365 Copilot or Copilot Chat experience available at copilot.cloud.microsoft in a browser.
- With a Microsoft 365 Copilot License: Users can ground their conversations in both work and web data.
- Without a Microsoft 365 Copilot License: The experience is limited to reasoning over web data only.
Teams meetings
Copilot in Teams meetings helps users catch up on meetings they join late and provides a structured meeting recap with notes for meetings that ended. Users can also ask Copilot questions about the content and discussion that happened during a meeting.
For Copilot to work in a meeting, Teams needs to generate a transcript. This requirement means organizations need to decide how Teams operates in this scenario. Copilot can work with transcription in two ways: retained transcripts and temporary transcripts.
Retained transcripts
Users can start retained transcripts in the meeting, or the meeting configuration can start them automatically. Teams creates and stores transcripts with the meeting according to the organization's retention settings.
Users can access, download, retain, or delete these transcripts within the organization's chosen timeframe.
This option offers the richest experience because Copilot can keep answering questions and referencing meeting details as long as the transcript is retained.
Temporary transcripts
If organizations have concerns about retaining and discovering transcripts, this option creates a temporary transcript just for Copilot. It's not discoverable or downloadable and is permanently removed when the meeting ends.
After Copilot finishes its meeting notes, it destroys the transcript. This option benefits organizations that don't want users or discovery processes to retrieve the transcript. The trade-off is that once the meeting ends, Copilot can't discuss the meeting content anymore.
Transcription method
The classification of the environment doesn't determine which transcript method to use; sensitive environments might use either. The Australian Privacy Principles and the Privacy Act don't prescribe which method organizations must choose. Each customer decides which model best suits their own retention and discovery obligations for meeting transcripts.
The flexibility of Teams meeting policies means different users can have different requirements. Organizations can provide different settings for different user groups.
Teams Meeting policies
Configure Teams Meeting policies in the Microsoft Teams admin portal.
Teams meeting policies can be set up for a wide variety of customer needs. It's important to understand the features and settings available to get the right configuration for your organization's requirements.
The meeting policy transcription setting determines if a user can trigger the creation of a retained transcript.
If the transcription setting is Off, affected users can't trigger the creation of a retained transcript.
How this setting affects Copilot depends on two settings: the meeting policy Copilot setting for the user and the recording & transcript meeting options chosen when the meeting was created.
The meeting policy Copilot setting can be On only with retained transcript or On. This setting lets an administrator require retained transcripts for a user to access Copilot or allow a user to engage with Copilot through either a retained or temporary transcript. By combining this setting with the meeting policy transcript setting, organizations can establish a range of scenarios based on their specific requirements.
The third element that changes how Copilot functions within a meeting is the meeting organizer's choice when configuring the recording & transcript settings of a scheduled meeting.
The meeting option Record and transcribe automatically is available if the meeting policy for the user has Transcription set to On. When Record and transcribe automatically is enabled, it creates a retained transcript as soon as the meeting begins. Copilot is available to licensed users in such a meeting.
If automatic recording and transcribing aren't enabled, then the Who can record and transcribe and Allow Copilot settings become available to configure and affect how Copilot operates.
The Who can record and transcribe setting determines who can enable recording and transcription.
The Allow Copilot setting has two options:
- Only during the meeting: This setting instructs Copilot to create a temporary transcript when the meeting begins and dispose of it at the end of the meeting. Therefore, Copilot isn't available after the meeting finishes, unless a user turns on transcription during the meeting. This action creates a retained transcript that Copilot can use after the meeting finishes.
- Both during and after the meeting: This setting enables a user in the meeting to turn on transcription, creating a retained transcript, which Copilot uses to provide functionality during and after the meeting finishes.
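The interaction between the user's meeting policy and the meeting organizer's options can be sketched as a small illustrative model. The parameter names and string values below are simplified assumptions for clarity, not actual Teams policy identifiers:

```python
def copilot_availability(transcription_on, copilot_policy,
                         auto_transcribe, allow_copilot,
                         user_started_transcript):
    """Illustrative model of when Copilot can answer questions in a meeting.

    transcription_on:        meeting policy Transcription setting (bool)
    copilot_policy:          "on" or "on_only_with_retained_transcript"
    auto_transcribe:         organizer enabled Record and transcribe automatically
    allow_copilot:           "only_during" or "both" (meeting option)
    user_started_transcript: a user turned on transcription in the meeting
    """
    # Automatic transcription is only available when the policy allows it,
    # and users can only start a retained transcript under the same policy.
    auto = auto_transcribe and transcription_on
    retained = auto or (user_started_transcript and transcription_on)

    if copilot_policy == "on_only_with_retained_transcript":
        # Copilot requires a retained transcript to function at all.
        during = retained
    else:
        # "On": a temporary transcript suffices during the meeting.
        during = retained or allow_copilot in ("only_during", "both")

    # After the meeting, Copilot can only reason over a retained transcript.
    return {"during": during, "after": retained}
```

For example, a meeting with automatic transcription gives Copilot both during and after the meeting, while the temporary-transcript path (Allow Copilot set to only during the meeting, no one starting transcription) gives Copilot during the meeting only.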
For more information on Teams Meeting Copilot configuration, see Transcription settings.
Access to Web Content (Bing Integration) in Microsoft 365 Copilot and Copilot Chat
Web-grounding refers to the capability of Microsoft Copilot to enhance its responses by drawing on real-time public web content, using a Bing API connector. This feature can improve the relevance and breadth of answers, especially when internal data is limited or when broader, up-to-date context is needed.
In the Australian Government context, the recommended configuration is to enable web-grounding for Copilot Chat, which operates as a general-purpose AI assistant with built-in safeguards and enterprise data protection. This configuration allows users to benefit from up-to-date, web-informed responses while maintaining organizational control through Microsoft Entra ID authentication and data boundaries. Microsoft also recommends that agencies make a risk-based assessment on enabling web-grounding for Microsoft 365 Copilot (used in Word, Excel, PowerPoint, OneNote, and other apps) to reduce the risk of unintended data exposure while they strengthen their security posture.
This two-step approach balances the need for security and compliance in core productivity apps with the flexibility and utility of web-enhanced AI in conversational scenarios.
The logging and auditing of Copilot prompts is the same as the logging and auditing of Teams messages and uses the same storage component within the Microsoft 365 platform. Microsoft extends the same contractual commitment to securing prompts as it provides for Exchange, SharePoint, OneDrive, and Teams. As Copilot is classed as a core online service in our contract terms alongside those other core components, we secure prompts like any other customer content, treating it as private, sensitive, and classified.
Microsoft further commits:
- No customer data is used to train LLMs.
- Customer data is stored at rest only in Australia.
- Customers retain ownership over prompts and responses.
- Microsoft indemnifies customers against intellectual property rights complaints in the use of Copilot services.
All of these commitments are detailed in our product terms (Privacy and Security and Online Services), which are contractually binding within the enterprise agreement (VSA).
Important
Microsoft recommends turning on web-grounding for Copilot Chat. For Microsoft 365 Copilot, the recommendation is also to turn on web-grounding, but each department should make a risk-based decision.
While Microsoft 365 Copilot runs within the Microsoft 365 Service Boundary, plugins operate outside that boundary, and the same applies to the integrated web content plugin. Although Bing is a Microsoft first-party service that connects over the Microsoft network, it is outside the Microsoft 365 Service Boundary and isn't IRAP assessed. Therefore, Microsoft can't assert that Bing is suitable for handling classified material.
Microsoft 365 Copilot doesn't send the following information to Bing:
- The user’s original interaction
- Whole Microsoft 365 files (for example, entire documents or emails)
- Identifying information from the user’s Microsoft Entra ID object, such as username, domain, the user ID, or tenant ID
Importantly, and consistent with use of Microsoft Copilot with Enterprise Data Protection, the search queries created by Microsoft 365 Copilot and sent to Bing:
- Aren't stored in, and have no bearing on, the ranking parameters of the Bing Search index.
- Aren't made accessible via the Bing Webmaster tools or any other tools provided to third parties (including Microsoft partners or advertisers).
As users can turn the web content plugin on and off at will, they can use this feature responsibly. This capability is similar to an end user deciding what and when to engage with a public search engine today. However, with the Microsoft 365 Copilot web content plugin, more privacy controls are applied than in consumer web search.
Staff should use the web content plugin on a session-by-session basis: enabling the plugin to engage with material from the web, then disabling it at the start of a 'New Chat' before engaging with more sensitive information, if it's appropriate to do so.
Auditing, discovery, and retention
When implementing any new technology, it's important to consider auditing of use, the search and discovery processes that exist, and the policies applied to the retention of material. Microsoft 365 Copilot is no different, though because it's integrated into Microsoft 365, it's more a matter of understanding where to go and what you'll find, rather than needing to set up any new mechanism you don't already have.
Quick access to the auditing, discovery, and retention settings (plus sensitivity labels and Communication Compliance settings) is made available from the Copilot settings page in the Microsoft 365 admin portal. Organization administrators can access the Copilot data security and compliance quick links from the Microsoft 365 admin portal.
Auditing
Every interaction with Copilot generates an audit log entry in the same way as user actions in SharePoint, OneDrive, or Teams do today. The standard Microsoft 365 audit logs now include a category of audit entry for Copilot named Copilot Interaction.
In the audit log entry, the Activity, Operation, and Workload information all points to a Copilot interaction, and the JSON payload of the CopilotEventData entry indicates which resources were accessed as part of the interaction, the originating host application, and the chat thread ID. This information can be used to access the user interaction and Copilot response. For more information, see retention and discovery.
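As a sketch of how such an entry can be summarized programmatically, the snippet below extracts the host application, thread ID, and accessed resources from a CopilotEventData payload. The record shown is hand-built for illustration, and the exact field set varies by workload, so treat the field names as assumptions to verify against your own exported audit data:

```python
import json

# Hand-built illustration of a Copilot Interaction audit record.
# Field names follow the general CopilotEventData shape; real records
# contain additional fields and vary by host application (assumption).
record = json.loads("""
{
  "Operation": "CopilotInteraction",
  "Workload": "Copilot",
  "CopilotEventData": {
    "AppHost": "Teams",
    "ThreadId": "19:meeting_abc123@thread.v2",
    "AccessedResources": [
      {"Name": "Q3-budget.xlsx", "Type": "File"}
    ]
  }
}
""")

event = record["CopilotEventData"]
summary = {
    "host": event["AppHost"],                        # originating application
    "thread": event["ThreadId"],                     # chat thread ID
    "resources": [r["Name"] for r in event["AccessedResources"]],
}
```

The thread ID recovered this way is what links an audit entry back to the stored interaction for retrieval through retention and discovery tooling.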
Retention
Microsoft 365 Copilot prompt and response interactions are stored as Microsoft Teams messages, in the same way as chat messages between two users are stored by Teams today. This data is held in the user’s mailbox storage and is subject to Microsoft 365 retention and disposal policies in the same way as Teams chats.
Organizations implementing Microsoft 365 Copilot should consider the retention and disposal requirements for such interactions and ensure retention and/or disposal policies are in place to enable discovery and retrieval of user interactions with Copilot as needed.
As with other retention policies, Teams chats and Copilot interactions can be retained for any chosen duration, including indefinitely, and optionally deleted automatically at the end of the retention period. Microsoft 365 Copilot customers should decide on a suitable retention period and apply an appropriate policy to match from the Microsoft Purview portal.
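The retain-then-optionally-delete behaviour can be sketched as a minimal model. The function below is illustrative only; actual retention policies are configured in the Microsoft Purview portal, not in code:

```python
from datetime import datetime, timedelta, timezone

def disposal_date(message_time, retention_days, delete_at_end=True):
    """Illustrative model: when a Copilot interaction stored as a Teams
    message becomes eligible for automatic deletion.

    retention_days=None models 'retain indefinitely'; delete_at_end=False
    models a policy that retains without automatic deletion.
    """
    if retention_days is None or not delete_at_end:
        return None  # kept indefinitely / no automatic deletion
    return message_time + timedelta(days=retention_days)

sent = datetime(2024, 7, 1, tzinfo=timezone.utc)
```

For example, a 365-day retain-and-delete policy would make an interaction sent on 1 July 2024 eligible for deletion on 1 July 2025, while an indefinite-retention policy never schedules deletion.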
eDiscovery
As Microsoft 365 Copilot interactions are stored in the same manner as chat messages from Microsoft Teams, the same Content Search and eDiscovery features are available to customers to access them.
Microsoft 365 Copilot introduces a new content type for the discovery features in Microsoft 365 called Copilot interactions. When performing a Content Search, using the Copilot interactions type allows an administrator or eDiscovery officer to locate specifically the interactions of a given user or group, or to locate a specific interaction that has been identified through audit logs.
Regardless of whether you're an administrator or a manager of an eDiscovery case, Copilot interactions are recorded and can be retrieved while within their default retention period or covered by a retention policy.
Monitoring data security
Using Microsoft Purview Data Security Posture Management (DSPM) for AI is crucial for monitoring and ensuring the security of data within Microsoft 365 Copilot Chat. DSPM for AI provides administrators with insights into AI activities and usage within their organization, allowing them to apply ready-to-use policies to mitigate risks associated with AI usage. This includes identifying and investigating data exposure risks, particularly those involving regulated personal data during interactions with Copilot. By using DSPM for AI, organizations can enhance their data privacy posture, enforce security policies, and prevent unauthorized data exposure. DSPM for AI can also be used to identify unauthorized use of AI across the agency.
Configure DSPM for AI to get reports on how Generative AI is being used in your organization. The E5 implementation guide and the Blueprint to prevent oversharing are also helpful resources.