Wondering How Microsoft and M365 Copilot Use Your Data?
Who owns your data?


Just how does Microsoft use your company data with AI and Copilot?

  • How does Copilot harness your organizational data?
  • Can you trust the content it generates?
  • And who ultimately owns that content?

We delve into these questions and more. Buckle up for more insights!


How Microsoft Copilot for Microsoft 365 Protects Your Data and Privacy

Microsoft Copilot for Microsoft 365 is a powerful AI assistant that can help you with various tasks, such as writing emails, creating presentations, summarizing documents, and more.

It does so by connecting to your organizational data in Microsoft Graph, the Microsoft 365 apps you use every day, and a large language model (LLM) that can generate natural language responses.

But how does Microsoft Copilot for Microsoft 365 ensure the security and privacy of your data? How does it comply with the regulations and standards that govern the use of AI in the enterprise? And how can you trust the content that Microsoft Copilot for Microsoft 365 creates?

In this blog post, we will answer these questions and more, by providing an overview of how Microsoft Copilot for Microsoft 365 uses and protects your data, what controls and options you have as an admin or a user, and what best practices and resources you can follow to use Microsoft Copilot for Microsoft 365 responsibly.

How Microsoft Copilot for Microsoft 365 Uses Your Data

Microsoft Copilot for Microsoft 365 provides value by connecting LLMs to your organizational data. It accesses content and context through Microsoft Graph and can generate responses anchored in your organizational data, such as user documents, emails, calendar items, chats, meetings, and contacts. Microsoft Copilot for Microsoft 365 combines this content with the user's working context, such as the meeting a user is in now, the email exchanges the user had on a topic, or the chat conversations the user had last week. It uses this combination of content and context to provide accurate, relevant, and contextual responses.
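To make "content and context" concrete, here is a minimal sketch of the kind of Microsoft Graph data involved: a user's recent mail on a topic and their upcoming meetings. This is only an illustration using two well-known Graph endpoints, not Copilot's internal pipeline; the GRAPH_TOKEN environment variable and the search phrase are assumptions, and acquiring a delegated token (for example via MSAL with Mail.Read and Calendars.Read scopes) is not shown.

```python
# Illustrative only: pull "content" (mail) and "working context" (meetings) from Microsoft Graph.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}  # token acquisition not shown

# Recent email on a topic (content)
mail = requests.get(
    f"{GRAPH}/me/messages",
    headers=headers,
    params={"$top": 5, "$search": '"quarterly review"', "$select": "subject,from,receivedDateTime"},
).json()

# Upcoming meetings (working context)
events = requests.get(
    f"{GRAPH}/me/events",
    headers=headers,
    params={"$top": 5, "$select": "subject,start,end,organizer"},
).json()

for m in mail.get("value", []):
    print("MAIL:", m["subject"], m["receivedDateTime"])
for e in events.get("value", []):
    print("MEETING:", e["subject"], e["start"]["dateTime"])
```

Copilot performs this grounding for you inside the Microsoft 365 service; the point of the sketch is simply that the raw material is the same Graph data your users can already reach.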

Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft Copilot for Microsoft 365.

Microsoft 365 Copilot only surfaces organizational data to which individual users have at least view permissions. It's important that you use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content within your organization. This includes permissions you grant to users outside your organization through inter-tenant collaboration solutions, such as shared channels in Microsoft Teams.
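Because Copilot inherits your existing permission model, auditing who can actually see a document is a useful habit. The sketch below lists the permissions on a single OneDrive/SharePoint item via Microsoft Graph; the item ID is hypothetical, the GRAPH_TOKEN environment variable is an assumption, and this is not a Copilot-specific API, just a way to check the access Copilot will honor.

```python
# Illustrative only: list who has access to a file Copilot might surface.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
item_id = "01ABCDEF"  # hypothetical drive item ID

perms = requests.get(f"{GRAPH}/me/drive/items/{item_id}/permissions", headers=headers).json()
for p in perms.get("value", []):
    who = (
        p.get("grantedToV2", {}).get("user", {}).get("displayName")
        or p.get("link", {}).get("scope", "sharing link")
    )
    print(who, "->", ", ".join(p.get("roles", [])))
```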

When you enter prompts using Microsoft Copilot for Microsoft 365, the information contained in your prompts, the data they retrieve, and the generated responses remain within the Microsoft 365 service boundary, in keeping with Microsoft's current privacy, security, and compliance commitments. Microsoft Copilot for Microsoft 365 uses Azure OpenAI services for processing, not OpenAI's publicly available services.
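If the Azure OpenAI versus public OpenAI distinction is new to you, the sketch below shows what calling your own Azure OpenAI resource looks like: traffic goes to a *.openai.azure.com endpoint inside the Microsoft cloud rather than to api.openai.com. This is only an analogy for the boundary Copilot operates within, not how Copilot itself is wired; the resource name, deployment name, api-version, and AZURE_OPENAI_KEY variable are all placeholders.

```python
# Illustrative only: a direct call to an Azure OpenAI resource (not Copilot's managed pipeline).
import os
import requests

endpoint = "https://contoso-openai.openai.azure.com"  # placeholder Azure OpenAI resource
deployment = "gpt-4o"                                 # placeholder model deployment name
api_version = "2024-02-01"                            # check your resource for supported versions

resp = requests.post(
    f"{endpoint}/openai/deployments/{deployment}/chat/completions",
    params={"api-version": api_version},
    headers={"api-key": os.environ["AZURE_OPENAI_KEY"]},
    json={"messages": [{"role": "user", "content": "Summarize today's project updates."}]},
)
print(resp.json()["choices"][0]["message"]["content"])
```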


[Image: The Art of Prompting]


How Microsoft Copilot for Microsoft 365 Protects Your Data

Microsoft Copilot for Microsoft 365 is built on top of Microsoft's current commitments to data security and privacy in the enterprise, and there's no change to these commitments. It is integrated into Microsoft 365 and adheres to all existing privacy, security, and compliance commitments to Microsoft 365 commercial customers.

Some of the key features and safeguards that Microsoft Copilot for Microsoft 365 offers to protect your data are:

  • Encryption: Microsoft 365 uses service-side technologies that encrypt customer content at rest and in transit, including BitLocker, per-file encryption, Transport Layer Security (TLS), and Internet Protocol Security (IPsec). For specific details about encryption in Microsoft 365, see Encryption in the Microsoft Cloud.
  • Data residency: Microsoft Copilot for Microsoft 365 calls to the LLM are routed to the closest data centers in the region, but can also call into other regions where capacity is available during high-utilization periods. For European Union (EU) users, there are additional safeguards to comply with the EU Data Boundary: EU traffic stays within the EU Data Boundary, while worldwide traffic can be sent to the EU and other countries or regions for LLM processing. Microsoft Copilot for Microsoft 365 upholds data residency commitments as outlined in the Microsoft Product Terms and the Data Protection Addendum (DPA). Copilot will be added as a covered workload in the data residency commitments in the Microsoft Product Terms later in 2024, and the Microsoft Advanced Data Residency (ADR) and Multi-Geo Capabilities offerings will include data residency commitments for Copilot for Microsoft 365 customers later in 2024.
  • Information protection: When you have data that's encrypted by Microsoft Purview Information Protection, Microsoft Copilot for Microsoft 365 honors the usage rights granted to the user. This encryption can be applied by sensitivity labels or by restricted permissions in Microsoft 365 apps using Information Rights Management (IRM). For more information about using Microsoft Purview with Microsoft Copilot for Microsoft 365, see Microsoft Purview data security and compliance protections for Microsoft Copilot.
  • Data access and deletion: To view and manage the data stored about user interactions with Microsoft Copilot for Microsoft 365, admins can use Content search or Microsoft Purview. Admins can also use Microsoft Purview to set retention policies for data related to chat interactions with Copilot. Your users can delete their Copilot interaction history, which includes their prompts and the responses Copilot returns, by going to the My Account portal. For more information, see Delete your Microsoft Copilot interaction history. A small search illustration follows this list.
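As a loose illustration of programmatic search over Microsoft 365 content, the sketch below runs a query against a user's mail through the Microsoft Graph search API. Admins reviewing Copilot interaction data would use Content search or Purview in the compliance portal rather than this call; the GRAPH_TOKEN variable and the query string are assumptions.

```python
# Illustrative only: search a user's messages via the Microsoft Graph search API.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

body = {
    "requests": [
        {
            "entityTypes": ["message"],
            "query": {"queryString": "Copilot"},
            "from": 0,
            "size": 10,
        }
    ]
}
results = requests.post(f"{GRAPH}/search/query", headers=headers, json=body).json()
for container in results.get("value", []):
    for hits in container.get("hitsContainers", []):
        for hit in hits.get("hits", []):
            print(hit["resource"].get("subject"), "-", hit.get("summary", "")[:80])
```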

What Controls and Options You Have as an Admin or a User

As an admin or a user, you have several controls and options to manage the use of Microsoft Copilot for Microsoft 365 in your organization. Some of the key ones are:

  • Policy settings: You can use policy settings to manage the privacy controls for Microsoft 365 Apps for enterprise on Windows or Mac devices in your organization. If you turn off connected experiences that analyze content for Microsoft 365 Apps on Windows or Mac devices in your organization, Microsoft Copilot for Microsoft 365 features won’t be available to your users in the following apps: Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard. Similarly, Microsoft Copilot for Microsoft 365 features in those apps on Windows or Mac devices won’t be available if you turn off the use of connected experiences for Microsoft 365 Apps. For more information about these policy settings, see the following articles: Use policy settings to manage privacy controls for Microsoft 365 Apps for enterprise (for Windows) and Use policy settings to manage privacy controls for Microsoft 365 Apps for enterprise (for Mac).
  • Web content: Microsoft Copilot with Graph-grounded chat can reference web content from the Bing search index to ground user prompts and responses. Based on the user's prompt, Copilot for Microsoft 365 determines whether it needs to query web content via Bing to help provide a relevant response. The query is passed to the Bing Search API, part of the Bing Search service, to retrieve information from the web to ground the response. Admins can prevent their users from referencing web content in their requests; for more information, see Manage access to web content in Microsoft Copilot for Microsoft 365 responses. Even when allowed by the admin, users can still choose whether to reference web content in their requests. For more information, see Use additional data sources with Microsoft 365 Copilot.
  • Plugins: Microsoft Copilot for Microsoft 365 experiences can reference third-party tools and services when responding to a user's request by using Microsoft Graph connectors or plugins. Data from Graph connectors can be returned in Microsoft Copilot for Microsoft 365 responses if the user has permission to access that information. When plugins are enabled, Microsoft Copilot for Microsoft 365 determines whether it needs to use a specific plugin to help provide a relevant response to the user. In the Integrated apps section of the Microsoft 365 admin center, admins can view the permissions and data access required by a plugin, as well as the plugin's terms of use and privacy statement. Admins have full control over which plugins are allowed in their organization; a user can only access the plugins that their admin allows and that the user installed or was assigned, and Microsoft Copilot for Microsoft 365 only uses plugins that are turned on by the user. For more information, see Extensibility of Microsoft Copilot for Microsoft 365. A minimal app-inventory sketch follows this list.
  • Feedback: Microsoft may use customer feedback, which is optional, to improve Microsoft Copilot for Microsoft 365, just as customer feedback is used to improve other Microsoft 365 services and apps. Microsoft doesn't use this feedback to train the foundation LLMs used by Microsoft Copilot for Microsoft 365. Customers can manage feedback through admin controls. For more information, see Manage Microsoft feedback for your organization and Providing feedback about Microsoft Copilot for Microsoft 365.
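Related to the plugins bullet above, one way to get a feel for what could appear under Integrated apps is to inventory the apps published to your organization's catalog via Microsoft Graph. This is not a Copilot-specific API, and plugin enablement itself is managed in the Microsoft 365 admin center; the sketch assumes a token with AppCatalog.Read.All held in the GRAPH_TOKEN variable.

```python
# Illustrative only: list org-published apps in the Teams app catalog via Microsoft Graph.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

apps = requests.get(
    f"{GRAPH}/appCatalogs/teamsApps",
    headers=headers,
    params={"$filter": "distributionMethod eq 'organization'"},
).json()
for app in apps.get("value", []):
    print(app["displayName"], "-", app["id"])
```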

How to Use Microsoft Copilot for Microsoft 365 Responsibly

Microsoft Copilot for Microsoft 365 is a powerful and innovative AI assistant that can help you achieve more with less effort. However, as with any AI system, it also comes with limitations and challenges that require your attention and care. Keep your permission models current, review generated content before you rely on it, and follow the guidance and resources linked throughout this post to use Microsoft Copilot for Microsoft 365 responsibly.

As always, reach out with any questions :)

Don Pistulka

Karsten Mottlau

Release the full Microsoft 365 potential | Modernise your applications in Azure | Modern Work Partner | SharePoint Intranet Product owner

7 months ago

Thanks for a great overview, Don Pistulka. Unfortunately, a number of the references in the post point to a Copilot chat page, for instance the link "Copilot will be added as a covered workload in the data residency commitments in Microsoft Product Terms later in 2024", and the "See more" link in the same section. I have a client with questions related to this precise topic, so it would be great to be able to read more. Thanks
