Securing Generative AI: Preparing for Microsoft 365 Copilot

In a world where Artificial Intelligence has transitioned from being a distant idea to a fundamental component of our daily lives, understanding the impact of generative AI models has emerged as a significant aspect of our work routine. Whether you've been impressed by the clever replies of ChatGPT or have utilised technology like Google Bard or Bing Search, you've likely encountered the impact of these transformative technologies.

In this new series of articles on Generative AI Security, we will explore the fundamentals of Generative AI and take a deep dive into Microsoft's Generative AI product, Microsoft 365 Copilot. With M365 Copilot going Generally Available (GA) in less than three weeks, it's crucial to shed light on its impact within corporate environments, along with strategies for risk mitigation.

An Overview of Generative AI and Large Language Models (LLMs)

Generative AI is a broad concept encompassing diverse forms of content creation, whereas Large Language Models (LLMs) represent a specific application of generative AI. Generative AI can include tasks like generating images and videos, composing music, and much more.

Large Language Models (LLMs), on the other hand, are specifically designed for tasks revolving around natural language generation and comprehension.

LLMs work by using very large datasets to learn patterns and relationships between phrases and words. These models undergo training on substantial volumes of text data, enabling them to understand the statistical patterns, grammar, and semantics inherent in human language. This vast amount of text may be taken from the Internet, books, and other sources to develop a deep understanding of human language.
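The idea of learning statistical patterns between words can be illustrated with a deliberately tiny sketch: a bigram counter that records how often each word follows another and predicts the most common continuation. Real LLMs use neural networks over vastly more data, but the statistical intuition is the same.

```python
from collections import defaultdict

def train_bigrams(corpus):
    """Count how often each word follows another across the corpus."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Return the statistically most common continuation of `word`."""
    followers = counts.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

corpus = [
    "the model learns patterns",
    "the model learns grammar",
    "the model predicts words",
]
counts = train_bigrams(corpus)
print(most_likely_next(counts, "model"))  # learns
```

Here "learns" wins because it follows "model" in two of the three training sentences; scale the same principle up by many orders of magnitude and you get the pattern-completion behaviour described above.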

Large Language Models (LLMs) operate using a 'prompt and response' system. A prompt typically represents a question or a sentence initiated by the user, while the response is the generation of coherent and contextually relevant sentences or even entire paragraphs in response to the provided prompt or input. The model uses diverse techniques, including attention mechanisms and neural networks, to interpret the input and produce an output intended to be coherent and appropriate to the context.
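The prompt/response pattern can be sketched against the Azure OpenAI chat API, the same service layer Copilot builds on. The endpoint, API key, and deployment name below are placeholders, not real values; adapt them to your own tenant.

```python
def build_messages(system_context, user_prompt):
    """Assemble the chat payload: a system message framing the task,
    followed by the user's prompt."""
    return [
        {"role": "system", "content": system_context},
        {"role": "user", "content": user_prompt},
    ]

def ask(prompt, deployment="gpt-4"):
    """Send a prompt to an Azure OpenAI deployment and return the text
    of the model's response. Placeholder credentials below."""
    from openai import AzureOpenAI  # pip install openai

    client = AzureOpenAI(
        azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
        api_key="YOUR-KEY",
        api_version="2024-02-01",
    )
    response = client.chat.completions.create(
        model=deployment,
        messages=build_messages("You are a helpful assistant.", prompt),
    )
    return response.choices[0].message.content
```

The attention mechanisms and neural networks mentioned above all live behind that single `create` call; from the caller's perspective the contract is simply prompt in, contextually relevant text out.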

Microsoft 365 Copilot: What is It?

Copilot is Microsoft's implementation of Generative AI, an assistant integrated within each of the Microsoft 365 applications: Word, Excel, PowerPoint, Teams, Outlook, and more. By combining Large Language Models (LLMs) with Microsoft Graph and the M365 apps, Microsoft's aim is to eliminate the mundane aspects of everyday tasks, enabling individuals to concentrate on harnessing their creativity to solve problems.

PowerPoint use case

Below are a few productivity example use cases:

  • Within Outlook, Copilot can assist with inbox management, aiding in the prioritisation of emails, condensing email threads, and even crafting responses.
  • Copilot has the capability to join Teams meetings and provide real-time summaries of the ongoing discussion, catalog action items, and highlight any unanswered questions.
  • When it comes to Excel, Copilot can analyse unprocessed data, offering insights, identifying trends, and presenting suggestions to the user.

Microsoft 365 Copilot: Deep Dive

Microsoft 365 Copilot - Under the Hood

As Microsoft 365 Copilot is a SaaS product, many of the backend operations remain concealed from end users. Nevertheless, the behind-the-scenes processes triggered by an end-user prompt are quite interesting, involving three primary actors: Microsoft Graph, the Microsoft 365 apps, and the Microsoft-managed Azure OpenAI services running the Large Language Model (LLM).

Here's the workflow detailing the backend processes:

  1. The user initiates a prompt within an application, such as Word, Outlook, or PowerPoint.
  2. Microsoft assembles the user's business context, taking into account their permissions within Microsoft 365.
  3. The prompt is forwarded to the Large Language Model (LLM), for instance, GPT-4, to generate a response.
  4. Microsoft conducts post-processing and responsible AI checks.
  5. Microsoft generates a response and sends it back to the Microsoft 365 application.
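The five steps above can be sketched as a tiny pipeline of stub functions. Everything here is illustrative: the function names and the in-memory "tenant" are ours, not Microsoft's internals, but the shape of the flow matches the workflow described.

```python
# A toy tenant: each document lists which users may read it.
TENANT_DOCS = [
    {"name": "q3-forecast.xlsx", "readers": ["alice"]},
    {"name": "all-hands.pptx", "readers": ["alice", "bob"]},
]

def grounding(prompt, user):
    """Step 2: gather only the business context this user may read."""
    accessible = [d for d in TENANT_DOCS if user in d["readers"]]
    return {"prompt": prompt, "context": accessible}

def llm_generate(grounded):
    """Step 3: forward prompt plus grounded context to the LLM (stubbed)."""
    return f"Answer based on {len(grounded['context'])} document(s)."

def post_process(draft):
    """Step 4: responsible AI checks before anything reaches the app."""
    blocklist = ["password"]
    return "[filtered]" if any(w in draft.lower() for w in blocklist) else draft

def copilot_pipeline(prompt, user):
    """Steps 1-5 end to end: app -> grounding -> LLM -> checks -> app."""
    return post_process(llm_generate(grounding(prompt, user)))

print(copilot_pipeline("Summarise my recent files", "bob"))
# Answer based on 1 document(s).
```

Note how Bob's answer is grounded on only one of the two documents: the grounding step, not the LLM, is where his permissions take effect.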

Data, Privacy, and Security for Microsoft 365 Copilot

  • From a security perspective, Copilot does not use any of your enterprise data for the training of Copilot LLM. There is no need for concern regarding the possibility of your company data appearing in responses to users from different tenants.
  • Microsoft 365 Copilot doesn't change existing data processing and residency commitments that are applicable to Microsoft 365 tenants. For European Union clients, Microsoft has additional safeguards to comply with the European data boundary: EU traffic stays within the EU Data Boundary while worldwide traffic can be sent to the EU and other countries or regions for LLM processing.
  • The permissions model within your Microsoft 365 tenant can help ensure that data won't unintentionally leak between users, groups, and tenants. Microsoft 365 Copilot presents only data that each individual can access using the same underlying controls for data access used in other Microsoft 365 services.
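That permission-trimming behaviour can be sketched offline: expand group grants into a flat set of readers, then let Copilot ground only on items the requesting user can already read. The grant shapes and names here are illustrative, not the Microsoft Graph schema.

```python
def effective_readers(grants, group_members):
    """Flatten direct user grants and group grants into a set of user ids."""
    users = set()
    for grant in grants:
        if grant["type"] == "user":
            users.add(grant["id"])
        elif grant["type"] == "group":
            users.update(group_members.get(grant["id"], []))
    return users

def copilot_visible(items, user, group_members):
    """Only items the user can already read are eligible for grounding."""
    return [i["name"] for i in items
            if user in effective_readers(i["grants"], group_members)]

group_members = {"finance-team": ["alice", "carol"]}
items = [
    {"name": "salaries.xlsx",
     "grants": [{"type": "group", "id": "finance-team"}]},
    {"name": "handbook.pdf",
     "grants": [{"type": "user", "id": "bob"},
                {"type": "group", "id": "finance-team"}]},
]
print(copilot_visible(items, "bob", group_members))  # ['handbook.pdf']
```

Bob never sees the salary sheet because he is neither granted access directly nor a member of finance-team; Copilot inherits exactly this kind of evaluation from the underlying Microsoft 365 access controls.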

The full list of data privacy controls can be found in the Microsoft Copilot documentation.

Understanding the Risks with Microsoft 365 Copilot

Just like any new technology, there are inherent risks linked to the adoption of Microsoft 365 Copilot. It's imperative to assess these risks and make appropriate preparations prior to implementing such a game changing product within enterprise environments.

Oversharing and Permissions

Any IT professional who has experience with Azure AD and Microsoft 365 is well aware of the complicated nature of permissions management within complex Microsoft 365 environments. This complexity comes from various factors, including the differentiation between regular users, guest users, external users, Entra ID permissions, Microsoft 365 and SharePoint permissions, and group delegations, to name just a few.

Making matters worse is the lack of a Zero Trust and least-privilege framework in the majority of enterprises. Consequently, a significant portion of users often possess excessive permissions within the tenant.

This was also highlighted in Microsoft's State of Cloud Permissions report (worth a read!):

Microsoft's State of Cloud Permissions top findings

Absence of Proper Labeling or Inaccurate Labeling

Much like the challenge posed by the absence of Zero Trust implementation within the tenant, the issue of insufficient or inaccurate labeling represents another concern for Microsoft 365 Copilot. Labeling tasks are entrusted to enterprise users, and, for instance, a user lacking security awareness might inadvertently classify a file as "public" when it, in fact, contains highly sensitive information, hence creating a serious cybersecurity risk.
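A simple pre-Copilot audit can surface exactly this kind of mislabeling. The sketch below flags files labeled "Public" whose text matches patterns that usually indicate sensitive content; the patterns, labels, and file shapes are illustrative assumptions, not a Purview feature.

```python
import re

# Illustrative detectors: patterns that usually indicate sensitive content.
SENSITIVE_PATTERNS = {
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def mislabel_candidates(files):
    """Return (name, reason) pairs for Public-labeled files whose text
    matches a sensitive-content pattern."""
    hits = []
    for f in files:
        if f["label"].lower() != "public":
            continue
        for reason, pattern in SENSITIVE_PATTERNS.items():
            if pattern.search(f["text"]):
                hits.append((f["name"], reason))
    return hits

files = [
    {"name": "brochure.docx", "label": "Public", "text": "Visit our stand!"},
    {"name": "payroll.xlsx", "label": "Public",
     "text": "Card 4111 1111 1111 1111 on file"},
]
print(mislabel_candidates(files))  # [('payroll.xlsx', 'credit card')]
```

Anything this kind of sweep flags deserves a human review of its label before Copilot starts surfacing it in answers.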

The process of labeling can swiftly become complex within enterprise environments, and as the number of labels deployed by an organisation increases, so does this complexity.

Preparing for Microsoft 365 Copilot

Review your Teams & SharePoint Sites

To optimize the effectiveness of Microsoft 365 Copilot, it's essential to conduct a comprehensive assessment of your existing Teams and SharePoint sites and their permissions. Over time, organisations tend to accumulate outdated or unused sites, which can clutter search results and potentially reduce Copilot's efficiency and result in oversharing. Identifying and retiring these obsolete sites is a vital step to ensure that Copilot delivers relevant and valuable information.
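One way to start that review is to flag sites with no recent activity as archiving candidates. The sketch below assumes you have already exported site metadata (for example from the SharePoint admin centre) into simple records; the field names and the 180-day threshold are our own assumptions.

```python
from datetime import date, timedelta

def stale_sites(sites, today, max_idle_days=180):
    """Return the URLs of sites with no activity in the last
    `max_idle_days` days, sorted for stable review lists."""
    cutoff = today - timedelta(days=max_idle_days)
    return sorted(s["url"] for s in sites if s["last_activity"] < cutoff)

sites = [
    {"url": "/sites/finance", "last_activity": date(2023, 9, 1)},
    {"url": "/sites/2019-offsite", "last_activity": date(2019, 11, 2)},
]
print(stale_sites(sites, today=date(2023, 10, 1)))
# ['/sites/2019-offsite']
```

Every retired site is one less stale source Copilot can ground an answer on, and one less place for forgotten over-broad permissions to linger.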

Furthermore, consider the evaluation of document sharing practices and the implementation of retention policies. These actions will further streamline your Microsoft 365 environment, increasing both security and the overall user experience. Keep an eye out for Microsoft Syntex's SharePoint Advanced Management (SAM), a tool released this year, designed to facilitate the management and governance of SharePoint and OneDrive, as well as to support secure content collaboration.

Improve your overall tenant security posture

Prioritising security is crucial when preparing for Microsoft 365 Copilot. Conduct a comprehensive security audit of your tenant, with a focus on Teams and site owners. This assessment serves to pinpoint potential vulnerabilities and ensure that appropriate security measures are in place.

As you dive deeper into data protection, consider implementing advanced compliance tools such as Microsoft Purview, which offers protection through features like Sensitivity Labels and Data Loss Prevention (DLP) policies. By using these resources, you can secure your sensitive data and protect your crown jewels from unauthorised access or inadvertent exposure, thereby enhancing the overall security posture of your entire Microsoft 365 environment.

Semantic Index for Copilot (Coming Soon!)

Semantic Index is a robust mapping tool that collaborates with Copilot and Microsoft Graph to correlate your user and company data, uncovering relationships and vital connections. Leveraging a Large Language Model (LLM), the Semantic Index delivers the most relevant and actionable results when interacting with Microsoft 365 Copilot.

This forthcoming feature will play a very important role in verifying the accuracy and value of responses provided by Copilot. Stay tuned for further details and updates on how Semantic Index can enhance your Copilot experience.


Stay tuned for the next articles of the 'Generative AI Security' series!



