Demystifying Microsoft 365 Copilot

In the ever-evolving landscape of digital tools, the M365 Copilot has emerged as a game-changer for enhancing productivity and streamlining workflows. However, with its rise in popularity, several rumors have surfaced about its functionalities and capabilities. In this article, we will demystify these rumors and shed light on the true nature of the M365 Copilot.

Defining the Microsoft 365 Copilot

The M365 Copilot is an experience that uses generative AI to assist humans with complex cognitive tasks. You can think of it as an advanced AI-powered assistant integrated into the Microsoft 365 suite. It is designed to assist users by automating repetitive tasks, providing intelligent suggestions, and enhancing overall productivity. Copilot works seamlessly across the Microsoft 365 applications, including Word, Excel, PowerPoint, and Teams, to name a few.


Figure: The Anatomy of M365 Copilot (simplified)

The elements of Microsoft 365 Copilot

It is important to understand that M365 Copilot is not an LLM or a single application. It is an extendable orchestration engine consisting of the following elements:

User Experience: The user experience with Microsoft 365 Copilot is designed to be seamless and intuitive. The user interface of M365 Copilot is integrated directly into Microsoft 365 applications like Word, Excel, PowerPoint, Outlook, and Teams. It leverages data from these applications, as well as from more than 1,200 plugins, including Adobe, SAP, and Workday. This integration allows Copilot to assist with various tasks, from drafting emails to generating reports, all while maintaining a consistent and user-friendly interface.

Orchestrator (Orchestration Engine): The orchestration engine is responsible for overseeing and synchronizing Copilot's operations. It ensures that the right skills and actions are executed based on the user's input (prompts). The orchestrator manages the flow of information and tasks, making sure that Copilot provides relevant and actionable responses.

Knowledge (Grounding & Memory): Microsoft 365 Copilot uses grounding to improve the specificity of user prompts. Grounding involves using data from Microsoft Graph, emails, chats, documents, and other sources that the user has permission to access. This ensures that the responses are contextually relevant and actionable. Copilot does not access data that the user does not have permission to view, maintaining strict data access controls.

Skills (Action, Triggers & Workflows): Copilot's skills include understanding, summarizing, predicting, recalling, translating, and generating content. These skills are powered by actions, triggers, and workflows. Actions are specific tasks that Copilot can perform, such as sending emails or updating records. Triggers are events that initiate these actions, and workflows are sequences of actions that automate complex processes. This combination allows Copilot to handle a wide range of tasks efficiently.

Foundation Models: Microsoft 365 Copilot uses a combination of foundation models, such as GPT-4 Turbo, to match the specific needs of each feature. These models are continuously evaluated and aligned with the capabilities required for different tasks, ensuring that Copilot can provide accurate and contextually relevant responses. The foundation models are the backbone of Copilot's ability to understand and generate human-like text.
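To make the interplay of these elements a little more tangible, here is a minimal, purely conceptual sketch of an orchestration loop in Python. Every class and function in it is a hypothetical placeholder standing in for internal Microsoft services, not an actual API; it only shows the order in which grounding, generation, and post-processing could be wired together.

```python
# Purely illustrative sketch of how an orchestration engine could wire the
# elements together. Every function here is a hypothetical stand-in, not a
# Microsoft API; the real M365 Copilot orchestrator is an internal service.

from dataclasses import dataclass, field


@dataclass
class CopilotTurn:
    user_prompt: str
    grounding: list = field(default_factory=list)  # permission-trimmed snippets
    response: str = ""


def ground(turn: CopilotTurn) -> None:
    # Knowledge: pull only content the signed-in user may access (stubbed here).
    turn.grounding = [f"snippet relevant to: {turn.user_prompt}"]


def generate(turn: CopilotTurn) -> None:
    # Foundation model: combine prompt and grounding into a draft answer (stubbed).
    turn.response = f"Draft answer using {len(turn.grounding)} grounded snippet(s)."


def post_process(turn: CopilotTurn) -> str:
    # Skills / compliance: Responsible AI checks, labels, and auditing would run here.
    return turn.response + " [RAI-checked]"


def orchestrate(user_prompt: str) -> str:
    turn = CopilotTurn(user_prompt)
    ground(turn)               # grounding & memory
    generate(turn)             # foundation models
    return post_process(turn)  # skills, triggers & workflows wrap the result


if __name__ == "__main__":
    print(orchestrate("Summarize my open action items from last week"))
```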


How Microsoft 365 Copilot works: the logical architecture

Before we start, it is important to understand the Microsoft 365 service boundary in the context of M365 Copilot. The service boundary refers to the limits within which Copilot operates to ensure data security, privacy, and compliance. Here are some key points:

  • Data Access: Copilot only accesses and displays organizational data to users who have at least view permissions. This means that even if you hold a global admin or audit role, Copilot will not surface other users' data (files, emails). It exclusively searches and utilizes the current user's Microsoft 365 cloud content within their own tenant; it does not reach into other tenants where the user may be a B2B guest, or into other users' tenants via cross-tenant access or sync.
  • Compliance: All prompts, retrieved data, and generated responses are kept within the Microsoft 365 compliance boundary, adhering to existing data security and compliance commitments. The Microsoft Trust Center provides detailed information on how Microsoft handles data security and compliance, including how Copilot adheres to these commitments.
  • EU Boundaries: For EU customers, Microsoft 365 Copilot is an EU Data Boundary service, which means processing of organizational data outside EU boundaries is restricted in line with privacy regulations.
  • Privacy and Security: When using Copilot, your prompts, the data retrieved, and the results stay within the Microsoft 365 service boundary, following Microsoft's privacy, security, and compliance commitments. Copilot uses the Azure OpenAI Service, not OpenAI's public services, so all processing stays within the Microsoft 365 service boundary.

For more detailed information, you can visit the "Security and Compliance" section of the Microsoft Trust Center. But now let's get started.



Stage 1: User prompt

M365 Copilot starts to work when it receives an input prompt from a user in a Microsoft 365 app, like Teams, Word, or PowerPoint. Note: since security is a top priority, all data flow is encrypted in transit via HTTPS and wss://.



Stage 2: Copilot pre-processes the input prompt using grounding

Grounding improves the specificity of the prompt, and helps you get answers that are relevant and actionable to the specific task. The prompt can include text from input files or other content Copilot discovers.

Copilot only accesses data that an individual user is authorized to access, based on, for example, existing Microsoft 365 role-based access controls. Copilot doesn't access data that the user doesn't have permission to access.

To learn more, see Data, Privacy, and Security for Microsoft 365 Copilot.
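To illustrate what permission-trimmed grounding means in practice, here is a minimal sketch that queries the Microsoft Graph Search API, which always evaluates results against the signed-in user's permissions. It demonstrates the principle rather than Copilot's internal retrieval (Copilot grounds prompts via Microsoft Graph and its semantic index internally); how the access token is acquired is assumed to be handled elsewhere, for example with MSAL.

```python
# Illustration of permission-trimmed retrieval via the Microsoft Graph Search API.
# This demonstrates the principle Copilot relies on (results are trimmed to what
# the signed-in user can access); it is not Copilot's internal retrieval pipeline.

import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"


def search_my_content(access_token: str, query: str) -> list:
    """Search files the signed-in user is allowed to see (permission-trimmed)."""
    body = {
        "requests": [
            {
                "entityTypes": ["driveItem"],  # files; emails (message) would be a separate query
                "query": {"queryString": query},
                "size": 5,
            }
        ]
    }
    resp = requests.post(
        GRAPH_SEARCH_URL,
        json=body,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    hits = []
    for container in resp.json().get("value", []):
        for hit_container in container.get("hitsContainers", []):
            hits.extend(hit_container.get("hits", []))
    return hits


# Usage (token acquisition via MSAL or similar is assumed):
# results = search_my_content(token, "Q3 budget review")
```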



Stage 3: Grounding data and modified prompts are sent to the large language models

Please be aware of the following things about the LLMs inside M365 Copilot:

1. The LLM components sit inside the Microsoft 365 service boundary.

2. Prompts, responses, and grounding data are NOT used to train the foundation models. Microsoft does not use them for advertising or share customer data with third parties.

3. This private instance of Azure OpenAI is maintained by Microsoft. OpenAI does NOT have access to your data or to the models.
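For readers who want to picture what a call to a private Azure OpenAI deployment looks like in general, here is a minimal sketch using the openai Python package against an Azure OpenAI resource, with grounding snippets injected alongside the user prompt. This is the generic pattern, not the actual service-to-service call Copilot makes; the endpoint, deployment name, and key handling are placeholder assumptions.

```python
# Generic illustration of a grounded call to a private Azure OpenAI deployment.
# Copilot's internal service-to-service calls are not exposed; endpoint,
# deployment name, and credentials here are placeholders.

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)


def grounded_answer(user_prompt: str, grounding_snippets: list) -> str:
    """Send the user prompt plus retrieved context to the model and return the answer."""
    context = "\n\n".join(grounding_snippets)
    completion = client.chat.completions.create(
        model="gpt-4-turbo-deployment",  # your Azure deployment name (placeholder)
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context.\n\nContext:\n" + context},
            {"role": "user", "content": user_prompt},
        ],
        temperature=0.2,
    )
    return completion.choices[0].message.content
```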



Stage 4: Retrieval-augmented generation

Retrieval-augmented generation (RAG) allows Copilot to provide exactly the right type of information as input to an LLM, combining user data with other inputs such as information retrieved from knowledge base articles to improve the prompt. Stages 2, 3, and 4 can iterate several times before Copilot proceeds to the next stage, post-processing.
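Below is a small, purely illustrative sketch of such an iterative retrieve-then-generate loop. The retrieval, generation, and stopping-rule functions are stubs standing in for the permission-trimmed search and LLM call shown earlier, and the loop bound is an arbitrary simplification.

```python
# Illustrative sketch of an iterative retrieval-augmented generation loop
# (stages 2-4). retrieve() and generate() are stubs standing in for the
# permission-trimmed search and the LLM call shown in the earlier sketches.

def retrieve(query: str) -> list:
    # Stub: would call permission-trimmed search (e.g. Graph / semantic index).
    return [f"passage about '{query}'"]


def generate(prompt: str, passages: list) -> str:
    # Stub: would call the LLM with the prompt plus retrieved passages.
    return f"answer to '{prompt}' grounded in {len(passages)} passage(s)"


def needs_more_context(answer: str) -> bool:
    # Stub stopping rule: a real orchestrator decides whether to iterate again.
    return False


def rag_loop(user_prompt: str, max_rounds: int = 3) -> str:
    query = user_prompt
    answer = ""
    for _ in range(max_rounds):
        passages = retrieve(query)                 # stage 2: grounding
        answer = generate(user_prompt, passages)   # stages 3-4: grounded generation
        if not needs_more_context(answer):
            break
        query = answer                             # refine the next retrieval round
    return answer


print(rag_loop("What are my open action items?"))
```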



Stage 5: Post-processing

Post-processing here includes Responsible AI (RAI) checks (which are automated and don't involve any human review at Microsoft), additional calls to Microsoft Graph for security, compliance, and privacy reviews, and command generation. Tenant admins have access to Microsoft Purview capabilities. To enable that, Copilot applies sensitivity labels to the content and responses coming through, and records information for auditing, eDiscovery, and similar scenarios.
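As a purely conceptual illustration of the ordering involved, the sketch below models post-processing as a small pipeline. Every function and label here is a hypothetical placeholder; the real Responsible AI checks, sensitivity labeling, and audit trail are handled by internal Microsoft and Purview services.

```python
# Conceptual post-processing pipeline (stage 5). Every step is a hypothetical
# placeholder; the real Responsible AI, labeling, and audit mechanisms are
# internal Microsoft / Purview services.

LABEL_PRIORITY = {"General": 0, "Confidential": 1, "Highly Confidential": 2}


def rai_check(text: str) -> str:
    # Placeholder: automated Responsible AI filtering, no human review.
    return text


def inherit_sensitivity_label(text: str, source_labels: list) -> tuple:
    # Placeholder: the response inherits the most restrictive source label.
    label = max(source_labels, key=lambda l: LABEL_PRIORITY.get(l, 0), default="General")
    return text, label


def write_audit_record(prompt: str, response: str) -> None:
    # Placeholder: the interaction is recorded for auditing / eDiscovery.
    print(f"audit: prompt={prompt!r}, response_length={len(response)}")


def post_process(prompt: str, draft: str, source_labels: list) -> str:
    checked = rai_check(draft)
    labeled, label = inherit_sensitivity_label(checked, source_labels)
    write_audit_record(prompt, labeled)
    return f"[{label}] {labeled}"


print(post_process("status update", "Here is the summary...", ["Confidential"]))
```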



Stage 6: Response

In stage 6 the user prompt, the response sent back to the user, and the citation links (or files that were accessed to generate that response) are stored. Please note that this information is stored similarly to how and where your mailbox data and other M365 content like Teams chats are stored. The data at rest is encrypted per Microsoft's M365 data commitments. Storing the user utterance and Copilot's response enables features like sessions/history that improve the user experience.
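To visualize what such a stored interaction might contain, here is a hypothetical record shape. The field names are illustrative only and do not reflect Microsoft's internal storage schema.

```python
# Hypothetical shape of a stored Copilot interaction (stage 6). Field names are
# illustrative only and do not reflect Microsoft's internal storage schema.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class CopilotInteraction:
    user_id: str
    prompt: str                                     # the user utterance
    response: str                                   # the answer returned to the user
    citations: list = field(default_factory=list)   # links/files used for grounding
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


record = CopilotInteraction(
    user_id="user@contoso.com",
    prompt="Summarize yesterday's project sync",
    response="The team agreed on three action items...",
    citations=["https://contoso.sharepoint.com/sites/project/notes.docx"],
)
print(record.timestamp.isoformat(), record.prompt)
```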

In summary, Copilot iteratively processes and orchestrates these services to produce results that are relevant to your business, because they are contextually grounded in your organization's data, the latest information from the web, and third-party extensions, all available from your familiar client apps.

And to tie it all back together: your data is encrypted at rest and in transit to keep it safe and compliant, and to give your admins the capability to manage your tenant's data securely.

Conclusion:

I hope this article adds the needed transparency on how M365 Copilot works. It is clear that M365 Copilot is designed with a robust orchestration engine at its heart, ensuring that user data is handled securely and compliantly. As organizations continue to navigate the complexities of digital transformation, M365 Copilot will undoubtedly play a key role in streamlining workflows and driving productivity.
