AI without filter - Use Copilot for M365 to prepare your organization for AI
Note: This is the third article in my "AI without filter" series.
First, the article "From ChatGPT to Superintelligence" will "as simple as possible, but not simpler" introduce you to what I believe are the most important AI terms and describe what we - as humans and developers - need to consider when using AI.
Second, the article "From ChatGPT to Prompt Engineering" will focus on how you "train" a Large Language Model (LLM), like GPT, using Prompt Engineering techniques with Azure OpenAI.
Third, this article will illustrate how a successful Copilot for M365 implementation could be the best way to prepare your organization for AI.
Summary
Facts and Hypotheses
Inspired by "Four Weddings and a Funeral", I will begin with some facts and hypotheses that may challenge the reader. Let me know in the comments if you disagree (or agree).
Six Facts:
Four Hypotheses:
Three Facts:
Two Hypotheses:
The elements of a generic AI/LLM project
As the foundation for any successful AI/LLM project, you will need ...
#1: A use case that makes sense from a business perspective
It sounds simple, but in many cases it is the most difficult task, and if the use case is not selected wisely, the project will likely fail. This includes finding the relevant business processes and identifying the personas involved.
For M365 Copilot: The use case (#1) is obvious: To empower every person and every organization on the planet to achieve more - with any M365 scenario.
Note that Copilot for M365 is relevant for any business process and any persona that involves creating, completing or analyzing M365 data, including emails, Teams chats, and Word, Excel or PowerPoint documents.
#2: Data, lots of data – a LLM (Large Language Model)
The LLM could be a general-purpose model, like GPT or LLaMA, but it could also be trained on domain-specific data (healthcare, pharma, legal, financial, retail, sales etc.), or it could be trained on code (like GitHub Copilot).
Or it could be something completely different, like DNA-based LLMs.
For M365 Copilot: The core model (the LLM) is a modified version of GPT, accessed using Azure OpenAI.
#3: People that are experts in the domain
Experts who can
For M365 Copilot: Microsoft will ground (#3a) Copilot for M365 for a specific user with the data that user has access to – including emails, files, conversations etc. Copilot is also trained/pre-prompted (#3b) by Microsoft, including a validation of every prompt and response by RAI (Responsible AI) tools and processes to ensure fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability.
#4: Adoption - build user prompting skills (for this LLM/domain)
Users should be educated to utilize the AI solution, including how to interact (prompt) with the LLM.
Users should also understand that AI based on LLMs is not “magic” but pure mathematics, and that the response from the LLM is not always “right”; it is simply what seems statistically reasonable based on existing data.
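The statistical nature of LLM output can be illustrated in a few lines of Python. This is a minimal sketch, not Copilot's actual implementation: the candidate tokens and their scores are made up, but the softmax-and-sample mechanism shown here is the basic reason the same prompt can yield different answers on different runs.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw model scores (logits) into a probability distribution."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                           # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token candidates with made-up scores.
tokens = ["Paris", "London", "Rome"]
logits = [3.2, 1.1, 0.4]

probs = softmax(logits)
print(dict(zip(tokens, probs)))

# The model does not "know" the answer; it samples from the distribution.
# Sampling (rather than always taking the top token) is why repeating the
# same question can produce a different response.
choice = random.choices(tokens, weights=probs, k=1)[0]
```

The highest-scoring token is the most likely pick, but never a certainty, which is exactly the "statistically reasonable, not always right" behavior users should expect.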
For M365 Copilot: You only need to focus on the adoption (#4)!
About Copilot - The very short version
You probably already know this but just to be sure ...
Copilot for M365 is a cloud-based service that helps you write, edit, and format documents, emails, presentations, and other types of content with the power of natural language understanding and generation. Copilot for M365 uses state-of-the-art machine learning models that are trained on large amounts of public data, such as Wikipedia, books, news articles, and web pages. Copilot for M365 does not use your prompts or any data from your organization for training its models. Your data is only used to provide relevant suggestions and insights based on your context and preferences.
Copilot for M365 protects your data and privacy by using encryption, authentication, and authorization mechanisms. All data exchanged between your device and the Copilot for M365 service is encrypted in transit and at rest. Only authorized users can access the Copilot for M365 service and use its features. Copilot for M365 also complies with the Microsoft Privacy Statement and the Microsoft Online Services Terms, which outline how Microsoft collects, uses, and protects your data.
Copilot implementation
The technical implementation of Copilot for M365 is simple, probably one of the easiest you will experience. Watch the video "How to get ready for Copilot for Microsoft 365" if you want a quick overview.
Before and After the Copilot Adoption project
As described earlier, use case, data, grounding and pre-prompting are already done so you can focus on the adoption.
However, even though it is not directly related to Copilot for M365, it is strongly recommended that you start with your data governance.
Similarly, many organizations will over time see a business need to integrate non-M365 data.
Pre-adoption: Data Governance
Data governance is a key aspect of a successful Copilot for M365 project. It ensures that the data used by the Copilot solution is of high quality, secure, and compliant with your requirements and obligations. This builds trust and improves the quality of Copilot's results.
But data governance is not easy. Sometimes we share more data than we need to, on purpose or by mistake. This is called oversharing.
Oversharing - or data governance in general - is not a Copilot for M365 specific issue, but Copilot for M365 will likely expose potential problems like oversharing, as Copilot bases conversations on the data the user has access to.
However, if you overshare today, the risk of people seeing data they should not see is already real. Anyone with a little technical insight into M365 can find these data without Copilot.
Note: If you overshare today, people are likely already seeing data they should not have access to - also without Copilot.
You will find tools that may help you improve your data governance, but they are not magic solutions that will fix the problem automatically. They require careful planning, implementation, and monitoring to ensure that your data is properly classified, protected, and governed according to your policies and compliance requirements.
Remember: A Fool with a Tool is still a Fool
Microsoft offers tools like Microsoft 365 E3/E5 and Microsoft Purview that can help you with data governance by providing features such as Data Loss Prevention (DLP), Sensitivity Labels, Information Barriers, Microsoft Information Protection (MIP) and a unified data map of your data assets across your hybrid and multi-cloud environments.
The technology behind this is well-established, and it is recommended that you use a partner with long experience and references. Note that you can choose a separate partner for this, as it is not directly connected to Copilot for M365.
Important: Ensure that the scope of the Data Governance part matches what you want from your Copilot for M365 project.
Reach out to your Microsoft team or to your Microsoft partner if you want more information.
You can also find partner offerings here Microsoft 365 Copilot Partner Directory.
Post-adoption: Extensibility
As illustrated below, you can utilize external data (i.e. non-M365 data) in two ways:
You can also use Microsoft Copilot Studio to build your own copilots.
Recommendation: Don’t focus on extensibility too early. Make Copilot for M365 a success first; e.g. teach users to prompt.
Graph connector, M365 Plugin or AOAI - that is the question
If M365 (data and/or apps) are not central in your AI solution (or UI), you should carefully evaluate using Azure OpenAI instead.
This is especially relevant if you want to ground and/or pre-prompt your LLM to fit your business as you can't do that for Copilot for M365.
Recommendation: It is an architectural decision - so make it deliberately.
As with the data governance part, extensibility is not directly related to the Copilot adoption project, and it is an option to select a different partner here as well.
Under-promise and Over-deliver
As a long-time consultant, I know that you get happy customers if you “under-promise and over-deliver”.
This is also true for a Copilot for M365 project.
The risk of over-promising - or overselling
If you “over-promise” - or oversell - you run a high risk of dissatisfaction and, as a result, low adoption.
I think we have all had “WAOW” moments when using ChatGPT, when AI almost magically understood what we wanted or summarized/rephrased a text better than we ever could have.
I have also had many "WAOW" moments with Copilot for M365 when it understood the context in a complex meeting, even when people did not speak English.
However, we have likely also all had “Hmmm” moments where the AI tool did not help or even hallucinated or was wrong.
If the Copilot for M365 users only expect “WAOW” moments, they will be disappointed.
I know it is very easy to overpromise (or oversell) when you see videos like the ones below, but it is critical to set the expectations right.
Note that these videos (and many others) show the Copilot vision. They are absolutely technically realistic, and I am confident we will get there, perhaps sooner than anyone expects.
However, they require a level of data governance and data quality that few organizations have today. They also require users who are very experienced in prompting.
It is a common pattern for emerging technologies: they often disappoint us in the short run, but surprise us in the long run with their transformative potential. AI is likely no exception to this rule.
AI - and Copilot - is not a magic bullet that can solve all our problems, but a powerful tool that can help us achieve our goals. That's why I believe that Copilot can offer immediate value to all users, if you make it simple and focus on a few quick wins - see my suggestions below.
Users will naturally experiment with new prompts if they have a positive experience and see the benefits. In other words, they will begin "freestyle prompting" – and discover what is effective and what is not.
Also, keep in mind that some of the features that are not available today might become functional over time as Microsoft continues to improve and expand its capabilities!
No need to “over-promise” - just look at the ROI
The list price for Copilot for M365 is $30 per user per month, and the implementation project will be one of the easiest you will ever experience.
This is what you should use to build your business case and your ROI.
To be successful, you will need an adoption project to make users achieve the full value of Copilot for M365, as it is likely their first AI project.
Most organizations will need help here (and it comes with a cost), but as I see it, this is more an investment in AI readiness than in implementing a “tool”.
Why implement Copilot for M365 in a team?
Copilot works best with teams that collaborate frequently and use M365 data sources.
Copilot can improve the productivity, quality, and consistency of your team's communication and documentation and can help your team save time, avoid errors, and focus on the most important tasks.
Copilot can enhance the collaboration and engagement of your team members by providing feedback, suggestions, and insights.
Choose your Copilot users wisely
If you decide to do a phased implementation of Copilot, it is recommended to assign Copilot licenses to a complete group of people that naturally work together, like a department or project team.
It is also important to select the initial Copilot users among people that work with M365 data like meeting recaps and documents.
You should also select people that commit to use Copilot, even in situations where it is as easy to do it without Copilot, but for the long(er) term productivity improvement.
Personas that primarily write short emails or chat messages or participate in meetings, like many CxOs, might not be the best choice for a pilot implementation.
However, most CxOs will want it immediately when they see the value of the Teams recap capability.
Focus on TWO basic scenarios
My recommendation is to start with two very basic scenarios that will
Scenario 1: Teams meeting recap actions (The "ROI")
Most people can skip one or more meetings every month and just read the recap(s) or even ask relevant questions to Copilot about anything discussed in the meeting, including presentations, chat or voice.
These time savings will in many situations more than cover the $30 license cost of Copilot.
Last, but not least, it requires almost no formal training, besides prompting, if you just remember to record the meeting.
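A back-of-the-envelope break-even calculation makes this concrete. Everything in the sketch below except the $30 list price is an illustrative assumption; plug in your own numbers.

```python
# Back-of-the-envelope ROI sketch for Scenario 1 (meeting recaps).
# All numbers except the $30 list price are illustrative assumptions.
LICENSE_COST = 30.0        # USD per user per month (list price)
hourly_cost = 60.0         # assumed fully loaded cost of one work hour
meetings_skipped = 2       # assumed meetings skipped per month
hours_per_meeting = 1.0    # assumed length of each skipped meeting
recap_overhead = 0.25      # assumed hours spent reading the recap instead

hours_saved = meetings_skipped * (hours_per_meeting - recap_overhead)
value_saved = hours_saved * hourly_cost
roi = (value_saved - LICENSE_COST) / LICENSE_COST

print(f"Hours saved: {hours_saved:.2f}, value: ${value_saved:.2f}, ROI: {roi:.0%}")
```

With these assumptions, skipping just two one-hour meetings per month already returns several times the license cost; the point is that the break-even bar is low, not that these exact numbers apply to your organization.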
Scenario 2: "ChatGPT" like functionality (The "Compliance")
Many people are already using ChatGPT or other web-based "free" tools for
Even if users typically avoid confidential data like customer names, product prices or plans, the "prompts" and accepted responses may include elements that could be valuable as "training data" for the LLM provider.
Like ChatGPT, Copilot for M365 is LLM-based but with these important additions
Copilot for M365 respects your privacy and security and gives you full control and governance over your data. You can choose what data to share with Copilot, how to store and manage it, and who to grant access to it. You can also monitor and audit the usage and performance of Copilot and ensure compliance with your policies and regulations.
Like Scenario #1, it is also very easy to implement as most users already have experience with ChatGPT.
Beyond the TWO: "Free-style" Prompting (The "Curiosity")
When users master these two scenarios, they will automatically start exploring new options, and this is when real adoption happens.
This will require formal training, super users, sharing experiences etc. as described in the Adoption project later.
It is not just me ...
No, it is not :) Just look at the example below from Eric Gourmelen, Vice President, CTO Global ECS Cloud at Arrow Electronics, in his article Copilot Adoption Race, where he states "The ROI is immediate" and "Using a free service with your company data is just suicidal!"
You can also read Can SMB's afford Microsoft 365 Copilot? | ROI breakdown.
The Art of Selecting the right Copilot Adoption partner
You need a partner that you trust, have a relationship with, and that has the right skills. This is more complicated here because the technology you need help with is very new.
Copilot for M365 has only been available for most customers and partners since January 15, 2024, so you can't assume they have a lot of experience.
Here are some tips for you in this process.
For the change management aspect, many partners in the modern workplace field will have a lot of experience, some with ADKAR, but you should request evidence of experience and references.
The Prompt Engineering part is brand new, especially in a Copilot context.
As previously mentioned, I don’t believe users can learn to master prompting through training alone, and similarly I don't believe anyone can train people in prompting if they don’t do it themselves as a core part of their work.
To master prompting, you must do it all the time.
I suggest you ask these three simple questions ...
Adoption - the Change Management
Change management is the process of helping users adopt new behaviors and skills that support the adoption of a new technology or solution like Copilot for M365.
One of the most widely used models for change management is ADKAR, which stands for Awareness, Desire, Knowledge, Ability, and Reinforcement. ADKAR can be used in a Copilot for M365 project to guide the change management activities and measure the outcomes of the project.
In the sections below, I will use ADKAR as an example, but you can obviously use any other change management method.
ADKAR Step #1: Awareness
The first step of ADKAR is to create awareness among the users about the need for change and the benefits of using Copilot for M365. This can be done by communicating the vision, goals, and benefits of the project, as well as the risks and costs of not changing. Awareness can be created through various channels, such as emails, newsletters, webinars, videos, or presentations.
It is recommended to also share the wider AI vision for the organization and that Copilot is an important milestone in achieving it.
It is highly recommended to introduce change agents (or superusers) that can help individuals move through these stages by providing information, support, and guidance. They can help build awareness of the need for change, create a desire to participate and support the change, provide knowledge on how to change, develop the ability to implement required skills and behaviors, and reinforce the change to make it stick.
As discussed previously, be careful not to oversell Copilot, but focus on the selected basic scenarios, like Teams meetings recaps and ChatGPT-like functionality in your M365 apps, but with enterprise-grade security, privacy, and compliance.
ADKAR Step #2: Desire
The second step of ADKAR is to generate desire among the users to participate in the change and use Copilot for M365. This can be done by addressing the concerns and questions of the users, as well as involving them in the planning and decision-making process. Desire can also be influenced by providing incentives, recognition, or rewards for using Copilot for M365.
Don't forget the "WIIFM effect" - What’s in it for me?
Having Copilot (and AI/prompting) experience can be a valuable asset for many users, and they will all want to include it on their resume.
Fun fact: Perhaps the most telling data point of all: 77% of the Copilot users Microsoft surveyed during the early program said they would choose Copilot over a free weekly lunch. That says something.
Stakeholders (or sponsors) are critically important during the Desire phase of the ADKAR model.
Stakeholders may also have concerns themselves about the change. It is important to address these concerns and provide them with the necessary information to help them understand the change and its benefits.
Once stakeholders are convinced that the change is necessary and beneficial, they are more likely to support the change and work towards achieving it.
The stakeholders will also be critical as sponsors on an ongoing basis, including ensuring that the project gets the resources (including funding) needed to make it a success.
ADKAR Step #3: Knowledge
The third step of ADKAR is to provide knowledge to the users on how to use Copilot for M365 and how it will impact their work processes and outcomes. This can be done by providing training, documentation, tutorials, or demos on how to use Copilot for M365 and its features. Knowledge can also be enhanced by providing feedback, coaching, or mentoring to the users.
Start with the selected simple scenarios, but encourage users to experiment, for example by using Copilot Lab, which all Copilot users have access to.
As illustrated in the well-known “What the customer really wanted” picture, it is NOT easy to give clear guidance, and it does not make it easier that you are not talking to a person, but to an AI.
Help users see Copilot as a process and motivate them to keep going.
They should also understand that Copilot's answers are NOT always accurate and that they may get a different answer if they repeat the same question.
A personal observation: I recently met with one of our high-potential ISVs (Templafy), and we talked about a future “fireside chat”. The next day, I got an invitation to an internal “fireside chat” that had nothing to do with Templafy.
A few days later, when I asked Copilot for a summary of the activity on Templafy, the internal “fireside chat” was included. In other words, Copilot was wrong.
(Note that Templafy has just announced their AI Assistant on Times Square)
ADKAR Step #4: Ability
The fourth step of ADKAR is to enable the users to apply their knowledge and use Copilot for M365 effectively and efficiently. This can be done by providing support, guidance, or assistance to the users during the implementation and transition phase. Ability can also be improved by providing opportunities, tools, or resources to practice and improve their skills.
Users can even interact with Copilot Lab to get recommendations for very relevant prompts like these:
Note that users will need encouragement to continue using Copilot, even when it sometimes will evidently be faster to do a task manually. Remember also the famous “We are too busy” cartoon.
A personal reflection: Like many others, I began to work with (prompt) ChatGPT about a year ago, and I quickly saw the potential when it succeeded and felt the disappointment when my prompts failed. I also used it for work-related tasks, but it had limited value as I could only use it for non-sensitive data.
I have now been using Copilot regularly for over a month and I still experience some highs and lows. I appreciate the benefits of Teams recap and of having a "coauthor" (like for this article), seamlessly integrated into Word or Outlook.
However, I still feel annoyed when it doesn't work, especially when I attempt to replicate the actions in one of the many (overhyped) videos that are on the "Net".
Some scenarios that may appear to make sense might not be feasible (yet), given the data that I have right now. It is also important that all users know that.
ADKAR Step #5: Reinforcement
The fifth and final step of ADKAR is to reinforce the change and ensure that the users continue to use Copilot for M365 and achieve the desired results. This can be done by monitoring, measuring, and evaluating the performance and outcomes of the project, as well as celebrating and acknowledging the achievements and successes of the users. Reinforcement can also be done by providing ongoing support, communication, or feedback to the users.
Measurement:
One of the ways to reinforce the change and assess the effectiveness of the project is to measure the usage and adoption of Copilot for M365 by the users. This can be done by using various tools and methods, such as:
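As a sketch of one such measurement, the snippet below computes an "active user" rate from an exported usage report. The CSV layout and column names here are hypothetical; map them to whatever your admin center or Microsoft Graph usage export actually provides.

```python
import csv
import io
from datetime import date, timedelta

# Illustrative usage export. The column names are hypothetical, not the
# actual report schema - adapt them to your real export.
SAMPLE_REPORT = """\
User,LastCopilotActivityDate
alice@contoso.com,2024-03-01
bob@contoso.com,2024-01-15
carol@contoso.com,2024-03-05
dave@contoso.com,
"""

def active_rate(report_csv, today, window_days=28):
    """Share of licensed users with Copilot activity in the last window_days."""
    rows = list(csv.DictReader(io.StringIO(report_csv)))
    cutoff = today - timedelta(days=window_days)
    active = sum(
        1 for r in rows
        if r["LastCopilotActivityDate"]                               # blank = never used
        and date.fromisoformat(r["LastCopilotActivityDate"]) >= cutoff
    )
    return active / len(rows)

print(active_rate(SAMPLE_REPORT, date(2024, 3, 10)))
```

Tracking a simple metric like this per team, per month, gives the Reinforcement phase something concrete to celebrate and lets you spot teams where adoption is stalling.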
Final words from the Author
This was written with strong support from my Copilot "Coauthor", and significantly faster than I would have been able to do it alone, especially for a non-native English speaker and writer like me.
However, every word, every sentence, every recommendation and every hypothesis are mine - and mine alone.
Want to hear more, have feedback/suggestions or need help?
As always, I am very interested in your feedback. Please feel free to add a comment to this article or reach out to me ([email protected]).