How Microsoft AI makes Bing and Copilot work

The digital landscape continues to evolve, reshaping the way we work, communicate and collaborate. Realising the need for tools to maximise efficiency throughout workplace interactions, Microsoft has developed Microsoft 365 Copilot.

The slick videos make everything look simple and glossy, but the reality is far more complicated. In this article, I’ll explain the foundations, why it’s so complex and what you need to understand before welcoming it into your working life.

Foundations of Microsoft Copilot AI

The Copilot system works by using three foundational technologies:

  • Microsoft 365 apps – Word, Excel, PowerPoint, Outlook, Teams and more
  • Microsoft Graph – the home of all your emails, files, meetings, chats and calendar, which provides context for the Copilot system
  • A large language model (LLM) – a generative model capable of understanding and producing human-readable text, built on hundreds of billions of parameters.

By integrating these systems within Microsoft 365, Copilot aims to increase efficiency and reduce the workload across an organisation.

How Copilot works: prompt to result

Copilot works by first receiving a prompt from a user in one of the various Microsoft 365 applications. This prompt is then pre-processed in an approach called grounding.

During grounding, Copilot searches Microsoft Graph for information on relevant topics that can provide context, producing a more relevant prompt for the LLM. It then sends the modified prompt to the LLM. These modifications make the results more accurate, because the LLM has the combined context of the content in your Microsoft Graph and the original prompt.

After the LLM generates a response, the response is grounded with Graph once again to provide more context that can be of use to the user. This final grounding is part of the post-processing stage, which also includes responsible AI checks, security compliance, privacy reviews and command generation.

Finally, Copilot sends a response back to the user and uses the commands it generated in the post-processing to interact with the Microsoft 365 apps.
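Microsoft hasn’t published the orchestrator’s internals, but the prompt-to-result flow described above can be sketched in Python. Everything here – the function names, the prompt format, the command strings – is hypothetical and merely stands in for proprietary components:

```python
# A minimal, hypothetical sketch of the prompt-to-result flow described
# above. Every helper is a stub standing in for a proprietary orchestrator
# component - none of this is a real Microsoft API.

def search_graph(user_id: str, query: str) -> str:
    # Stub: the real system retrieves relevant emails, files, meetings,
    # chats and calendar items from Microsoft Graph.
    return f"(Graph items for {user_id} matching '{query}')"

def call_llm(prompt: str) -> str:
    # Stub: the real system sends the grounded prompt to GPT-4.
    return f"(LLM draft for: {prompt[:40]}...)"

def post_process(user_id: str, draft: str) -> tuple[str, list[str]]:
    # Stub: second grounding pass, plus responsible AI, security,
    # compliance and privacy checks, and app-command generation.
    return draft, ["powerpoint.create_slides(...)"]

def handle_copilot_prompt(user_id: str, user_prompt: str) -> dict:
    # Pre-processing: ground the prompt with context from Microsoft Graph.
    context = search_graph(user_id, user_prompt)
    grounded_prompt = f"Context:\n{context}\n\nRequest:\n{user_prompt}"

    # Processing: send the modified prompt to the LLM.
    draft = call_llm(grounded_prompt)

    # Post-processing: ground again, run checks, generate app commands.
    response, commands = post_process(user_id, draft)

    # Final delivery: the response plus commands for Microsoft 365 apps.
    return {"response": response, "commands": commands}

print(handle_copilot_prompt("user@contoso.com", "Turn this doc into slides"))
```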

How Bing Chat works


Copilot is far more complex than Bing Chat in how it interacts with other applications and draws context from the resources available to it. Bing Chat draws only upon its LLM and does not ground itself in internal resources such as those in Microsoft Graph.

While Bing is capable of answering a user’s questions and citing its sources, it cannot interact with other Microsoft 365 services the way Copilot can. Nor does it follow the same pipeline of pre-processing, processing, post-processing and final delivery of a response.

Instead, Bing Chat simply sends the prompt to its LLM, which is GPT-4.

A generative pre-trained transformer (GPT) is a type of LLM created by the American artificial intelligence research laboratory OpenAI. Bing Chat uses GPT-4 to provide highly nuanced, natural-language responses to a prompt, complete with citations from multiple sources.
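Bing Chat’s pipeline is proprietary, but the basic pattern – a prompt sent straight to a GPT model with no Graph grounding in between – looks like a plain chat-completion call. A sketch using OpenAI’s public Python SDK, purely for illustration:

```python
# Illustration only: a direct, ungrounded call to GPT-4 using OpenAI's
# public Python SDK. Bing Chat's actual pipeline (which also layers in web
# search and citations) is proprietary and not shown here.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What is Microsoft Graph?"}],
)
print(response.choices[0].message.content)
```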

Hallucinations: GPT-4 versus GPT-3.5

GPT-4 use has been limited to a few applications, including Bing Chat, Copilot and ChatGPT Plus. Currently, the most widely available GPT model is GPT-3.5.

Microsoft 365 Copilot uses GPT-4, the most advanced model currently available, which is known to be less prone to “hallucinations”. A hallucination is when the AI generates a factually incorrect response to a prompt, drawing on its biases and on information it was never actually trained on – in effect, making the information up. Hallucinations are most prevalent in older GPT models.

Copilot’s GPT-4 LLM is much less likely than GPT-3.5 to generate a hallucinated response, because its training placed far greater emphasis on factual accuracy. Hallucinations remain possible, though, as Copilot could pull in irrelevant internal data from across your Microsoft Graph.

Microsoft Copilot AI grounding in Office apps

Microsoft 365 Copilot first receives a prompt from the user in one of the various Microsoft 365 applications – Word, Excel, PowerPoint, Outlook, Teams and so on. These prompts can include references to files from other Microsoft 365 apps.

A prompt can be as simple as asking Copilot to transform data from a Word document into a PowerPoint presentation.

Microsoft Graph grounding

This image shows the path a user query takes; it is published in the detailed “Grounding LLMs” article by intellectronica on Microsoft’s Tech Community.

As mentioned above, upon receiving the prompt, Copilot “grounds” itself with Microsoft Graph. Grounding is a process through which Copilot assigns extra context and content to the prompt, which will make the response more relevant to what the user requires.

Context can be acquired from anything within Graph; this can include emails, files, meetings, chats and calendars. This is part of what makes Copilot so useful for businesses: anyone within the organisation can pull upon the shared data within the Graph for context and content to build the most relevant response possible.

Copilot grounds itself against Graph on two separate occasions. The first pass happens during the pre-processing stage, to make sure that the LLM will supply the user with the most relevant response to the given prompt.

The second pass happens during the post-processing stage, to ‘double-check’ that the response is relevant and that the commands issued to other Microsoft 365 apps are correct. Grounding greatly decreases the chances of hallucination, as the added context from Microsoft Graph should keep the GPT model from presenting the user with fabricated information.
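Copilot’s own retrieval mechanics are not public, but Microsoft Graph does expose a Search API that gives a feel for how context like this can be pulled programmatically – and its results are security-trimmed to what the signed-in user is allowed to see. A sketch, assuming you already hold a delegated access token:

```python
# Sketch: querying the Microsoft Search API in Microsoft Graph for files
# relevant to a prompt. Results are automatically security-trimmed to the
# signed-in user's permissions. ACCESS_TOKEN is a placeholder - in real
# code you would obtain a delegated token via MSAL.
import requests

ACCESS_TOKEN = "<delegated-access-token>"

payload = {
    "requests": [
        {
            "entityTypes": ["driveItem"],        # files in OneDrive/SharePoint
            "query": {"queryString": "Q3 sales review"},
            "size": 5,                           # top hits to use as context
        }
    ]
}
resp = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)
for hit in resp.json()["value"][0]["hitsContainers"][0]["hits"]:
    print(hit["summary"])  # snippet that could be folded into the prompt
```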

Microsoft LLM behind Bing Chat and Copilot

Instead of using the readily available GPT-3.5 model for Copilot’s LLM, Microsoft uses GPT-4. Much of the information surrounding GPT-4’s exact capabilities is undisclosed, but we do know that it is more complex than GPT-3.5 and can handle additional kinds of input, including images.

Microsoft is using GPT-4 to provide the most natural and relevant responses to the user within both Bing Chat and Microsoft 365 Copilot.

Is Microsoft Copilot AI secure?

Within organisations, Copilot can only pull data that individual users have at least view permissions for. As such, a key aspect of getting your business ready for Copilot is ensuring that the correct permissions models are in place before deploying any part of it.

Because Copilot interrogates content held in OneDrive and SharePoint, where businesses often host their information, the rights assigned to users and groups need to be correct. Without any controls, Copilot will let users from every part of the business surface content they should not have access to.

When a user submits a prompt to Copilot, the prompt, the data it retrieves and the generated responses all remain within the Microsoft 365 compliance boundary. The data is held within your organisation’s Microsoft 365 tenant. Each tenant has its own Copilot orchestrator instance, which relies upon Microsoft Search to collate the information.

Microsoft 365 Copilot ensures data security and privacy by adhering to your organisation’s existing obligations and integrating with your organisational policies. It uses Microsoft Graph content with the same access controls that already apply to your Microsoft 365 data.
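One practical way to prepare is to audit who can already see sensitive files, since Copilot inherits exactly those permissions. A sketch using the DriveItem permissions endpoint in Microsoft Graph; the token, drive ID and item ID are placeholders:

```python
# Sketch: listing the permissions on a document before a Copilot rollout,
# via Microsoft Graph's DriveItem permissions endpoint. ACCESS_TOKEN,
# DRIVE_ID and ITEM_ID are placeholders.
import requests

ACCESS_TOKEN = "<token-with-Files.Read.All>"
DRIVE_ID, ITEM_ID = "<drive-id>", "<item-id>"

resp = requests.get(
    f"https://graph.microsoft.com/v1.0/drives/{DRIVE_ID}/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
for perm in resp.json()["value"]:
    # Each entry lists the roles granted (e.g. read, write) and the user,
    # group or sharing link that holds them.
    holder = perm.get("grantedToV2") or perm.get("link")
    print(perm["roles"], holder)
```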

Is Copilot AI good?

There’s no disputing the number of uses that AI can be put to, as this summary of Bing’s talents shows – how good the results are is a different matter.

The content that generative AI produces isn’t always 100% factual. There are hallucinations, as previously pointed out. Microsoft will continue to improve the responses, but it has also coined the term “usefully wrong”. Perhaps “useful after extensive editing” would be a better term.

Microsoft is working to ensure there are algorithms in place to reduce misinformation and disinformation, protect data safety and block the promotion of harmful or discriminatory content.

The responses Copilot provides are only as good as the information it has access to – and that access should rightly be restricted. For example, helpdesk personnel shouldn’t have access to executive decision-making content.

Prerequisites for Microsoft 365 Copilot

Before your business can roll out Microsoft 365 Copilot, the following must be configured:

  • Microsoft Entra ID (formerly Azure Active Directory) – users need an Entra ID account to use Copilot.
  • Microsoft 365 apps – for enterprise organisations, these must be deployed to the end users. Copilot works seamlessly with most applications.
  • OneDrive or SharePoint – provides the ability for users to share and save files.
  • New Outlook for Windows – the new flavour of Outlook will be required, although it is currently in preview.
  • Microsoft Teams – either the desktop application or the web client.
  • Microsoft Loop (optional) – if your organisation is serious about collaboration, Copilot can be configured to work with Loop. This requires the relevant application policies to be set for your organisation.
  • Licences – these will be introduced when the product becomes generally available and can be found in the Microsoft 365 admin centre under Billing > Licenses. Licences can also be deployed in bulk via the Azure admin centre; a hypothetical bulk-assignment sketch follows this list.
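For the bulk option, Microsoft Graph exposes an assignLicense action that admin scripts can call. The sketch below assumes a suitably privileged token and a placeholder SKU GUID, which you would look up under Billing > Licenses:

```python
# Sketch: assigning a Copilot licence to several users in bulk via the
# assignLicense action in Microsoft Graph. ACCESS_TOKEN and COPILOT_SKU_ID
# are placeholders; find your tenant's real SKU GUID under Billing > Licenses.
import requests

ACCESS_TOKEN = "<token-with-User.ReadWrite.All>"
COPILOT_SKU_ID = "<copilot-sku-guid>"
users = ["alice@contoso.com", "bob@contoso.com"]

for upn in users:
    resp = requests.post(
        f"https://graph.microsoft.com/v1.0/users/{upn}/assignLicense",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"addLicenses": [{"skuId": COPILOT_SKU_ID}], "removeLicenses": []},
    )
    print(upn, resp.status_code)
```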
Jason Wynn

Microsoft is hard-wired into Jason’s DNA. He is a Microsoft MVP and Microsoft Certified Trainer, with nearly two decades of experience across everything from Skype for Business to Microsoft Teams to Office 365. His day job is Presales Specialist at Carillion.
