What is considered an extension to Copilot?
At Microsoft, the term Copilot refers to a category of large language model-based, generative AI interactions. A Copilot empowers users to work smarter, be more productive and creative, and stay more connected to the people and things around them.
This article explains the essential characteristics of a Copilot and what your offering must include to be considered a Copilot in Business Central. By using the developer tools for Copilot, you can meet the necessary criteria.
LLM-based
A Copilot experience is an interactive agent built on a large language model (LLM) or another state-of-the-art foundation model. This means that Copilot can use advanced natural language processing both to understand user input and to generate responses that are tailored to the user's specific needs. Copilot capabilities might also call on other machine learning models to get tasks done.
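For illustration, the following AL sketch shows what such an LLM-based capability might look like when it calls a model through the Azure OpenAI module in the System Application. It's a minimal sketch, not a definitive implementation: the codeunit name, prompt text, and the endpoint, deployment, and API key placeholders are assumptions, and the exact overloads the module exposes can vary between versions.

```al
codeunit 50100 "Item Description Suggestion"
{
    // Minimal sketch of an LLM-based Copilot capability, assuming the
    // Azure OpenAI module from the System Application. The codeunit name,
    // placeholder values, and prompts are illustrative only.
    procedure GenerateDescription(ItemName: Text): Text
    var
        AzureOpenAI: Codeunit "Azure OpenAI";
        AOAIChatMessages: Codeunit "AOAI Chat Messages";
        AOAIOperationResponse: Codeunit "AOAI Operation Response";
    begin
        // Authorize against your own Azure OpenAI resource (placeholder values).
        AzureOpenAI.SetAuthorization(
            Enum::"AOAI Model Type"::"Chat Completions",
            '<endpoint>', '<deployment>', '<api key>');

        // The system message frames the task; the user message carries the user's input.
        AOAIChatMessages.AddSystemMessage('You write short, factual item descriptions.');
        AOAIChatMessages.AddUserMessage(ItemName);

        // Call the model and return the generated text only if the call succeeded.
        AzureOpenAI.GenerateChatCompletion(AOAIChatMessages, AOAIOperationResponse);
        if AOAIOperationResponse.IsSuccess() then
            exit(AOAIChatMessages.GetLastMessage());

        exit('');
    end;
}
```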
Responds to natural language to empower users to achieve tasks
One of the key features of a Copilot is its ability to respond to natural language input from users. Copilot should help users accomplish complex tasks that would otherwise demand substantial cognitive effort, creativity, and time. For example, Copilot can assist users with the following tasks:
- Simplify or solve complex tasks
- Prioritize and plan tasks
- Support communication and relationships with people
- Get better, more in-depth, or more nuanced insights
- Get quicker insights for immediate tasks
- Understand relationships, patterns, and trends
Manages generative AI risks through a human-in-the-loop pattern
While Copilot can be a powerful tool for productivity and creativity, it's important to manage the risks associated with generative AI. AI might suggest actions with undesirable or harmful side effects because of inadequate or erroneous input, incorrect interpretation of input, or flawed inference. AI might also make incorrect, offensive, or inappropriate statements and claims.
Copilot must mitigate these risks by incorporating human oversight, also known as human-in-the-loop (HITL), into the experience. To reduce the risk of undesirable or harmful side effects, Copilot supports the following, as shown in the sketch after this list:
- A preview of AI-generated results before Copilot takes any action on the user's behalf.
- A clear indication that generated information is AI-generated, optionally with a confidence level, so that users can make informed decisions.
- The option to completely discard or undo the results.
- A way for users to provide feedback about the quality of the results.
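In Business Central, the PromptDialog page type is the developer tool designed for this pattern: the user enters a natural-language request, previews the AI-generated result, and can regenerate, keep, or discard it before anything is saved. The sketch below is a minimal illustration; the page name, fields, and the placeholder Generate logic are assumptions, and in a real capability the Generate trigger would call an LLM-based codeunit such as the one sketched earlier.

```al
page 50101 "Item Description Proposal"
{
    // Minimal human-in-the-loop sketch using the PromptDialog page type.
    // The page name, fields, and placeholder logic are illustrative assumptions.
    PageType = PromptDialog;
    Caption = 'Draft an item description with Copilot';

    layout
    {
        area(Prompt)
        {
            // Natural-language input from the user.
            field(UserPrompt; UserPromptTxt)
            {
                ApplicationArea = All;
                Caption = 'Describe the item';
                MultiLine = true;
            }
        }
        area(Content)
        {
            // Preview of the AI-generated result before anything is saved.
            field(Proposal; ProposalTxt)
            {
                ApplicationArea = All;
                Caption = 'AI-generated description';
                MultiLine = true;
                Editable = false;
            }
        }
    }

    actions
    {
        area(SystemActions)
        {
            systemaction(Generate)
            {
                Caption = 'Generate';
                trigger OnAction()
                begin
                    // Call an LLM-based codeunit here and show its output in the preview.
                    ProposalTxt := '<AI-generated text goes here>';
                end;
            }
            systemaction(Regenerate)
            {
                Caption = 'Regenerate';
                trigger OnAction()
                begin
                    // Let the user ask for a new proposal instead of accepting the first one.
                    ProposalTxt := '<AI-generated text goes here>';
                end;
            }
            systemaction(OK)
            {
                // Only when the user confirms is the result applied on their behalf.
                Caption = 'Keep it';
            }
            systemaction(Cancel)
            {
                // Discards the proposal; nothing is written on the user's behalf.
                Caption = 'Discard';
            }
        }
    }

    var
        UserPromptTxt: Text;
        ProposalTxt: Text;
}
```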