- Microsoft 365 developer tenant with custom apps enabled
- Visual Studio 2022 with Teams Toolkit (Microsoft Teams development tools) workload installed
- Azure subscription
- Azure OpenAI service with a deployed base model configured with the File Upload data source (On Your Data). Use documents as the name of your index.
- Use these documents in your file upload
- Azure AI Content Safety service
Custom engine agents are chatbots for Microsoft Teams powered by generative AI, designed to provide sophisticated conversational experiences. Custom engine agents are built using the Teams AI library, which provides comprehensive AI functionalities, including managing prompts, actions, and model integration as well as extensive options for UI customization. This ensures that your chatbots leverage the full range of AI capabilities while delivering a seamless and engaging experience aligned with Microsoft platforms.
Here, you create a custom engine agent that uses a language model hosted in Azure to answer questions using natural language:
- Create: Create a custom engine agent project using Teams Toolkit in Visual Studio.
- Provision: Upload your custom engine agent to Microsoft Teams and validate the results.
- Prompt template: Determine the agent behaviour.
- Suggested prompts: Define prompts for starting new conversations.
- Message handlers: Run logic when receiving a specific keyword or phrase.
- Chat with your data: Integrate with Azure AI Search to implement Retrieval Augmented Generation (RAG).
- Feedback: Collect feedback from end users.
- Customize responses: Customize the agent response.
- Sensitive information: Display sensitivity guidance to end users.
- Content moderation: Integrate Azure Content Safety service to detect harmful user-generated and AI-generated content.
Start by opening the starter project in Visual Studio 2022.
- Clone this repository to your machine.
- In the LAB-466-BEGIN folder, open Custom.Engine.Agent.sln to launch Visual Studio.
The solution contains two projects:
- Custom.Engine.Agent: An ASP.NET Core Web API project which contains your agent code. The agent logic and generative AI capabilities are implemented using the Teams AI library.
- TeamsApp: A Teams Toolkit project which contains the app package files, environment, workflow, and infrastructure files. You will use this project to provision the required resources for your agent.
Dev tunnels allow developers to securely share local web services across the internet. When users interact with the agent in Microsoft Teams, the Teams platform sends and receives messages (called Activities) to and from your agent code via the Bot Framework. As the code is running on your local machine, the Dev Tunnel exposes the localhost domain which your web API runs on as a publicly accessible URL.
Continue in Visual Studio:
- Open the View menu, expand Other windows, and select Dev Tunnels.
- In the Dev Tunnels pane, select the plus (+) icon.
- In the dialog window, create the tunnel using the following settings:
- Account: Expand the dropdown and select Sign in. Select Work or school account and then select OK. Use your Microsoft 365 account details to sign in. In the Stay signed in to all your apps dialog, select No, sign in to this app only.
- Name: custom-engine-agent
- Tunnel type: Temporary
- Access: Public
- To create the tunnel, select OK.
- In the confirmation prompt, select OK.
- Close the Dev Tunnels window.
To save time we have already provisioned a language model in Azure for you to use in this lab. Teams Toolkit uses environment (.env) files to store values centrally that can be used across your application.
Continue in Visual Studio:
- In the TeamsApp project, expand the env folder.
- Open the .env.local file.
- Update the AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_DEPLOYMENT_NAME environment variable values with the values from your model deployment.
- Save your changes.
- Rename .env.local.user.sample to .env.local.user.
- Open the .env.local.user file.
- Update the contents of the file, replacing [INSERT KEY] with the value from your model deployment:
SECRET_AZURE_OPENAI_API_KEY=[INSERT KEY]
- Save the changes.
Note
When Teams Toolkit uses an environment variable that is prefixed with SECRET, it ensures that the value does not appear in any logs.
Teams Toolkit helps developers automate tasks using workflow files. The workflow files are YAML files stored in the root of the TeamsApp project.
Continue in Visual Studio:
- In TeamsApp project, open teamsapp.local.yml.
- Examine the contents of the file.
The file contains a single stage called Provision which contains several tasks.
- teamsApp/create: Registers an app in the Teams Developer Portal and writes the app ID to env\.env.local.
- aadApp/create: Registers an app in Microsoft Entra and writes several values to env\.env.local.
- aadApp/update: Applies an app manifest to the Microsoft Entra app registration.
- arm/deploy: Provisions the Azure Bot Service using Bicep and writes several values back to env\.env.local.
- file/createOrUpdateJsonFile: Updates the appsettings.Development.json file with environment variables which can be used by code at runtime.
- teamsApp/validateManifest: Validates the app manifest file.
- teamsApp/zipAppPackage: Creates the Teams app package.
- teamsApp/validateAppPackage: Validates the app package.
- teamsApp/update: Updates the app registration in the Teams Developer Portal.
- Right-click TeamsApp project.
- Expand the Teams Toolkit menu and select Prepare Teams App Dependencies.
- In the Microsoft 365 account dialog, select the account you used to create the Dev Tunnel earlier and select Continue. This will start the Dev Tunnel and write the tunnel endpoint and domain to the env\.env.local file.
- In the Provision dialog, configure the resource group to be used to host the Azure Bot Service:
- Subscription: Expand the dropdown and select your subscription in the list.
- Resource group: Select New..., enter rg-custom-engine-agent-local in the text field, and then select OK.
- Region: Expand the dropdown and select a region in the list.
- Select Provision.
- In the warning prompt, select Provision.
- Wait for the process to complete; this can take 1-3 minutes. Teams Toolkit outputs its progress in the Output pane.
- In the Info prompt, select View provisioned resources to open a browser.
Take a minute to examine the Azure Bot Service resource in the Azure Portal.
With everything in place, we are now ready to test our custom engine agent in Microsoft Teams for the first time.
First, we need to start a debug session to start our local web API that contains the agent logic.
Continue in Visual Studio:
- To start a debug session, press F5 on your keyboard, or select the Start button in the toolbar. A browser window is launched and navigates to Microsoft Teams.
- In the browser, sign in to Microsoft 365 using your Microsoft 365 account details.
Important
The first time you debug Teams, the app install dialog will not appear; a Welcome to Teams dialog is shown instead. To install the app for the first time, you will need to stop and create a new debug session.
- Close the browser to stop the debug session.
- To start a debug session, press F5 on your keyboard, or select the Start button in the toolbar. A browser window is launched and navigates to Microsoft Teams.
- Wait for Microsoft Teams to load and for the App install dialog to appear.
Previously, Teams Toolkit registered the app in the Teams Developer Portal. To use the app we need to install it for the current user. Teams Toolkit launches the browser using a special URL which enables developers to install the app before they test it.
Note
If any changes are made to the app manifest file, developers will need to run the Prepare Teams App Dependencies process again and install the app for the changes to be reflected in Microsoft Teams.
Continuing in the web browser:
- In the App install dialog, select Add.
- In the App install confirmation dialog, select Open. The custom engine agent is displayed in Microsoft Teams.
Now let's test that everything is working as expected.
Continuing in the web browser:
- Enter Hello, world! in the message box and press Enter to send the message to the agent. A typing indicator appears whilst waiting for the agent to respond.
- Notice the natural language response from the agent and the AI generated label shown in the agent response.
- Continue a conversation with the agent.
- Go back to Visual Studio. Notice that in the Debug pane, Teams AI library is tracking the full conversation and displays appended conversation history in the output.
- Close the browser to stop the debug session.
The functionality of our agent is implemented using Teams AI library. Let's take a look at how our agent is configured.
In Visual Studio:
- In the Custom.Engine.Agent project, open Program.cs file.
- Examine the contents of the file.
The file sets up the web application and integrates it with Microsoft Bot Framework and services.
- WebApplicationBuilder: Initializes web application with controllers and HTTP client services.
- Configuration: Retrieves configuration options from the app's configuration and sets up Bot Framework authentication.
- Dependency injection: Registers BotFrameworkAuthentication and TeamsAdapter services. Configures Azure Blob Storage for persisting agent state and sets up an Azure OpenAI model service.
- Agent setup: Registers the agent as a transient service. The agent logic is implemented using Teams AI library.
Let's take a look at the agent setup.
builder.Services.AddTransient<IBot>(sp =>
{
// Create loggers
ILoggerFactory loggerFactory = sp.GetService<ILoggerFactory>();
// Create Prompt Manager
PromptManager prompts = new(new()
{
PromptFolder = "./Prompts"
});
// Create ActionPlanner
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
Application<TurnState> app = new ApplicationBuilder<TurnState>()
.WithAIOptions(new(planner))
.WithStorage(sp.GetService<IStorage>())
.Build();
return app;
});
The key elements of the agent setup are:
- ILoggerFactory: Used for logging messages to the output pane for debugging.
- PromptManager: Determines the location of prompt templates.
- ActionPlanner: Determines which model and prompt should be used when handling a user message. By default, the planner uses a prompt template named Chat.
- ApplicationBuilder: Creates an object which represents a Bot that can handle incoming activities.
The agent is added as a transient service, which means that every time a message is received from the Bot Framework, our agent code will be executed.
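The practical effect of a transient lifetime can be seen in a minimal sketch using Microsoft.Extensions.DependencyInjection (a simplified stand-in for the IBot registration, not the lab code):

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;

class Agent { public Guid Id { get; } = Guid.NewGuid(); }

class TransientDemo
{
    // Build a container with a transient registration and resolve the
    // service twice, exactly as the Bot Framework adapter would per message.
    public static (Agent First, Agent Second) ResolveTwice()
    {
        var provider = new ServiceCollection()
            .AddTransient<Agent>()
            .BuildServiceProvider();
        return (provider.GetRequiredService<Agent>(), provider.GetRequiredService<Agent>());
    }

    static void Main()
    {
        var (first, second) = ResolveTwice();
        // A transient service is constructed on every request,
        // so the two resolved instances are distinct.
        Console.WriteLine(first.Id != second.Id); // True
    }
}
```

Compare with a singleton registration, where both resolutions would return the same instance and agent state would leak across conversations.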
Prompts play a crucial role in communicating and directing the behavior of language models.
Prompts are stored in the Prompts folder. A prompt is defined as a subfolder that contains two files:
- config.json: The prompt configuration. This enables you to control parameters such as temperature, max tokens etc. that are passed to the language model.
- skprompt.txt: The prompt text template. This text determines the behaviour of the agent.
Here, you'll update the default prompt to change the agent's behaviour.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, expand the Prompts folder.
- In the Chat folder, open the skprompt.txt file.
- Update the contents of the file:
You are a career specialist named "Career Genie" that helps the Human Resources team write job posts. You are friendly and professional. You always greet users with excitement and introduce yourself first. You like using emojis where appropriate.
- Save your changes.
Now let's test our change.
- To start a debug session, press F5 on your keyboard, or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter Hi and send the message. Wait for the response. Notice the change in the response.
- In the message box, enter Can you help me write a job post for a Senior Developer role? and send the message. Wait for the response.
Continue the conversation by sending more messages.
- What would be the list of required skills for a Project Manager role?
- Can you share a job template?
Close the browser to stop the debug session.
Suggested prompts are shown in the user interface and are a good way for users to discover how the agent can help them through examples.
Here, you'll define two suggested prompts.
Continuing in Visual Studio:
- In the TeamsApp project, expand the appPackage folder.
- In the appPackage folder, open the manifest.json file.
- In the bots array property, extend the first object with a commandLists array property:
"commandLists": [
  {
    "scopes": [ "personal" ],
    "commands": [
      {
        "title": "Write a job post for <role>",
        "description": "Generate a job posting for a specific role"
      },
      {
        "title": "Skill required for <role>",
        "description": "Identify skills required for a specific role"
      }
    ]
  }
]
- Save your changes.
The bots array property should look like:
"bots": [
{
"botId": "${{BOT_ID}}",
"scopes": [
"personal"
],
"supportsFiles": false,
"isNotificationOnly": false,
"commandLists": [
{
"scopes": [
"personal"
],
"commands": [
{
"title": "Write a job post for <role>",
"description": "Generate a job posting for a specific role"
},
{
"title": "Skill required for <role>",
"description": "Identify skills required for a specific role"
}
]
}
]
}
],
As you've made a change to the app manifest file, we need to run the Prepare Teams App Dependencies process to update the app registration in the Teams Developer Portal before starting a debug session to test it.
Continuing in Visual Studio:
- Right-click TeamsApp project, expand the Teams Toolkit menu and select Prepare Teams App Dependencies.
- Confirm the prompts and wait till the process completes.
Now let's test the change.
- Start a debug session, press F5 on your keyboard, or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- Above the message box, select View prompts to open the prompt suggestions flyout.
- In the Prompts dialog, select one of the prompts. The text is added into the message box.
- In the message box, replace <role> with a job title, for example, Senior Software Engineer, and send the message.
The prompt suggestions can also be seen when the user opens the agent for the first time.
Continuing in the web browser:
- In the Microsoft Teams side bar, go to Chat.
- Find the chat with the name Custom engine agent in the list and select the ... menu.
- Select Delete and confirm the action.
- In the Microsoft Teams side bar, select ... to open the apps flyout.
- Select Custom engine agent to start a new chat. The two suggested prompts are shown in the user interface.
Close the browser to stop the debug session.
Suppose you want to run some logic when a message containing a specific phrase or keyword is sent to the agent; a message handler allows you to do just that.
Up to this point, every time you send and receive a message, the contents of the messages are saved in the agent state. During development, the agent state is stored in an emulated Azure Storage account hosted on your machine. You can inspect the agent state using Azure Storage Explorer.
Note
Message handlers are processed before the ActionPlanner and so take priority for handling the response.
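The dispatch order can be sketched as a simple routing function (illustrative logic only, not the Teams AI library implementation):

```csharp
using System;

class MessageRouter
{
    // Sketch of the dispatch order: registered message handlers are checked
    // first, and only unmatched messages fall through to the ActionPlanner.
    public static string Route(string messageText) =>
        messageText.Trim() == "/new"
            ? "handler:NewChat"
            : "planner:Chat";

    static void Main()
    {
        Console.WriteLine(Route("/new"));          // handler:NewChat
        Console.WriteLine(Route("Hello, world!")); // planner:Chat
    }
}
```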
Here, you'll create a message handler that will clear the conversation history stored in the agent state when a message that contains /new is sent, and respond with a fixed message.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, create a file called MessageHandlers.cs with the following contents:
using Microsoft.Bot.Builder;
using Microsoft.Teams.AI.State;

namespace Custom.Engine.Agent;

internal class MessageHandlers
{
    internal static async Task NewChat(ITurnContext turnContext, TurnState turnState, CancellationToken cancellationToken)
    {
        turnState.DeleteConversationState();
        await turnContext.SendActivityAsync("Conversation history has been cleared and a new conversation has been started.", cancellationToken: cancellationToken);
    }
}
- Save your changes.
- Open Program.cs and, in the agent code, add the following line after the app declaration:
app.OnMessage("/new", MessageHandlers.NewChat);
- Save your changes.
The agent code should look like:
builder.Services.AddTransient<IBot>(sp =>
{
// Create loggers
ILoggerFactory loggerFactory = sp.GetService<ILoggerFactory>();
// Create Prompt Manager
PromptManager prompts = new(new()
{
PromptFolder = "./Prompts"
});
// Create ActionPlanner
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
Application<TurnState> app = new ApplicationBuilder<TurnState>()
.WithAIOptions(new(planner))
.WithStorage(sp.GetService<IStorage>())
.Build();
app.OnMessage("/new", MessageHandlers.NewChat);
return app;
});
Now let's test the change.
Tip
Your debug session from the previous section should still be running; if not, start a new debug session.
- In the message box, enter /new and send the message. Notice that the message in the response is not from the language model but from the message handler.
Close the browser to stop the debug session.
Retrieval Augmented Generation (RAG) is a technique used to improve the accuracy and relevance of responses generated by language models. Suppose you have a collection of documents that you want the language model to reason over and use in its responses. RAG enables you to provide extra knowledge and context beyond the data that the language model was trained on.
Azure OpenAI On Your Data enables you to run language models on your own enterprise data without needing to train or fine-tune models. You can specify sources to support the responses based on the latest information available in your designated data sources.
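Conceptually, the retrieval step ranks indexed documents by similarity to the user's question and injects the best matches into the prompt as extra context. Here is a toy sketch of that ranking idea using naive word overlap (purely illustrative; Azure AI Search compares vector embeddings instead):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ToyRetriever
{
    // Score documents by naive word overlap with the query; real RAG
    // pipelines compare embedding vectors rather than raw words.
    public static string RetrieveBest(string query, IEnumerable<string> documents)
    {
        var queryWords = query.ToLowerInvariant().Split(' ').ToHashSet();
        return documents
            .OrderByDescending(d => d.ToLowerInvariant().Split(' ').Count(queryWords.Contains))
            .First();
    }

    static void Main()
    {
        var docs = new[]
        {
            "Resume: 5 years .net experience, fluent spanish",
            "Resume: graphic designer, 3 years experience",
        };
        // The retrieved text would then be prepended to the model prompt
        // as grounding context before the completion is generated.
        Console.WriteLine(RetrieveBest("spanish speaking .net developer", docs));
    }
}
```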
Here, you'll implement RAG using Azure OpenAI On Your Data to enable the language model to reason over resumes and provide candidate recommendations.
Like we did with the language model, we've already provisioned and configured the services in Azure for you to use.
We provisioned and configured the following resources:
- Azure Storage Account: Stores files uploaded through the Azure OpenAI On Your Data file upload feature.
- Embeddings model: Generates numerical representations (embeddings) of document contents for use with language models during the file upload process.
- Azure AI Search: Hosts the search index of our documents. Contains the document embeddings and additional metadata, such as file paths and timestamps.
First, let's create some environment variables to store details that we will need to integrate Azure AI Search.
Continuing in Visual Studio:
- In the TeamsApp project, expand the env folder.
- Open the .env.local file and add the following environment variables, replacing [INSERT ENDPOINT] with the URL of your Azure AI Search service:
AZURE_SEARCH_ENDPOINT=[INSERT ENDPOINT]
AZURE_SEARCH_INDEX_NAME=documents
- Save your changes.
- Open the .env.local.user file and add a new environment variable, replacing [INSERT KEY] with the key of your Azure AI Search service:
SECRET_AZURE_SEARCH_KEY=[INSERT KEY]
- Save your changes.
Next, let's make sure that these values are written to the appsettings.Development.json file so we can access them at runtime in our agent code.
- In the TeamsApp project, open the teamsapp.local.yml file.
- Update the file/createOrUpdateJsonFile action with the new environment variables:
AZURE_SEARCH_ENDPOINT: ${{AZURE_SEARCH_ENDPOINT}}
AZURE_SEARCH_INDEX_NAME: ${{AZURE_SEARCH_INDEX_NAME}}
AZURE_SEARCH_KEY: ${{SECRET_AZURE_SEARCH_KEY}}
Important
YAML is strict about indentation. After pasting, make sure the YAML is indented using spaces, not tabs, to clear any errors in the editor.
- Save your changes.
The file/createOrUpdateJsonFile action should look like:
- uses: file/createOrUpdateJsonFile
with:
target: ../Custom.Engine.Agent/appsettings.Development.json
content:
BOT_ID: ${{BOT_ID}}
BOT_PASSWORD: ${{SECRET_BOT_PASSWORD}}
AZURE_OPENAI_DEPLOYMENT_NAME: ${{AZURE_OPENAI_DEPLOYMENT_NAME}}
AZURE_OPENAI_KEY: ${{SECRET_AZURE_OPENAI_API_KEY}}
AZURE_OPENAI_ENDPOINT: ${{AZURE_OPENAI_ENDPOINT}}
AZURE_STORAGE_CONNECTION_STRING: UseDevelopmentStorage=true
AZURE_STORAGE_BLOB_CONTAINER_NAME: state
AZURE_SEARCH_ENDPOINT: ${{AZURE_SEARCH_ENDPOINT}}
AZURE_SEARCH_INDEX_NAME: ${{AZURE_SEARCH_INDEX_NAME}}
AZURE_SEARCH_KEY: ${{SECRET_AZURE_SEARCH_KEY}}
Now, extend the configuration model so we can easily access the new environment variable values in code.
- Open Config.cs and update the ConfigOptions class, adding the following properties:
public string AZURE_SEARCH_ENDPOINT { get; set; }
public string AZURE_SEARCH_INDEX_NAME { get; set; }
public string AZURE_SEARCH_KEY { get; set; }
- Save your changes.
The ConfigOptions class should look like:
public class ConfigOptions
{
public string BOT_ID { get; set; }
public string BOT_PASSWORD { get; set; }
public string AZURE_OPENAI_KEY { get; set; }
public string AZURE_OPENAI_ENDPOINT { get; set; }
public string AZURE_OPENAI_DEPLOYMENT_NAME { get; set; }
public string AZURE_STORAGE_CONNECTION_STRING { get; set; }
public string AZURE_STORAGE_BLOB_CONTAINER_NAME { get; set; }
public string AZURE_SEARCH_ENDPOINT { get; set; }
public string AZURE_SEARCH_INDEX_NAME { get; set; }
public string AZURE_SEARCH_KEY { get; set; }
}
Update the prompt template configuration file to integrate the Azure OpenAI On Your Data data source.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, expand the Prompts folder.
- Expand the Chat folder and open the config.json file. Replace the file contents with the following code:
{
  "schema": 1.1,
  "description": "Custom engine agent",
  "type": "completion",
  "completion": {
    "model": "gpt-4",
    "completion_type": "chat",
    "include_history": true,
    "include_input": true,
    "max_input_tokens": 100,
    "max_tokens": 1000,
    "temperature": 0.1,
    "top_p": 1,
    "presence_penalty": 0,
    "frequency_penalty": 0,
    "data_sources": [
      {
        "type": "azure_search",
        "parameters": {
          "endpoint": "$azure-search-endpoint$",
          "index_name": "$azure-search-index-name$",
          "authentication": {
            "type": "api_key",
            "key": "$azure-search-key$"
          }
        }
      }
    ]
  }
}
- Save your changes.
Let's examine some of the key changes made to the configuration.
The properties temperature and top_p control the creativity, or randomness, of language model outputs. To take a simple view, language models work by selecting the next probable word (token).
- temperature: Has been lowered to 0.1. Instructs the model to be more deterministic, choosing only the most probable tokens in its reasoning.
- top_p: Has been increased to 1. Instructs the model to choose from the widest pool of words (tokens) possible in its reasoning.
These changes optimise the language model for use in scenarios where precision and unambiguity are critical. You should always consider adjusting these parameters to be appropriate for your use case.
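The effect of temperature can be illustrated with a temperature-scaled softmax over token scores (a standalone sketch of the underlying sampling math, not Teams AI library code; the logit values are made up):

```csharp
using System;
using System.Linq;

class TemperatureDemo
{
    // Softmax over logits divided by temperature: lower temperature
    // concentrates probability mass on the highest-scoring token.
    public static double[] Softmax(double[] logits, double temperature)
    {
        var scaled = logits.Select(l => Math.Exp(l / temperature)).ToArray();
        var sum = scaled.Sum();
        return scaled.Select(s => s / sum).ToArray();
    }

    static void Main()
    {
        double[] logits = { 2.0, 1.0, 0.5 };
        var creative = Softmax(logits, 1.0);  // flatter distribution
        var precise = Softmax(logits, 0.1);   // near-deterministic

        // At temperature 0.1, the top token takes almost all the probability.
        Console.WriteLine(precise[0] > 0.99);        // True
        Console.WriteLine(creative[0] < precise[0]); // True
    }
}
```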
Azure OpenAI On Your Data integration is defined in the data_sources array. Here, we define our configuration, providing the minimum information needed to retrieve documents from the Azure AI Search index in the model's reasoning process.
Notice that we use placeholders as values for some properties, for example $azure-search-endpoint$. Next, you'll replace these placeholders with real values at runtime.
Continuing in Visual Studio:
- Open the Program.cs file.
- Replace the contents of the defaultPrompt function to dynamically replace the placeholders in the prompt template configuration with the values we stored in our environment variable files earlier:
PromptTemplate template = prompts.GetPrompt("Chat");
var dataSources = template.Configuration.Completion.AdditionalData["data_sources"];
var dataSourcesString = JsonSerializer.Serialize(dataSources);
var replacements = new Dictionary<string, string>
{
    { "$azure-search-key$", config.AZURE_SEARCH_KEY },
    { "$azure-search-index-name$", config.AZURE_SEARCH_INDEX_NAME },
    { "$azure-search-endpoint$", config.AZURE_SEARCH_ENDPOINT },
};
foreach (var replacement in replacements)
{
    dataSourcesString = dataSourcesString.Replace(replacement.Key, replacement.Value);
}
dataSources = JsonSerializer.Deserialize<JsonElement>(dataSourcesString);
template.Configuration.Completion.AdditionalData["data_sources"] = dataSources;
return await Task.FromResult(template);
- Save your changes.
The ActionPlanner object should look like:
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
var dataSources = template.Configuration.Completion.AdditionalData["data_sources"];
var dataSourcesString = JsonSerializer.Serialize(dataSources);
var replacements = new Dictionary<string, string>
{
{ "$azure-search-key$", config.AZURE_SEARCH_KEY },
{ "$azure-search-index-name$", config.AZURE_SEARCH_INDEX_NAME },
{ "$azure-search-endpoint$", config.AZURE_SEARCH_ENDPOINT },
};
foreach (var replacement in replacements)
{
dataSourcesString = dataSourcesString.Replace(replacement.Key, replacement.Value);
}
dataSources = JsonSerializer.Deserialize<JsonElement>(dataSourcesString);
template.Configuration.Completion.AdditionalData["data_sources"] = dataSources;
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
The defaultPrompt anonymous function provides a way to dynamically alter the behaviour of our agent. This is where you can include logic to choose different prompt templates for different functions or behaviours that you want the agent to provide. Suppose you want to dynamically adjust the temperature or choose a different prompt template based on a specific input; here is where you would add the logic to make those changes on the fly.
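For instance, the selection logic could branch on the incoming message text. A hedged sketch of such a rule as a pure function (the "Summarize" prompt folder is a hypothetical example, not part of this lab):

```csharp
using System;

class PromptSelector
{
    // Pure selection logic: given the user's message, decide which prompt
    // template folder the defaultPrompt factory should load.
    // "Summarize" is an assumed second prompt folder for illustration.
    public static string SelectPrompt(string messageText) =>
        (messageText ?? string.Empty)
            .Contains("summarize", StringComparison.OrdinalIgnoreCase)
            ? "Summarize"
            : "Chat";

    static void Main()
    {
        Console.WriteLine(SelectPrompt("Can you summarize this resume?")); // Summarize
        Console.WriteLine(SelectPrompt("Hi"));                             // Chat
    }
}
```

Inside defaultPrompt, the selected name would be passed to prompts.GetPrompt in place of the hard-coded "Chat".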
Now, update the prompt text to reflect the change in agent behaviour.
- In the Custom.Engine.Agent project, expand the Prompts folder, then expand the Chat folder.
- Open the skprompt.txt file, then replace the contents with the following:
You are a career specialist named "Career Genie" that helps the Human Resources team find the right candidate for jobs. You are friendly and professional. You always greet users with excitement and introduce yourself first. You like using emojis where appropriate. Always mention all citations in your content.
- Save your changes.
As we've made changes to the environment and workflow files, we need to run the Prepare Teams App Dependencies process so that the updated values are written to the appsettings.Development.json file before starting a debug session to test the changes.
Continuing in Visual Studio:
- Right-click TeamsApp project, expand the Teams Toolkit menu and select Prepare Teams App Dependencies.
- Confirm the prompts and wait till the process completes.
Now let's test the change.
- Start a debug session, press F5 on your keyboard, or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter Can you suggest a candidate who is suitable for a Spanish-speaking role that requires at least 2 years of .NET experience? and send the message. Wait for the response.
Note that the response contains a reference to a document. The document was used by the language model in its reasoning when generating the answer and is provided as part of the answer. Hover over the reference in the response to view more information about the document.
Try out some more prompts and review the outputs, for example:
- Who would be suitable for a position that requires 5+ years of Python development experience?
Close the browser to stop the debug session.
Feedback is a crucial way to understand the quality of the responses produced by your agent once you put it in the hands of your end users. Using the Feedback Loop feature in the Teams AI library, you can enable controls to collect positive and negative feedback from end users in the response.
Here, you'll create a feedback handler and register it with the application to capture user feedback.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, create a new folder with the name Models.
- In the Models folder, create a new file with the name Feedback.cs with the following contents:
using System.Text.Json.Serialization;

namespace Custom.Engine.Agent.Models;

internal class Feedback
{
    [JsonPropertyName("feedbackText")]
    public string FeedbackText { get; set; }
}
- In the Custom.Engine.Agent project, create a new file with the name FeedbackHandler.cs with the following contents:
using Custom.Engine.Agent.Models;
using Microsoft.Bot.Builder;
using Microsoft.Teams.AI.Application;
using Microsoft.Teams.AI.State;
using System.Text.Json;
namespace Custom.Engine.Agent;
internal class FeedbackHandler
{
internal static async Task OnFeedback(ITurnContext turnContext, TurnState turnState, FeedbackLoopData feedbackLoopData, CancellationToken cancellationToken)
{
var reaction = feedbackLoopData.ActionValue.Reaction;
var feedback = JsonSerializer.Deserialize<Feedback>(feedbackLoopData.ActionValue.Feedback).FeedbackText;
await turnContext.SendActivityAsync($"Thank you for your feedback!", cancellationToken: cancellationToken);
await turnContext.SendActivityAsync($"Reaction: {reaction}", cancellationToken: cancellationToken);
await turnContext.SendActivityAsync($"Feedback: {feedback}", cancellationToken: cancellationToken);
}
}
Now, update the agent logic.
- In the Custom.Engine.Agent project, open the Program.cs file.
- In the agent code, create a new AIOptions object after the ActionPlanner object:
AIOptions<TurnState> options = new(planner) { EnableFeedbackLoop = true };
- Update the Application object, passing in the new options object:
Application<TurnState> app = new ApplicationBuilder<TurnState>()
    .WithAIOptions(options)
    .WithStorage(sp.GetService<IStorage>())
    .Build();
- After the message handler, register the Feedback Loop handler with the application:
app.OnFeedbackLoop(FeedbackHandler.OnFeedback);
- Save your changes.
Your agent code should look like the following:
builder.Services.AddTransient<IBot>(sp =>
{
// Create loggers
ILoggerFactory loggerFactory = sp.GetService<ILoggerFactory>();
// Create Prompt Manager
PromptManager prompts = new(new()
{
PromptFolder = "./Prompts"
});
// Create ActionPlanner
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
var dataSources = template.Configuration.Completion.AdditionalData["data_sources"];
var dataSourcesString = JsonSerializer.Serialize(dataSources);
var replacements = new Dictionary<string, string>
{
{ "$azure-search-key$", config.AZURE_SEARCH_KEY },
{ "$azure-search-index-name$", config.AZURE_SEARCH_INDEX_NAME },
{ "$azure-search-endpoint$", config.AZURE_SEARCH_ENDPOINT },
};
foreach (var replacement in replacements)
{
dataSourcesString = dataSourcesString.Replace(replacement.Key, replacement.Value);
}
dataSources = JsonSerializer.Deserialize<JsonElement>(dataSourcesString);
template.Configuration.Completion.AdditionalData["data_sources"] = dataSources;
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
AIOptions<TurnState> options = new(planner)
{
EnableFeedbackLoop = true
};
Application<TurnState> app = new ApplicationBuilder<TurnState>()
.WithAIOptions(options)
.WithStorage(sp.GetService<IStorage>())
.Build();
app.OnMessage("/new", MessageHandlers.NewChat);
app.OnFeedbackLoop(FeedbackHandler.OnFeedback);
return app;
});
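For reference, the FeedbackHandler.OnFeedback method registered above is provided by the starter project. A handler that echoes the reaction and feedback text back to the user can be sketched roughly as follows; the namespaces and the FeedbackLoopData property names are assumptions based on the Teams AI library and may differ from the starter code, so treat this as an illustrative sketch rather than the lab's actual implementation:

```csharp
using Microsoft.Bot.Builder;
using Microsoft.Teams.AI.Application;
using Microsoft.Teams.AI.State;

namespace Custom.Engine.Agent;

internal static class FeedbackHandler
{
    internal static async Task OnFeedback(ITurnContext turnContext, TurnState turnState, FeedbackLoopData feedbackLoopData, CancellationToken cancellationToken)
    {
        // Echo the reaction (like/dislike) and the free-text comment back to the user.
        var reaction = feedbackLoopData.ActionValue?.Reaction;
        var feedback = feedbackLoopData.ActionValue?.Feedback;
        await turnContext.SendActivityAsync(
            $"Thank you for your feedback ({reaction}): {feedback}",
            cancellationToken: cancellationToken);
    }
}
```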
Now let's test the changes.
Continuing in Visual Studio:
- To start a debug session, press F5 on your keyboard or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter /new and send the message to clear the conversation history and start a new chat.
- In the message box, enter Can you suggest a candidate who is suitable for a Spanish-speaking role that requires at least 2 years of .NET experience? and send the message. Wait for the response.
- In the response, select either the thumbs up (👍) or thumbs down (👎) icon. A feedback dialog is displayed.
- Enter a message into the message box and submit the feedback. Your reaction and feedback text is displayed in a response.
- Close the browser to stop the debug session.
You've seen so far that Teams AI library provides some user interface components automatically, such as the AI generated label and document citations when you integrated Azure OpenAI On Your Data. Suppose you want more granular control over how responses are presented, for example, you want to display additional controls. Teams AI library allows developers to override the PredictedSAYCommand action, which is responsible for sending the response from the language model to the Teams user interface.
Here, you'll render the language model response in an Adaptive Card. The Adaptive Card displays the language model text response and includes controls to display additional citation information.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, create a file named ResponseCardCreator.cs with the following contents:
using AdaptiveCards;
using Microsoft.Teams.AI.AI.Models;
namespace Custom.Engine.Agent;
internal static class ResponseCardCreator
{
public static AdaptiveCard CreateResponseCard(ChatMessage response)
{
var citations = response.Context.Citations;
var citationCards = new List<AdaptiveAction>();
for (int i = 0; i < citations.Count; i++)
{
var citation = citations[i];
var card = new AdaptiveCard(new AdaptiveSchemaVersion(1, 5))
{
Body = [
new AdaptiveTextBlock
{
Text = citation.Title,
Weight = AdaptiveTextWeight.Bolder,
FontType = AdaptiveFontType.Default
},
new AdaptiveTextBlock
{
Text = citation.Content,
Wrap = true
}
]
};
citationCards.Add(new AdaptiveShowCardAction
{
Title = $"{i + 1}",
Card = card
});
}
var formattedText = FormatResponse(response.GetContent<string>());
var adaptiveCard = new AdaptiveCard(new AdaptiveSchemaVersion(1, 5))
{
Body = [
new AdaptiveTextBlock
{
Text = formattedText,
Wrap = true
},
new AdaptiveTextBlock
{
Text = "Citations",
Weight = AdaptiveTextWeight.Bolder,
FontType = AdaptiveFontType.Default,
Wrap = true
},
new AdaptiveActionSet
{
Actions = citationCards
}
]
};
return adaptiveCard;
}
private static string FormatResponse(string text)
{
return System.Text.RegularExpressions.Regex.Replace(text, @"\[doc(\d+)\]", "**[$1]** ");
}
}
This class is responsible for creating an Adaptive Card that contains the response from the LLM and document citations.
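As an illustration of what FormatResponse does: Azure OpenAI On Your Data embeds citation markers such as [doc1] in the response text, and a pattern like \[doc(\d+)\] rewrites them into bold Markdown reference markers that line up with the numbered citation buttons on the card. The sample input string below is made up for demonstration:

```csharp
using System;
using System.Text.RegularExpressions;

class FormatDemo
{
    static void Main()
    {
        string text = "Maria is a strong match [doc1] with 3 years of .NET experience [doc2].";
        // (\d+) captures the full document number, so multi-digit
        // references such as [doc12] are preserved in the output.
        string formatted = Regex.Replace(text, @"\[doc(\d+)\]", "**[$1]** ");
        Console.WriteLine(formatted);
        // Maria is a strong match **[1]**  with 3 years of .NET experience **[2]** .
    }
}
```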
Next, create an action handler to override the PredictedSAYCommand action.
- Create a file named Actions.cs with the following contents:
using Microsoft.Bot.Builder;
using Microsoft.Teams.AI.AI.Action;
using Microsoft.Teams.AI.AI.Planners;
using Microsoft.Teams.AI.AI;
using AdaptiveCards;
using Microsoft.Bot.Schema;
using Newtonsoft.Json.Linq;
namespace Custom.Engine.Agent;
internal class Actions
{
[Action(AIConstants.SayCommandActionName, isDefault: false)]
public static async Task<string> SayCommandAsync([ActionTurnContext] ITurnContext turnContext, [ActionParameters] PredictedSayCommand command, CancellationToken cancellationToken = default)
{
IMessageActivity activity;
if (command?.Response?.Context?.Citations?.Count > 0)
{
AdaptiveCard card = ResponseCardCreator.CreateResponseCard(command.Response);
Attachment attachment = new()
{
ContentType = AdaptiveCard.ContentType,
Content = card
};
activity = MessageFactory.Attachment(attachment);
}
else
{
activity = MessageFactory.Text(command.Response.GetContent<string>());
}
activity.Entities =
[
new Entity
{
Type = "https://schema.org/Message",
Properties = new()
{
{ "@type", "Message" },
{ "@context", "https://schema.org" },
{ "@id", string.Empty },
{ "additionalType", JArray.FromObject(new string[] { "AIGeneratedContent" } ) }
}
}
];
activity.ChannelData = new
{
feedbackLoopEnabled = true
};
await turnContext.SendActivityAsync(activity, cancellationToken);
return string.Empty;
}
}
The method is responsible for creating and sending a message activity. If the language model response includes citations, it creates an Adaptive Card and attaches it to the message. Otherwise, it sends a simple text message.
An entity that represents the AI generated label is defined in the activity, and channelData is defined to enable the feedback controls. As we are overriding the default handler, we need to provide these in the activity ourselves, otherwise they will not be displayed.
Next, register the action in the agent code.
- In the Custom.Engine.Agent project, open the Program.cs file.
- Register the Actions class with the application after the feedback loop handler.
app.AI.ImportActions(new Actions());
- Save your changes.
Your agent code should look like:
builder.Services.AddTransient<IBot>(sp =>
{
// Create loggers
ILoggerFactory loggerFactory = sp.GetService<ILoggerFactory>();
// Create Prompt Manager
PromptManager prompts = new(new()
{
PromptFolder = "./Prompts"
});
// Create ActionPlanner
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
var dataSources = template.Configuration.Completion.AdditionalData["data_sources"];
var dataSourcesString = JsonSerializer.Serialize(dataSources);
var replacements = new Dictionary<string, string>
{
{ "$azure-search-key$", config.AZURE_SEARCH_KEY },
{ "$azure-search-index-name$", config.AZURE_SEARCH_INDEX_NAME },
{ "$azure-search-endpoint$", config.AZURE_SEARCH_ENDPOINT },
};
foreach (var replacement in replacements)
{
dataSourcesString = dataSourcesString.Replace(replacement.Key, replacement.Value);
}
dataSources = JsonSerializer.Deserialize<JsonElement>(dataSourcesString);
template.Configuration.Completion.AdditionalData["data_sources"] = dataSources;
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
AIOptions<TurnState> options = new(planner)
{
EnableFeedbackLoop = true
};
Application<TurnState> app = new ApplicationBuilder<TurnState>()
.WithAIOptions(options)
.WithStorage(sp.GetService<IStorage>())
.Build();
app.OnMessage("/new", MessageHandlers.NewChat);
app.OnFeedbackLoop(FeedbackHandler.OnFeedback);
app.AI.ImportActions(new Actions());
return app;
});
Now let's test the change.
- To start a debug session, press F5 on your keyboard or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter /new and send the message to clear the conversation history and start a new chat.
- In the message box, enter Can you suggest a candidate who is suitable for a Spanish-speaking role that requires at least 2 years of .NET experience? and send the message. Wait for the response.
- Close the browser to stop the debug session.
Not all company data should be shared outside your organization; some data can be sensitive. As you noted in the previous section, you defined a label in the activity Entities collection which displayed the AI generated label in the response.
Here, you'll update the entity properties to display a new label informing users that the information provided may be sensitive and whether it can be shared outside of your organization.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, open the Actions.cs file.
- Update the Properties collection of the activity Entity with a new property named usageInfo.
{ "usageInfo", JObject.FromObject(
    new JObject(){
        { "@type", "CreativeWork" },
        { "name", "Confidential" },
        { "description", "Sensitive information, do not share outside of your organization." },
    })
}
- Save your changes.
The Entities collection should look like:
activity.Entities =
[
new Entity
{
Type = "https://schema.org/Message",
Properties = new()
{
{ "@type", "Message" },
{ "@context", "https://schema.org" },
{ "@id", string.Empty },
{ "additionalType", JArray.FromObject(new string[] { "AIGeneratedContent" } ) },
{ "usageInfo", JObject.FromObject(
new JObject(){
{ "@type", "CreativeWork" },
{ "name", "Confidential" },
{ "description", "Sensitive information, do not share outside of your organization." },
})
}
}
}
];
Now let's test the change.
- To start a debug session, press F5 on your keyboard or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter /new and send the message to clear the conversation history and start a new chat.
- In the message box, enter Can you suggest a candidate who is suitable for a Spanish-speaking role that requires at least 2 years of .NET experience? and send the message. Wait for the response.
Note that next to the AI Generated label is a new shield icon. Hover over the icon to view the sensitivity information that was provided in the entity properties.
- Close the browser to stop the debug session.
Azure AI Content Safety is an AI service that detects harmful user-generated and AI-generated content in applications and services.
Here, you'll register the Azure Content Safety moderator to moderate both input and output, and add actions that provide custom messages when the content safety measures are triggered.
Continuing in Visual Studio:
- In the Custom.Engine.Agent project, open the Actions.cs file and add the following code to the Actions class.
[Action(AIConstants.FlaggedInputActionName)]
public static async Task<string> OnFlaggedInput([ActionTurnContext] ITurnContext turnContext, [ActionParameters] Dictionary<string, object> entities)
{
    string entitiesJsonString = System.Text.Json.JsonSerializer.Serialize(entities);
    await turnContext.SendActivityAsync($"I'm sorry your message was flagged: {entitiesJsonString}");
    return string.Empty;
}

[Action(AIConstants.FlaggedOutputActionName)]
public static async Task<string> OnFlaggedOutput([ActionTurnContext] ITurnContext turnContext)
{
    await turnContext.SendActivityAsync("I'm not allowed to talk about such things.");
    return string.Empty;
}
- Save your changes.
To save time we have already provisioned an Azure Content Safety resource in Azure for you to use in this lab.
First, let's create some environment variables to store details that we will need to integrate with the service.
Continuing in Visual Studio:
- In the TeamsApp project, expand the env folder.
- Open the .env.local file and add the following, replacing [INSERT ENDPOINT] with the URL of the Azure Content Safety service:
AZURE_CONTENT_SAFETY_ENDPOINT=[INSERT ENDPOINT]
- Save your changes.
- Open the .env.local.user file.
- Add the SECRET_AZURE_CONTENT_SAFETY_KEY variable, replacing [INSERT KEY] with the key of the Azure Content Safety service.
SECRET_AZURE_CONTENT_SAFETY_KEY=[INSERT KEY]
- Save your changes.
Next, let's make sure that these values are written to the appsettings.development.json file so we can access them at runtime in our agent code.
- In the TeamsApp project, open the teamsapp.local.yml file.
- Add the following properties to the file/createOrUpdateJsonFile action:
AZURE_CONTENT_SAFETY_KEY: ${{SECRET_AZURE_CONTENT_SAFETY_KEY}}
AZURE_CONTENT_SAFETY_ENDPOINT: ${{AZURE_CONTENT_SAFETY_ENDPOINT}}
- Save your changes.
The file/createOrUpdateJsonFile action should look like:
- uses: file/createOrUpdateJsonFile
with:
target: ../Custom.Engine.Agent/appsettings.Development.json
content:
BOT_ID: ${{BOT_ID}}
BOT_PASSWORD: ${{SECRET_BOT_PASSWORD}}
AZURE_OPENAI_DEPLOYMENT_NAME: ${{AZURE_OPENAI_DEPLOYMENT_NAME}}
AZURE_OPENAI_KEY: ${{SECRET_AZURE_OPENAI_API_KEY}}
AZURE_OPENAI_ENDPOINT: ${{AZURE_OPENAI_ENDPOINT}}
AZURE_STORAGE_CONNECTION_STRING: UseDevelopmentStorage=true
AZURE_STORAGE_BLOB_CONTAINER_NAME: state
AZURE_SEARCH_ENDPOINT: ${{AZURE_SEARCH_ENDPOINT}}
AZURE_SEARCH_INDEX_NAME: ${{AZURE_SEARCH_INDEX_NAME}}
AZURE_SEARCH_KEY: ${{SECRET_AZURE_SEARCH_KEY}}
AZURE_CONTENT_SAFETY_KEY: ${{SECRET_AZURE_CONTENT_SAFETY_KEY}}
AZURE_CONTENT_SAFETY_ENDPOINT: ${{AZURE_CONTENT_SAFETY_ENDPOINT}}
Now, extend the ConfigOptions model so we can easily access the new environment variable values in code.
- Open Config.cs and update the ConfigOptions class with the following properties:
public string AZURE_CONTENT_SAFETY_KEY { get; set; }
public string AZURE_CONTENT_SAFETY_ENDPOINT { get; set; }
- Save your changes.
The ConfigOptions class should look like:
public class ConfigOptions
{
public string BOT_ID { get; set; }
public string BOT_PASSWORD { get; set; }
public string AZURE_OPENAI_KEY { get; set; }
public string AZURE_OPENAI_ENDPOINT { get; set; }
public string AZURE_OPENAI_DEPLOYMENT_NAME { get; set; }
public string AZURE_STORAGE_CONNECTION_STRING { get; set; }
public string AZURE_STORAGE_BLOB_CONTAINER_NAME { get; set; }
public string AZURE_SEARCH_ENDPOINT { get; set; }
public string AZURE_SEARCH_INDEX_NAME { get; set; }
public string AZURE_SEARCH_KEY { get; set; }
public string AZURE_CONTENT_SAFETY_KEY { get; set; }
public string AZURE_CONTENT_SAFETY_ENDPOINT { get; set; }
}
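These settings flow from appsettings.Development.json into code through configuration binding. The starter project's Program.cs presumably binds them along the following lines; this is a sketch of the standard ASP.NET Core pattern, not necessarily the lab's exact code:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Bind the flat JSON keys (BOT_ID, AZURE_CONTENT_SAFETY_KEY, ...) onto
// the matching properties of the strongly typed ConfigOptions model,
// making them available as the `config` variable used below.
ConfigOptions config = builder.Configuration.Get<ConfigOptions>();
builder.Services.AddSingleton(config);
```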
Now, register the Azure Content Safety moderator.
- Open Program.cs.
- Before the agent logic, register AzureContentSafetyModerator as a service.
builder.Services.AddSingleton<IModerator<TurnState>>(sp =>
    new AzureContentSafetyModerator<TurnState>(new(
        config.AZURE_CONTENT_SAFETY_KEY,
        config.AZURE_CONTENT_SAFETY_ENDPOINT,
        ModerationType.Both
    ))
);
- In the agent logic, update the AIOptions object to register the safety moderator with the application.
AIOptions<TurnState> options = new(planner)
{
    EnableFeedbackLoop = true,
    Moderator = sp.GetService<IModerator<TurnState>>()
};
- Save your changes.
The code to register the content safety moderator and the agent logic in Program.cs should look like:
builder.Services.AddSingleton<IModerator<TurnState>>(sp =>
new AzureContentSafetyModerator<TurnState>(new(
config.AZURE_CONTENT_SAFETY_KEY,
config.AZURE_CONTENT_SAFETY_ENDPOINT,
ModerationType.Both
))
);
// Create the bot as transient. In this case the ASP Controller is expecting an IBot.
builder.Services.AddTransient<IBot>(sp =>
{
// Create loggers
ILoggerFactory loggerFactory = sp.GetService<ILoggerFactory>();
// Create Prompt Manager
PromptManager prompts = new(new()
{
PromptFolder = "./Prompts"
});
// Create ActionPlanner
ActionPlanner<TurnState> planner = new(
options: new(
model: sp.GetService<OpenAIModel>(),
prompts: prompts,
defaultPrompt: async (context, state, planner) =>
{
PromptTemplate template = prompts.GetPrompt("Chat");
var dataSources = template.Configuration.Completion.AdditionalData["data_sources"];
var dataSourcesString = JsonSerializer.Serialize(dataSources);
var replacements = new Dictionary<string, string>
{
{ "$azure-search-key$", config.AZURE_SEARCH_KEY },
{ "$azure-search-index-name$", config.AZURE_SEARCH_INDEX_NAME },
{ "$azure-search-endpoint$", config.AZURE_SEARCH_ENDPOINT },
};
foreach (var replacement in replacements)
{
dataSourcesString = dataSourcesString.Replace(replacement.Key, replacement.Value);
}
dataSources = JsonSerializer.Deserialize<JsonElement>(dataSourcesString);
template.Configuration.Completion.AdditionalData["data_sources"] = dataSources;
return await Task.FromResult(template);
}
)
{ LogRepairs = true },
loggerFactory: loggerFactory
);
AIOptions<TurnState> options = new(planner)
{
EnableFeedbackLoop = true,
Moderator = sp.GetService<IModerator<TurnState>>()
};
Application<TurnState> app = new ApplicationBuilder<TurnState>()
.WithAIOptions(options)
.WithStorage(sp.GetService<IStorage>())
.Build();
app.OnMessage("/new", MessageHandlers.NewChat);
app.OnFeedbackLoop(FeedbackHandler.OnFeedback);
app.AI.ImportActions(new Actions());
return app;
});
Now, let's test the change.
Continuing in Visual Studio:
- Right-click TeamsApp project, expand the Teams Toolkit menu and select Prepare Teams App Dependencies.
- Confirm the prompts and wait until the process completes.
- To start a debug session, press F5 on your keyboard or select the Start button in the toolbar.
Continuing in the web browser:
- In the app dialog, select Open to open the agent in Microsoft Teams.
- In the message box, enter /new and send the message to clear the conversation history and start a new chat.
- In the message box, enter Physical punishment is a way to correct bad behavior and doesn’t cause harm to children. and send the message. Wait for the response.
Notice that the agent response comes from the flagged input action, as the content of the message triggers the content safety policy. The response contains a payload sent from the Azure Content Safety service with details of why the message was flagged.