This accelerator enables users to quickly deploy the Azure infrastructure required to run the Knowledge Bot HR application. The chatbot application, which includes both frontend and backend code, is packaged into containers that are stored in Azure Container Registry and hosted on Azure Web Apps for Containers. Users can provision the necessary infrastructure with just a few clicks.
The following resources will be deployed automatically as part of this accelerator:
- Azure Container Registry (ACR): Stores the container images for the chatbot's frontend and backend.
- App Service Plan: A hosting plan for the web applications (frontend and backend) and Function App.
- Azure Web Apps: Two Linux web apps—one for the backend and one for the frontend—both running on containers.
- Azure Storage Account: Stores data, documents, and other files used by the chatbot.
- Azure Cognitive Search: Provides search indexing and query capabilities for uploaded documents.
- Azure Cache for Redis: In-memory caching service used to improve application performance.
- Azure Key Vault: Manages secrets and certificates securely, such as API keys and connection strings.
- Azure Application Insights: Provides monitoring and diagnostics for the backend and blob-triggered functions.
- Log Analytics Workspace: Centralized workspace for collecting and analyzing logs.
- Azure Function App: Handles blob storage triggers and other serverless backend operations.
- Azure Event Grid: Manages event routing for storage account events (e.g., blob creation).
- Azure OpenAI Service (Optional): Hosts AI models like GPT-3.5 and GPT-4 for natural language processing in the chatbot.
- Azure Document Intelligence: Enhances the chatbot's ability to understand and extract information from documents.
- Azure SQL Database: Stores relational data for the chatbot's backend.
- Managed Identity: Provides a secure identity for accessing other Azure resources without requiring credentials in code (see the sketch after this list).
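As a concrete illustration of how the managed identity keeps credentials out of code, here is a minimal sketch of the backend reading a secret from Key Vault with `DefaultAzureCredential`. The vault URL and secret name are placeholders, not values created by this accelerator.

```python
# Minimal sketch: read a connection string from Key Vault using the
# web app's managed identity, so no secret appears in code or config.
# The vault URL and secret name below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

credential = DefaultAzureCredential()  # resolves to the managed identity when running in Azure
client = SecretClient(
    vault_url="https://<your-key-vault-name>.vault.azure.net",
    credential=credential,
)
storage_connection_string = client.get_secret("storage-connection-string").value
```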
Once deployed, the chatbot is ready to be customized with the necessary configurations and integrated with external services like Azure AD for authentication and Azure OpenAI for advanced language capabilities.
Before deploying the accelerator, ensure you have the following:
- GitHub account (Admin access).
- Azure Subscription with the following:
  - An existing Azure Resource Group with Owner permissions.
  - Azure OpenAI enabled (if using OpenAI models).
  - Cognitive Services multi-service account: Ensure you have accepted the Responsible AI Terms and Conditions. Review terms here.
- Azure Active Directory (AD): Permissions to create an app registration, assign roles, and configure authentication.
To deploy the necessary Azure infrastructure, click on the appropriate Deploy to Azure button below:
Important: This will provision the infrastructure only. You must complete the following manual steps to finalize the deployment.
- Navigate to Azure Active Directory in your Azure portal.
- Select App Registrations > New Registration to create a new AD app.
- Under Redirect URI, enter the URL where your frontend is hosted (e.g., the URL of the frontend Azure Web App).
- Set up the necessary API permissions, roles, and groups:
  - Add any required permissions, such as Microsoft Graph (e.g., reading user profile data).
  - Configure the authentication methods for the app (OAuth2, OpenID Connect, etc.); a minimal backend sketch follows this list.
- Save your changes.
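For reference, the sketch below shows one way the backend could use the new app registration for OpenID Connect sign-in with MSAL for Python. The tenant ID, client ID, client secret, scopes, and redirect URI are placeholders taken from your own registration; the actual application may instead rely on App Service built-in authentication.

```python
# Minimal sketch, assuming the tenant ID, client ID, client secret, and
# redirect URI come from the app registration created above.
import msal

AUTHORITY = "https://login.microsoftonline.com/<your-tenant-id>"
REDIRECT_URI = "https://<your-frontend-hostname>/auth/callback"
SCOPES = ["User.Read"]  # Microsoft Graph permission granted above

app = msal.ConfidentialClientApplication(
    client_id="<your-client-id>",
    client_credential="<your-client-secret>",
    authority=AUTHORITY,
)

# Step 1: redirect the user to this URL to sign in.
auth_url = app.get_authorization_request_url(SCOPES, redirect_uri=REDIRECT_URI)

# Step 2: exchange the authorization code returned to the redirect URI for tokens.
def handle_callback(code: str) -> dict:
    return app.acquire_token_by_authorization_code(code, scopes=SCOPES, redirect_uri=REDIRECT_URI)
```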
If your Knowledge Bot HR integrates with OpenAI models, follow these steps:
- Navigate to your Azure OpenAI Resource in the Azure portal.
- Go to the Deployments tab and create a new deployment for the desired model (e.g., GPT-3.5, GPT-4).
- Select the model version, configure the required parameters, and click Deploy.
- After deployment, note the endpoint URL and API key, as you will need these when configuring the chatbot's backend (a minimal call sketch follows this list).
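Once the deployment exists, the backend addresses it by deployment name. Below is a minimal sketch using the `openai` Python package; the endpoint, API key, API version, and deployment name are placeholders for your own resource.

```python
# Minimal sketch: call the model deployment created above.
# Endpoint, API key, API version, and deployment name are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the deployment name chosen in the Deployments tab
    messages=[{"role": "user", "content": "Summarize the parental leave policy."}],
)
print(response.choices[0].message.content)
```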
- Go to the Azure Function App resource that was provisioned during deployment.
- In the Functions tab, click Add to create a new function.
- Select the Blob Trigger template, which matches the document-ingestion flow the chatbot relies on.
- Add the function code that handles document ingestion for the chatbot's backend (a minimal sketch follows this list).
- Use the Deployment Center to deploy the code through your GitHub connection, pointing it at the ingestionfunction folder.
- Save and test the function by uploading a file from the chatbot.
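The actual ingestion logic lives in the repository's ingestionfunction folder; the sketch below only illustrates the general shape of a blob-triggered function using the Python v2 programming model. The container path and connection setting name are assumptions, not the accelerator's exact values.

```python
# Minimal sketch of a blob-triggered function (Python v2 programming model).
# The container path and connection setting are assumptions; the real
# ingestion code is in the repository's ingestionfunction folder.
import logging
import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="documents/{name}", connection="AzureWebJobsStorage")
def ingest_document(blob: func.InputStream) -> None:
    logging.info("Processing uploaded document: %s (%s bytes)", blob.name, blob.length)
    content = blob.read()
    # ...extract text (e.g., with Document Intelligence) and push it into the search index...
```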
- Once all steps are complete, navigate to your frontend URL and test the chatbot application.
- Ensure that the chatbot can authenticate users (via Azure AD), interact with OpenAI models (if applicable), and store results (e.g., in Azure Blob Storage).
- Monitor logs and errors in the Function App and the frontend/backend Web Apps (via Application Insights) for debugging; a test-upload sketch for exercising the ingestion path follows this list.
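To exercise the ingestion path end to end, you can also drop a test document straight into the storage account and confirm that the function fires. A minimal sketch with `azure-storage-blob` follows; the connection string, container name, and file name are placeholders.

```python
# Minimal sketch: upload a test document to trigger the ingestion function.
# Connection string, container name, and file name are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-storage-connection-string>")
blob = service.get_blob_client(container="documents", blob="hr-policy-test.pdf")

with open("hr-policy-test.pdf", "rb") as data:
    blob.upload_blob(data, overwrite=True)
# Then check the Function App's logs in Application Insights to confirm the file was processed.
```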
- Authentication issues: Ensure the redirect URI and permissions are correctly configured in Azure AD.
- Model deployment issues: Verify that the correct model is deployed in the Azure OpenAI resource and that the correct API key is used in your application.
- Azure Function errors: Check the function logs in the Azure portal for detailed error messages; a log-query sketch follows this list.
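If you prefer to query logs programmatically, the sketch below pulls recent application traces from the Log Analytics workspace with `azure-monitor-query`. It assumes workspace-based Application Insights (the `AppTraces` table) and uses a placeholder workspace ID.

```python
# Minimal sketch: query recent application traces from the Log Analytics workspace.
# Assumes workspace-based Application Insights (AppTraces table); the workspace ID is a placeholder.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<your-log-analytics-workspace-id>",
    query="AppTraces | order by TimeGenerated desc | take 20",
    timespan=timedelta(hours=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```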
If you have suggestions or find any issues, feel free to open a pull request or raise an issue in the project repository.
This project is licensed under the MIT License.