Python: KernelHttpServer (#915)
### Motivation and Context
This is a Python implementation of the KernelHttpServer, aiming to
replicate the same functionality the C# version offers.

see: #851 

### Description

Implementation of the KernelHttpServer in Python. Not all sample
applications are supported yet (some depend on skills that have no
Python implementation).

Supported endpoints:
- [POST] api/skills/{skillName}/invoke/{functionName}
- [GET] api/ping
- [POST] api/planner/createplan
- [POST] api/planner/execute/{maxSteps}


Status of each sample application:

- **book-creator-webapp-react**: Working
- **chat-summary-webapp-react**: Not working; the ConversationSummary skill is missing in Python
- **github-qna-webapp-react**: Not working; the GitHub skill is missing in Python
- **auth-api-webapp-react**: Not working; the ConversationSummary skill and TaskListSkill are missing in Python

### Contribution Checklist
<!-- Before submitting this PR, please make sure: -->
- [x] The code builds clean without any errors or warnings
- [x] The PR follows SK Contribution Guidelines
(https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
- [x] The code follows the .NET coding conventions
(https://learn.microsoft.com/dotnet/csharp/fundamentals/coding-style/coding-conventions)
verified with `dotnet format`
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄

---------

Co-authored-by: Abby Harrison <[email protected]>
Co-authored-by: Mark Karle <[email protected]>
Co-authored-by: Shawn Callegari <[email protected]>
Co-authored-by: Dmytro Struk <[email protected]>
5 people authored Jul 19, 2023
1 parent 8492d50 commit 9bdeb42
Showing 11 changed files with 517 additions and 0 deletions.
8 changes: 8 additions & 0 deletions samples/python/kernel_http_server/.funcignore
Original file line number Diff line number Diff line change
@@ -0,0 +1,8 @@
.git*
.vscode
__azurite_db*__.json
__blobstorage__
__queuestorage__
local.settings.json
test
.venv
93 changes: 93 additions & 0 deletions samples/python/kernel_http_server/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,93 @@
# Semantic Kernel Service API (For Learning Samples)

Watch the [Service API Quick Start Video](https://aka.ms/SK-Local-API-Setup).

This service API is written in Python against the Azure Functions Runtime v4 and exposes
some Semantic Kernel APIs that you can call via HTTP POST requests for the learning samples.

![azure-function-diagram](https://user-images.githubusercontent.com/146438/222305329-0557414d-38ce-4712-a7c1-4f6c63c20320.png)

## !IMPORTANT

> This service API is for educational purposes only and should not be used in any production
> scenario. It is intended to illustrate Semantic Kernel concepts, not architectural or
> security best practices.

## Prerequisites

[Azure Functions Core Tools](https://learn.microsoft.com/azure/azure-functions/functions-run-local)
installation is required for this service API to run locally.

## Configuring the host

This starter can be configured in two ways:

- With a `.env` file in the project that holds API keys and other secrets and configuration
- With HTTP headers sent on each request

Make sure you have an
[OpenAI API key](https://openai.com/api/) or an
[Azure OpenAI service key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=rest-api).

### Configure with a .env file

Copy the `.env.example` file to a new file named `.env`, then fill in the following keys:

```
OPENAI_API_KEY=""
OPENAI_ORG_ID=""
AZURE_OPENAI_DEPLOYMENT_NAME=""
AZURE_OPENAI_ENDPOINT=""
AZURE_OPENAI_API_KEY=""
```
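The sample loads these settings through semantic_kernel's dot-env helpers, but the file format itself is simple. Below is a minimal stdlib-only sketch of parsing a `.env` file like the one above; the keys and values shown are placeholders, and this is an illustration of the format rather than the loader the sample actually uses.

```python
def parse_dotenv(text: str) -> dict:
    """Parse KEY="value" lines, skipping blanks and # comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip().strip('"')
    return settings


# Placeholder values for illustration only.
example = '''
OPENAI_API_KEY="sk-test"
AZURE_OPENAI_ENDPOINT="https://my-endpoint.openai.azure.com"
'''
settings = parse_dotenv(example)
```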

### Configure with HTTP Headers

On each HTTP request, use these headers:

```
"x-ms-sk-completion-model" # e.g. text-davinci-003
"x-ms-sk-completion-endpoint" # e.g. https://my-endpoint.openai.azure.com
"x-ms-sk-completion-backend" # AZURE_OPENAI or OPENAI
"x-ms-sk-completion-key" # Your API key
```
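As a sketch of the header-based configuration, the snippet below builds the request headers dict from the names documented above; the model, endpoint, and key values are hypothetical placeholders.

```python
def build_sk_headers(model: str, endpoint: str, backend: str, key: str) -> dict:
    """Assemble the per-request configuration headers documented above."""
    return {
        "x-ms-sk-completion-model": model,
        "x-ms-sk-completion-endpoint": endpoint,
        "x-ms-sk-completion-backend": backend,  # AZURE_OPENAI or OPENAI
        "x-ms-sk-completion-key": key,
    }


# Hypothetical values for an Azure OpenAI-backed request.
headers = build_sk_headers(
    model="text-davinci-003",
    endpoint="https://my-endpoint.openai.azure.com",
    backend="AZURE_OPENAI",
    key="fake-key",  # placeholder, not a real key
)
```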

## Running the service API locally

**Run** `python -m venv .venv && .\.venv\Scripts\python -m pip install -r requirements.txt && .venv\Scripts\activate`
to create and activate a virtual environment (the command above uses Windows paths; adjust for Linux/macOS).
**Run** `func start` from the command line. This will run the service API locally at `http://localhost:7071`.

The service API exposes four endpoints:

- **InvokeFunction**: [POST] `http://localhost:7071/api/skills/{skillName}/invoke/{functionName}`
- **Ping**: [GET] `http://localhost:7071/api/ping`
- **CreatePlan**: [POST] `http://localhost:7071/api/planner/createplan`
- **ExecutePlan**: [POST] `http://localhost:7071/api/planner/execute/{maxSteps}`

They accept input in the JSON body of the request:

```json
{
    "value": "",       // the "input" of the prompt
    "inputs": [        // a list of extra key-value parameters for ContextVariables or Plan state
        { "key": "", "value": "" }
    ],
    "skills": []       // a list of skills to use (for the planner)
}
```

For planning, first create a plan with your prompt in the "value" parameter. Take the JSON "state" response,
rename "state" to "inputs", and use it for the input to ExecutePlan.
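The create-plan-then-execute handoff described above can be sketched as a small helper that takes the CreatePlan response and builds the ExecutePlan request body; the plan content and the `PLAN__INPUT` key in the usage line are hypothetical examples, not values the API is guaranteed to return.

```python
def plan_response_to_execute_request(create_plan_response: dict) -> dict:
    """Rename "state" to "inputs" so the CreatePlan response can be
    posted back as the ExecutePlan request body."""
    body = dict(create_plan_response)  # shallow copy; leave the original intact
    body["inputs"] = body.pop("state", [])
    return body


# Hypothetical CreatePlan response for illustration.
response = {"value": "<plan xml>", "state": [{"key": "PLAN__INPUT", "value": "travel"}]}
request_body = plan_response_to_execute_request(response)
```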

## Next steps

Now that your service API is running locally,
try it out in a sample app to learn core Semantic Kernel concepts!
The service API must be running for each sample app you want to try.

Sample app learning examples:

- [Book creator](../../apps/book-creator-webapp-react/README.md) – learn how Planner and chaining of
semantic functions can be used in your app
29 changes: 29 additions & 0 deletions samples/python/kernel_http_server/function_app.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,29 @@
import logging

import azure.functions as func

from utils.kernel_server import KernelServer

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)


@app.route(route="skills/{skillName}/invoke/{functionName}", methods=["POST"])
async def completion(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Completion request")
    kernel_server = KernelServer()
    return await kernel_server.completion(req)


@app.route(route="ping", methods=["GET"])
async def ping(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Ping request")
    return func.HttpResponse()


@app.route(route="planner/createplan", methods=["POST"])
async def create_plan(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Create Plan Request")
    kernel_server = KernelServer()
    return await kernel_server.create_plan(req)


@app.route(route="planner/execute/{maxSteps}", methods=["POST"])
async def execute_plan(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Execute Plan Request")
    kernel_server = KernelServer()
    return await kernel_server.execute_plan(req, req.route_params.get("maxSteps"))
15 changes: 15 additions & 0 deletions samples/python/kernel_http_server/host.json
Original file line number Diff line number Diff line change
@@ -0,0 +1,15 @@
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            }
        }
    },
    "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[4.*, 5.0.0)"
    }
}
10 changes: 10 additions & 0 deletions samples/python/kernel_http_server/local.settings.json
Original file line number Diff line number Diff line change
@@ -0,0 +1,10 @@
{
    "IsEncrypted": false,
    "Host": {
        "CORS": "*"
    },
    "Values": {
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "AzureWebJobsFeatureFlags": "EnableWorkerIndexing"
    }
}
7 changes: 7 additions & 0 deletions samples/python/kernel_http_server/requirements.txt
Original file line number Diff line number Diff line change
@@ -0,0 +1,7 @@
# Do not include azure-functions-worker in this file
# The Python Worker is managed by the Azure Functions platform
# Manually managing azure-functions-worker may cause unexpected issues

azure-functions
semantic-kernel
dataclasses_json
26 changes: 26 additions & 0 deletions samples/python/kernel_http_server/utils/ask.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,26 @@
from dataclasses import dataclass

from typing import List, Optional

from dataclasses_json import dataclass_json


@dataclass_json
@dataclass
class AskInput:
    key: Optional[str] = None
    value: Optional[str] = None


@dataclass_json
@dataclass
class Ask:
    skills: Optional[List[str]] = None
    inputs: Optional[List[AskInput]] = None
    value: Optional[str] = None


@dataclass_json
@dataclass
class AskResult:
    value: Optional[str] = None
    state: Optional[List[AskInput]] = None
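The sample leans on dataclasses_json to deserialize the request body into these dataclasses. As a stdlib-only sketch of what that deserialization amounts to for the `Ask` shape above (approximating, not reproducing, dataclasses_json's `from_json`):

```python
import json
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class AskInput:
    key: Optional[str] = None
    value: Optional[str] = None


@dataclass
class Ask:
    skills: List[str] = field(default_factory=list)
    inputs: List[AskInput] = field(default_factory=list)
    value: Optional[str] = None


def ask_from_json(raw: str) -> Ask:
    """Build an Ask from a JSON request body, nesting the inputs list."""
    data = json.loads(raw)
    inputs = [AskInput(**item) for item in data.get("inputs", [])]
    return Ask(skills=data.get("skills", []), inputs=inputs, value=data.get("value"))


ask = ask_from_json('{"value": "hello", "inputs": [{"key": "name", "value": "SK"}]}')
```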
72 changes: 72 additions & 0 deletions samples/python/kernel_http_server/utils/config.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,72 @@
from dataclasses import dataclass

from enum import Enum

from typing import Optional

import semantic_kernel as sk


DEFAULT_OPENAI_MODEL = "text-davinci-003"


class AIService(Enum):
    AZURE_OPENAI = "0"
    OPENAI = "1"


class SKHttpHeaders(Enum):
    COMPLETION_MODEL = "x-ms-sk-completion-model"
    COMPLETION_ENDPOINT = "x-ms-sk-completion-endpoint"
    COMPLETION_SERVICE = "x-ms-sk-completion-backend"
    COMPLETION_KEY = "x-ms-sk-completion-key"
    EMBEDDING_MODEL = "x-ms-sk-embedding-model"
    EMBEDDING_ENDPOINT = "x-ms-sk-embedding-endpoint"
    EMBEDDING_SERVICE = "x-ms-sk-embedding-backend"
    EMBEDDING_KEY = "x-ms-sk-embedding-key"
    MS_GRAPH = "x-ms-sk-msgraph"


@dataclass
class AIServiceConfig:
    deployment_model_id: str
    endpoint: Optional[str]
    key: str
    serviceid: str
    org_id: Optional[str] = None


def headers_to_config(headers: dict) -> AIServiceConfig:
    if SKHttpHeaders.COMPLETION_MODEL.value in headers:
        return AIServiceConfig(
            deployment_model_id=headers[SKHttpHeaders.COMPLETION_MODEL.value],
            endpoint=headers[SKHttpHeaders.COMPLETION_ENDPOINT.value],
            key=headers[SKHttpHeaders.COMPLETION_KEY.value],
            serviceid=headers[SKHttpHeaders.COMPLETION_SERVICE.value],
        )
    elif SKHttpHeaders.EMBEDDING_MODEL.value in headers:
        return AIServiceConfig(
            deployment_model_id=headers[SKHttpHeaders.EMBEDDING_MODEL.value],
            endpoint=headers[SKHttpHeaders.EMBEDDING_ENDPOINT.value],
            key=headers[SKHttpHeaders.EMBEDDING_KEY.value],
            serviceid=headers[SKHttpHeaders.EMBEDDING_SERVICE.value],
        )
    raise ValueError("No valid headers found")


def dotenv_to_config(use_azure_openai=True) -> AIServiceConfig:
    if use_azure_openai:
        deployment_model_id, api_key, endpoint = sk.azure_openai_settings_from_dot_env()
        return AIServiceConfig(
            deployment_model_id=deployment_model_id,
            endpoint=endpoint,
            key=api_key,
            serviceid=AIService.AZURE_OPENAI.value,
        )
    else:
        api_key, org_id = sk.openai_settings_from_dot_env()
        return AIServiceConfig(
            deployment_model_id=DEFAULT_OPENAI_MODEL,
            endpoint=None,
            key=api_key,
            serviceid=AIService.OPENAI.value,
            org_id=org_id,
        )
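The dispatch in `headers_to_config` gives completion headers precedence over embedding headers and rejects requests with neither. A standalone sketch of that precedence logic (mirroring, not importing, the function above, so it runs without semantic_kernel installed):

```python
def pick_backend(headers: dict) -> str:
    """Return which header group configures the request: completion wins,
    then embedding; otherwise the request is rejected."""
    if "x-ms-sk-completion-model" in headers:
        return "completion"
    if "x-ms-sk-embedding-model" in headers:
        return "embedding"
    raise ValueError("No valid headers found")
```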