Merge pull request #816 from JayaniH/azure-openai
Update azure.openai.chat connector
LakshanSS authored Oct 30, 2023
2 parents cbb2906 + 1dd2b8c commit bd358e8
Showing 8 changed files with 1,740 additions and 1,051 deletions.
2 changes: 1 addition & 1 deletion openapi/azure.openai.chat/Ballerina.toml
@@ -6,7 +6,7 @@ name = "azure.openai.chat"
icon = "icon.png"
distribution = "2201.4.1"
repository = "https://github.com/ballerina-platform/openapi-connectors/tree/main/openapi/azure.openai.chat"
version = "1.0.2"
version = "2.0.0"
authors = ["Ballerina"]
[build-options]
observabilityIncluded = true
57 changes: 54 additions & 3 deletions openapi/azure.openai.chat/Module.md
@@ -38,7 +38,7 @@ Create and initialize a `chat:Client` with the obtained `apiKey` and a `serviceUrl`
>**Note:** These operations are in the form of remote operations.

Following is an example of creating a conversation with an OpenAI gpt-35-turbo model:
Following is an example of creating a conversation with an Azure OpenAI chat model:

```ballerina
public function main() returns error? {
@@ -48,14 +48,65 @@ Create and initialize a `chat:Client` with the obtained `apiKey` and a `serviceUrl`
serviceUrl = serviceUrl
);
chat:Chat_completions_body chatBody = {
chat:CreateChatCompletionRequest chatBody = {
messages: [{role: "user", content: "What is Ballerina?"}]
};
chat:Inline_response_200 chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-03-15-preview", chatBody);
chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-08-01-preview", chatBody);
io:println(chatResult);
}
```
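For completeness, the examples in this module reference `apiKey` and `serviceUrl` without declaring them. A minimal preamble such as the following (an assumption, matching the usual pattern for `ballerinax` connectors) makes the snippet compile:

```ballerina
import ballerina/io;
import ballerinax/azure.openai.chat;

// Supplied via Config.toml or environment configuration.
configurable string apiKey = ?;
configurable string serviceUrl = ?;
```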
Following is a sample of using function calling with an Azure OpenAI chat model:
```ballerina
public function main() returns error? {
final chat:Client chatClient = check new (
config = {auth: {apiKey: apiKey}},
serviceUrl = serviceUrl
);
chat:ChatCompletionRequestMessage[] messages = [{role: "user", content: "What is the weather in Seattle?"}];
chat:ChatCompletionFunctions[] functions = [
{
name: "get_current_weather",
description: "Get the current weather in a given location",
parameters: {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city or town to get the weather for"
},
"unit": {
"type": "string",
"enum": ["celsius", "fahrenheit"]
}
},
"required": ["location"]
}
}
];
chat:CreateChatCompletionRequest chatBody = {messages, functions};
chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-08-01-preview", chatBody);
io:println(chatResult);
chat:ChatCompletionRequestMessage_function_call? functionCall = chatResult.choices[0].message?.function_call;
if functionCall is chat:ChatCompletionRequestMessage_function_call {
messages.push({role: "assistant", content: (), function_call: functionCall});
// Invoke the function [functionCall.name] with the arguments [functionCall.arguments] and get the output [functionOutput]
messages.push({role: "function", name: functionCall.name, content: functionOutput.toString()});
}
}
```
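The `functionOutput` placeholder in the snippet above is left to the application: the connector only returns the model's `function_call`, and the caller is responsible for executing the named function. The following is a minimal, hypothetical sketch (not part of this change) of how the declared `get_current_weather` function could be resolved locally; it assumes the generated `function_call` record exposes `name` and `arguments` as plain strings, with `arguments` being a JSON-encoded object such as `{"location": "Seattle"}`:

```ballerina
// Hypothetical local implementation of the `get_current_weather` function declared above.
// A real implementation would call a weather service; a fixed value keeps the sketch self-contained.
function getCurrentWeather(string location, string unit = "celsius") returns string {
    return string `12 ${unit}, overcast in ${location}`;
}

// Maps the model's function call (name + JSON-encoded arguments) onto the local function.
function invokeFunction(string name, string arguments) returns string|error {
    if name == "get_current_weather" {
        json parsed = check arguments.fromJsonString();
        map<json> args = check parsed.ensureType();
        string location = check args["location"].ensureType();
        return getCurrentWeather(location);
    }
    return error("Unsupported function: " + name);
}
```

With such a helper, `functionOutput` could be computed as `string functionOutput = check invokeFunction(functionCall.name, functionCall.arguments);` before the `function` role message is pushed.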
2. Use `bal run` command to compile and run the Ballerina program.
4 changes: 2 additions & 2 deletions openapi/azure.openai.chat/Package.md
@@ -2,10 +2,10 @@ Connects to [Azure OpenAI Chat Completions API](https://learn.microsoft.com/en-u

### Package overview

The `azure.openai.chat` is a [Ballerina](https://ballerina.io/) connector for connecting to the Azure OpenAI Service REST API Chat Completions Endpoint.
The `azure.openai.chat` is a [Ballerina](https://ballerina.io/) connector for connecting to the Azure OpenAI Service REST API Chat Completions and Chat Completions extensions Endpoints.

#### Compatibility
Azure OpenAI Service REST API: v2023-03-15-preview
Azure OpenAI Service REST API: v2023-08-01-preview

## Report issues
To report bugs, request new features, start new discussions, view project boards, etc., go to the [Ballerina Extended Library repository](https://github.com/ballerina-platform/ballerina-extended-library).
28 changes: 25 additions & 3 deletions openapi/azure.openai.chat/client.bal
@@ -1,6 +1,9 @@
// AUTO-GENERATED FILE. DO NOT MODIFY.
// This file is auto-generated by the Ballerina OpenAPI tool.

import ballerina/http;

# This is a generated connector from [Azure OpenAI Chat Completions API v2023-03-15-preview](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions/) OpenAPI specification.
# This is a generated connector from [Azure OpenAI Chat Completions API v2023-08-01-preview](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions/) OpenAPI specification.
# The Azure OpenAI Service REST API Chat Completions Endpoint will create completions for chat messages with the ChatGPT (preview) and GPT-4 (preview) models.
@display {label: "Azure OpenAI Chat", iconPath: "icon.png"}
public isolated client class Client {
@@ -50,7 +53,7 @@ public isolated client class Client {
#
# + return - OK
@display {label: "Create Chat Completion"}
resource isolated function post deployments/[string deploymentId]/chat/completions(string apiVersion, Chat_completions_body payload) returns Inline_response_200|error {
resource isolated function post deployments/[string deploymentId]/chat/completions(string apiVersion, CreateChatCompletionRequest payload) returns CreateChatCompletionResponse|error {
string resourcePath = string `/deployments/${getEncodedUri(deploymentId)}/chat/completions`;
map<any> headerValues = {};
map<anydata> queryParam = {"api-version": apiVersion};
@@ -62,7 +65,26 @@
http:Request request = new;
json jsonBody = payload.toJson();
request.setPayload(jsonBody, "application/json");
Inline_response_200 response = check self.clientEp->post(resourcePath, request, httpHeaders);
CreateChatCompletionResponse response = check self.clientEp->post(resourcePath, request, httpHeaders);
return response;
}
# Using extensions to create a completion for the chat messages.
#
# + return - OK
@display {label: "Create Extensions Chat Completion"}
resource isolated function post deployments/[string deploymentId]/extensions/chat/completions(string apiVersion, ExtensionsChatCompletionsRequest payload) returns ExtensionsChatCompletionsResponse|error {
string resourcePath = string `/deployments/${getEncodedUri(deploymentId)}/extensions/chat/completions`;
map<any> headerValues = {};
map<anydata> queryParam = {"api-version": apiVersion};
if self.apiKeyConfig is ApiKeysConfig {
headerValues["api-key"] = self.apiKeyConfig?.apiKey;
}
resourcePath = resourcePath + check getPathForQueryParam(queryParam);
map<string|string[]> httpHeaders = getMapForHeaders(headerValues);
http:Request request = new;
json jsonBody = payload.toJson();
request.setPayload(jsonBody, "application/json");
ExtensionsChatCompletionsResponse response = check self.clientEp->post(resourcePath, request, httpHeaders);
return response;
}
}
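
The new `Create Extensions Chat Completion` resource is invoked the same way as the existing chat completions resource. Below is a minimal, hypothetical usage sketch; the exact fields of `ExtensionsChatCompletionsRequest` are defined in the generated types.bal (not shown in this excerpt), so the `dataSources` shape here follows the Azure 2023-08-01-preview "bring your own data" API and is an assumption for illustration only:

```ballerina
import ballerina/io;
import ballerinax/azure.openai.chat;

configurable string apiKey = ?;
configurable string serviceUrl = ?;

public function main() returns error? {
    final chat:Client chatClient = check new (
        config = {auth: {apiKey: apiKey}},
        serviceUrl = serviceUrl
    );
    // Field names below are assumptions for illustration; verify them against the
    // generated ExtensionsChatCompletionsRequest record in types.bal.
    chat:ExtensionsChatCompletionsRequest extensionsBody = {
        messages: [{role: "user", content: "What does the handbook say about remote work?"}],
        dataSources: [
            {
                "type": "AzureCognitiveSearch",
                parameters: {
                    endpoint: "https://<search-resource>.search.windows.net",
                    key: "<search-admin-key>",
                    indexName: "<index-name>"
                }
            }
        ]
    };
    chat:ExtensionsChatCompletionsResponse extensionsResult =
        check chatClient->/deployments/["chat"]/extensions/chat/completions.post("2023-08-01-preview", extensionsBody);
    io:println(extensionsResult);
}
```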