Merge pull request #830 from nadheesh/main
Upgrade OpenAI and Azure OpenAI chat connectors
NipunaRanasinghe authored Jan 30, 2024
2 parents afa9af5 + acb85df commit 6f6ee9b
Showing 15 changed files with 11,832 additions and 2,956 deletions.
4 changes: 2 additions & 2 deletions openapi/azure.openai.chat/Ballerina.toml
@@ -4,9 +4,9 @@ keywords = ["AI/Chat", "Azure OpenAI", "Cost/Paid", "GPT-3.5", "ChatGPT", "Vendo
org = "ballerinax"
name = "azure.openai.chat"
icon = "icon.png"
distribution = "2201.4.1"
distribution = "2201.8.4"
repository = "https://github.com/ballerina-platform/openapi-connectors/tree/main/openapi/azure.openai.chat"
version = "2.0.1"
version = "3.0.0"
authors = ["Ballerina"]
[build-options]
observabilityIncluded = true
116 changes: 88 additions & 28 deletions openapi/azure.openai.chat/Module.md
@@ -1,7 +1,7 @@
## Overview
This is a generated connector from [Azure OpenAI Chat Completions API](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions/) OpenAPI specification.

-The Azure OpenAI Service REST API Chat Completions Endpoint will create completions for chat messages with the ChatGPT (preview) and GPT-4 (preview) models.
+The Azure OpenAI Service REST API Chat Completions Endpoint will create completions for chat messages with the GPT-3.5 (preview), GPT-4 (preview), and GPT-4 Vision models.

## Prerequisites
- Create an [Azure](https://azure.microsoft.com/en-us/features/azure-portal/) account
@@ -52,7 +52,7 @@ Create and initialize a `chat:Client` with the obtained `apiKey` and a `serviceU
        messages: [{role: "user", content: "What is Ballerina?"}]
    };
-    chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-08-01-preview", chatBody);
+    chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-12-01-preview", chatBody);
    io:println(chatResult);
}
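The hunks above show only the changed lines of the basic chat completion sample. For orientation, a complete minimal program against the upgraded connector might look roughly like the following; the configurable `apiKey`, `serviceUrl`, and `deploymentId` values and the module import are assumptions based on the samples in this Module.md, not part of the diff itself.

```ballerina
import ballerina/io;
import ballerinax/azure.openai.chat;

// Assumed configuration values; supply them via Config.toml.
configurable string apiKey = ?;
configurable string serviceUrl = ?;
configurable string deploymentId = ?;

public function main() returns error? {
    // Initialize the client with the Azure OpenAI resource key and endpoint.
    final chat:Client chatClient = check new (
        config = {auth: {apiKey: apiKey}},
        serviceUrl = serviceUrl
    );

    chat:CreateChatCompletionRequest chatBody = {
        messages: [{role: "user", "content": "What is Ballerina?"}]
    };

    // The upgraded connector targets api-version 2023-12-01-preview.
    chat:CreateChatCompletionResponse chatResult =
        check chatClient->/deployments/[deploymentId]/chat/completions.post("2023-12-01-preview", chatBody);
    io:println(chatResult);
}
```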
@@ -68,44 +68,104 @@ Create and initialize a `chat:Client` with the obtained `apiKey` and a `serviceU
        serviceUrl = serviceUrl
    );
-    chat:ChatCompletionRequestMessage[] messages = [{role: "user", content: "What is the weather in Seattle?"}];
-    chat:ChatCompletionFunctions[] functions = [
+    chat:ChatCompletionRequestMessage[] messages = [{role: "user", "content": "What is the weather in Seattle?"}];
+    chat:ChatCompletionTool[] tools = [
        {
-            name: "get_current_weather",
-            description: "Get the current weather in a given location",
-            parameters: {
-                "type": "object",
-                "properties": {
-                    "location": {
-                        "type": "string",
-                        "description": "The city or town to get the weather for"
-                    },
-                    "unit": {
-                        "type": "string",
-                        "enum": ["celsius", "fahrenheit"]
-                    }
-                },
-                "required": ["location"]
+            'type: "function",
+            'function: {
+                name: "get_current_weather",
+                description: "Get the current weather in a given location",
+                parameters: {
+                    "type": "object",
+                    "properties": {
+                        "location": {
+                            "type": "string",
+                            "description": "The city or town to get the weather for"
+                        },
+                        "unit": {
+                            "type": "string",
+                            "enum": ["celsius", "fahrenheit"]
+                        }
+                    },
+                    "required": ["location"]
+                }
            }
        }
    ];
-    chat:CreateChatCompletionRequest chatBody = {messages, functions};
+    chat:CreateChatCompletionRequest chatBody = {messages, tools};
-    chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/["chat"]/chat/completions.post("2023-08-01-preview", chatBody);
+    chat:CreateChatCompletionResponse chatResult = check chatClient->/deployments/[deployementId]/chat/completions.post("2023-12-01-preview", chatBody);
    io:println(chatResult);
-    chat:ChatCompletionRequestMessage_function_call? functionCall = chatResult.choices[0].message?.function_call;
-    if functionCall is chat:ChatCompletionRequestMessage_function_call {
-        messages.push({role: "assistant", content: (), function_call: functionCall});
-        // Invoke the function [functionCall.name] with the arguments [functionCall.arguments] and get the output [functionOutput]
-        messages.push({role: "function", name: functionCall.name, content: functionOutput.toString()});
-    }
+    record {|chat:ChatCompletionResponseMessage message?; chat:ContentFilterChoiceResults content_filter_results?; int index?; string finish_reason?; anydata...;|}[] choices = check chatResult.choices.ensureType();
+    // continue the chat
+    chat:ChatCompletionRequestMessage message = check choices[0].message.cloneWithType();
+    messages.push(message);
+    // check if there are any tool calls
+    chat:ChatCompletionMessageToolCall[]? toolCalls = choices[0].message?.tool_calls;
+    if toolCalls is chat:ChatCompletionMessageToolCall[] {
+        foreach chat:ChatCompletionMessageToolCall toolCall in toolCalls {
+            string functionName = toolCall.'function.name;
+            string functionArguments = toolCall.'function.arguments;
+            // invoke the function
+            anydata functionResponse = "<function response>";
+            messages.push(
+                {
+                    role: "tool",
+                    "tool_call_id": toolCall.id,
+                    "name": functionName,
+                    "content": functionResponse
+                });
+        }
+    }
+    // do the second chat request
+    chatResult = check chatClient->/deployments/["chatgpt"]/chat/completions.post("2023-12-01-preview", {messages});
+    io:println(chatResult);
}
```
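The `// invoke the function` placeholder in the new tool-calling sample is where application code actually runs the requested tool. As a rough sketch only (not part of this commit), the JSON string in `functionArguments` could be bound to a Ballerina record and dispatched to a local function; `WeatherParams`, `getCurrentWeather`, and `invokeTool` below are hypothetical names.

```ballerina
// Hypothetical record mirroring the get_current_weather parameters schema above.
type WeatherParams record {|
    string location;
    string unit?;
|};

// Hypothetical local implementation of the tool.
function getCurrentWeather(WeatherParams params) returns string {
    string unit = params?.unit ?: "celsius";
    return string `Sunny and 22 degrees (${unit}) in ${params.location}`;
}

// Binds the model-supplied JSON argument string to the record and dispatches the call.
function invokeTool(string functionName, string functionArguments) returns string|error {
    if functionName == "get_current_weather" {
        WeatherParams params = check functionArguments.fromJsonStringWithType();
        return getCurrentWeather(params);
    }
    return error(string `Unsupported tool: ${functionName}`);
}
```

With a helper like this, `functionResponse` in the sample could be set to `check invokeTool(functionName, functionArguments)` before the `role: "tool"` message is pushed and the follow-up completion request is sent.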
+Following is a sample to use the OpenAI vision capabilities with the chat model
+
+```ballerina
+public function main() returns error? {
+    final chat:Client chatClient = check new (
+        config = {auth: {apiKey: apiKey}},
+        serviceUrl = serviceUrl
+    );
+
+    chat:CreateChatCompletionResponse response = check chatClient->/deployments/[deployementId]/chat/completions.post("2023-12-01-preview",
+        {
+            messages: [
+                {
+                    "role": "system",
+                    "content": "You are a helpful assistant."
+                },
+                {
+                    "role": "user",
+                    "content": [
+                        {
+                            "type": "text",
+                            "text": "Describe the image."
+                        },
+                        {
+                            "type": "image_url",
+                            "image_url": {
+                                "url": "<image url>"
+                            }
+                        }
+                    ]
+                }
+            ]
+        }
+    );
+
+    record {|chat:ChatCompletionResponseMessage message?; chat:ContentFilterChoiceResults content_filter_results?; int index?; string finish_reason?; anydata...;|}[] choices = check response.choices.ensureType();
+    io:println(choices[0].message?.content);
+}
+```
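The vision sample above passes a publicly reachable `<image url>`. The underlying chat completions API also generally accepts base64 data URLs for images, so a local file can be inlined; the helper below is a sketch under that assumption (the file path and `image/jpeg` MIME type are placeholders) and is not part of this commit.

```ballerina
import ballerina/io;

// Builds an image content part for the user message from a local file,
// encoding it as a base64 data URL instead of a public link.
function buildImagePart(string imagePath) returns map<json>|error {
    byte[] imageBytes = check io:fileReadBytes(imagePath);
    string dataUrl = "data:image/jpeg;base64," + imageBytes.toBase64();
    return {
        "type": "image_url",
        "image_url": {"url": dataUrl}
    };
}
```

The returned mapping can replace the `image_url` element inside the user message's `content` array.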
2 changes: 1 addition & 1 deletion openapi/azure.openai.chat/Package.md
@@ -5,7 +5,7 @@ Connects to [Azure OpenAI Chat Completions API](https://learn.microsoft.com/en-u
The `azure.openai.chat` is a [Ballerina](https://ballerina.io/) connector for connecting to the Azure OpenAI Service REST API Chat Completions and Chat Completions extensions Endpoints.

#### Compatibility
-Azure OpenAI Service REST API: v2023-08-01-preview
+Azure OpenAI Service REST API: v2023-12-01-preview

## Report issues
To report bugs, request new features, start new discussions, view project boards, etc., go to the [Ballerina Extended Library repository](https://github.com/ballerina-platform/ballerina-extended-library).
22 changes: 16 additions & 6 deletions openapi/azure.openai.chat/client.bal
@@ -1,9 +1,21 @@
// AUTO-GENERATED FILE. DO NOT MODIFY.
// This file is auto-generated by the Ballerina OpenAPI tool.

+// Copyright (c) 2024 WSO2 LLC. (http://www.wso2.org) All Rights Reserved.
+//
+// WSO2 Inc. licenses this file to you under the Apache License,
+// Version 2.0 (the "License"); you may not use this file except
+// in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied. See the License for the
+// specific language governing permissions and limitations
+// under the License.
import ballerina/http;

-# This is a generated connector from [Azure OpenAI Chat Completions API v2023-08-01-preview](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions/) OpenAPI specification.
+# This is a generated connector from [Azure OpenAI Chat Completions API v2023-12-01-preview](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions/) OpenAPI specification.
# The Azure OpenAI Service REST API Chat Completions Endpoint will create completions for chat messages with the ChatGPT (preview) and GPT-4 (preview) models.
@display {label: "Azure OpenAI Chat", iconPath: "icon.png"}
public isolated client class Client {
@@ -52,7 +64,6 @@ public isolated client class Client {
# Creates a completion for the chat message
#
# + return - OK
@display {label: "Create Chat Completion"}
resource isolated function post deployments/[string deploymentId]/chat/completions(string apiVersion, CreateChatCompletionRequest payload) returns CreateChatCompletionResponse|error {
string resourcePath = string `/deployments/${getEncodedUri(deploymentId)}/chat/completions`;
map<any> headerValues = {};
@@ -71,7 +82,6 @@ public isolated client class Client {
# Using extensions to create a completion for the chat messages.
#
# + return - OK
@display {label: "Create Extensions Chat Completion"}
resource isolated function post deployments/[string deploymentId]/extensions/chat/completions(string apiVersion, ExtensionsChatCompletionsRequest payload) returns ExtensionsChatCompletionsResponse|error {
string resourcePath = string `/deployments/${getEncodedUri(deploymentId)}/extensions/chat/completions`;
map<any> headerValues = {};