Merge branch 'main' into python_azuresearch_auth
franklinlindemberg authored Sep 13, 2024
2 parents 30cf190 + 77aa4e3 commit 303b6d0
Showing 657 changed files with 31,249 additions and 21,185 deletions.
2 changes: 2 additions & 0 deletions .github/_typos.toml
@@ -16,6 +16,8 @@ extend-exclude = [
"test_code_tokenizer.py",
"*response.json",
"test_content.txt",
"serializedChatHistoryV1_15_1.json",
"MultipleFunctionsVsParameters.cs"
]

[default.extend-words]
24 changes: 22 additions & 2 deletions .github/workflows/dotnet-build-and-test.yml
@@ -21,6 +21,7 @@ concurrency:

permissions:
contents: read
id-token: "write"

jobs:
paths-filter:
@@ -57,11 +58,13 @@ jobs:
os: "ubuntu-latest",
configuration: Release,
integration-tests: true,
environment: "integration",
}
- { dotnet: "8.0", os: "windows-latest", configuration: Debug }
- { dotnet: "8.0", os: "windows-latest", configuration: Release }

runs-on: ${{ matrix.os }}
environment: ${{ matrix.environment }}
steps:
- uses: actions/checkout@v4
- name: Setup dotnet ${{ matrix.dotnet }}
@@ -84,6 +87,14 @@ jobs:
dotnet test -c ${{ matrix.configuration }} $project --no-build -v Normal --logger trx --collect:"XPlat Code Coverage" --results-directory:"TestResults/Coverage/" -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.ExcludeByAttribute=GeneratedCodeAttribute,CompilerGeneratedAttribute,ExcludeFromCodeCoverageAttribute
done
- name: Azure CLI Login
if: github.event_name != 'pull_request' && matrix.integration-tests
uses: azure/login@v2
with:
client-id: ${{ secrets.AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.AZURE_TENANT_ID }}
subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}

- name: Run Integration Tests
shell: bash
if: github.event_name != 'pull_request' && matrix.integration-tests
@@ -96,6 +107,7 @@ jobs:
AzureOpenAI__Label: azure-text-davinci-003
AzureOpenAIEmbedding__Label: azure-text-embedding-ada-002
AzureOpenAI__DeploymentName: ${{ vars.AZUREOPENAI__DEPLOYMENTNAME }}
AzureOpenAI__ChatDeploymentName: ${{ vars.AZUREOPENAI__CHATDEPLOYMENTNAME }}
AzureOpenAIEmbeddings__DeploymentName: ${{ vars.AZUREOPENAIEMBEDDING__DEPLOYMENTNAME }}
AzureOpenAI__Endpoint: ${{ secrets.AZUREOPENAI__ENDPOINT }}
AzureOpenAIEmbeddings__Endpoint: ${{ secrets.AZUREOPENAI_EASTUS__ENDPOINT }}
@@ -110,23 +122,31 @@ jobs:
OpenAITextToAudio__ModelId: ${{ vars.OPENAITEXTTOAUDIO__MODELID }}
OpenAIAudioToText__ApiKey: ${{ secrets.OPENAIAUDIOTOTEXT__APIKEY }}
OpenAIAudioToText__ModelId: ${{ vars.OPENAIAUDIOTOTEXT__MODELID }}
OpenAITextToImage__ApiKey: ${{ secrets.OPENAITEXTTOIMAGE__APIKEY }}
OpenAITextToImage__ModelId: ${{ vars.OPENAITEXTTOIMAGE__MODELID }}
AzureOpenAITextToAudio__ApiKey: ${{ secrets.AZUREOPENAITEXTTOAUDIO__APIKEY }}
AzureOpenAITextToAudio__Endpoint: ${{ secrets.AZUREOPENAITEXTTOAUDIO__ENDPOINT }}
AzureOpenAITextToAudio__DeploymentName: ${{ vars.AZUREOPENAITEXTTOAUDIO__DEPLOYMENTNAME }}
AzureOpenAIAudioToText__ApiKey: ${{ secrets.AZUREOPENAIAUDIOTOTEXT__APIKEY }}
AzureOpenAIAudioToText__Endpoint: ${{ secrets.AZUREOPENAIAUDIOTOTEXT__ENDPOINT }}
AzureOpenAIAudioToText__DeploymentName: ${{ vars.AZUREOPENAIAUDIOTOTEXT__DEPLOYMENTNAME }}
AzureOpenAITextToImage__ApiKey: ${{ secrets.AZUREOPENAITEXTTOIMAGE__APIKEY }}
AzureOpenAITextToImage__Endpoint: ${{ secrets.AZUREOPENAITEXTTOIMAGE__ENDPOINT }}
AzureOpenAITextToImage__DeploymentName: ${{ vars.AZUREOPENAITEXTTOIMAGE__DEPLOYMENTNAME }}
Bing__ApiKey: ${{ secrets.BING__APIKEY }}
OpenAI__ApiKey: ${{ secrets.OPENAI__APIKEY }}
OpenAI__ChatModelId: ${{ vars.OPENAI__CHATMODELID }}
AzureAIInference__ApiKey: ${{ secrets.AZUREAIINFERENCE__APIKEY }}
AzureAIInference__Endpoint: ${{ secrets.AZUREAIINFERENCE__ENDPOINT }}

# Generate test reports and check coverage
- name: Generate test reports
uses: danielpalme/ReportGenerator-GitHub-Action@5.3.8
uses: danielpalme/ReportGenerator-GitHub-Action@5.3.9
with:
reports: "./TestResults/Coverage/**/coverage.cobertura.xml"
targetdir: "./TestResults/Reports"
reporttypes: "JsonSummary"
assemblyfilters: "+Microsoft.SemanticKernel.Abstractions;+Microsoft.SemanticKernel.Core;+Microsoft.SemanticKernel.PromptTemplates.Handlebars;+Microsoft.SemanticKernel.Connectors.OpenAI;+Microsoft.SemanticKernel.Yaml;+Microsoft.SemanticKernel.Agents.Abstractions;+Microsoft.SemanticKernel.Agents.Core;+Microsoft.SemanticKernel.Agents.OpenAI"
assemblyfilters: "+Microsoft.SemanticKernel.Abstractions;+Microsoft.SemanticKernel.Core;+Microsoft.SemanticKernel.PromptTemplates.Handlebars;+Microsoft.SemanticKernel.Connectors.OpenAI;+Microsoft.SemanticKernel.Connectors.AzureOpenAI;+Microsoft.SemanticKernel.Yaml;+Microsoft.SemanticKernel.Agents.Abstractions;+Microsoft.SemanticKernel.Agents.Core;+Microsoft.SemanticKernel.Agents.OpenAI"

- name: Check coverage
shell: pwsh
46 changes: 46 additions & 0 deletions docs/decisions/0051-dotnet-azure-model-as-a-service.md
@@ -0,0 +1,46 @@
---
# These are optional elements. Feel free to remove any of them.
status: proposed
contact: rogerbarreto
date: 2024-08-07
deciders: rogerbarreto, markwallace-microsoft
consulted: taochen
---

# Support Connector for .Net Azure Model-as-a-Service (Azure AI Studio)

## Context and Problem Statement

Customers have asked for native support of models deployed in [Azure AI Studio - Serverless APIs](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog-overview#model-deployment-managed-compute-and-serverless-api-pay-as-you-go). This consumption mode operates on a pay-as-you-go basis, typically billed by tokens. Clients can access the service via the [Azure AI Model Inference API](https://learn.microsoft.com/en-us/azure/ai-studio/reference/reference-model-inference-api?tabs=azure-studio) or client SDKs.

At present, there is no official support for [Azure AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/what-is-ai-studio). The purpose of this ADR is to examine the constraints of the service and explore potential solutions for supporting it through a new AI connector.

## Azure Inference Client library for .NET

The Azure team provides a new .NET client library, [Azure.AI.Inference](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.1/sdk/ai/Azure.AI.Inference/README.md), for interacting with the service. Although the service API is OpenAI-compatible, the OpenAI and Azure OpenAI client libraries cannot be used here, because they are tied to OpenAI models and their providers, whereas Azure AI Studio hosts a diverse range of open-source models beyond OpenAI's.
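
As a rough sketch, a direct call to the client for a serverless deployment could look like the snippet below. The member names follow the beta SDK's README but are assumptions rather than a verified sample, and the endpoint and key are placeholders.

```csharp
// Minimal sketch of calling Azure.AI.Inference (1.0.0-beta.1) directly.
// Type and member names are assumptions based on the SDK README; verify against the shipped package.
using System;
using System.Threading.Tasks;
using Azure;
using Azure.AI.Inference;

public static class InferenceSample
{
    public static async Task RunAsync()
    {
        // Placeholder endpoint and key for a serverless (pay-as-you-go) deployment.
        var client = new ChatCompletionsClient(
            new Uri("https://contoso-serverless.eastus.models.ai.azure.com"),
            new AzureKeyCredential("<api-key>"));

        var options = new ChatCompletionsOptions
        {
            Messages =
            {
                new ChatRequestSystemMessage("You are a helpful assistant."),
                new ChatRequestUserMessage("What is Semantic Kernel?")
            }
        };

        // The client sends the request to the serverless endpoint and returns the completion.
        Response<ChatCompletions> response = await client.CompleteAsync(options);
        Console.WriteLine(response.Value.Choices[0].Message.Content);
    }
}
```

The new connector would wrap this client rather than the OpenAI ones, preserving the model- and provider-agnostic behavior described above.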

### Limitations

The first version of the client SDK is currently expected to support only `Chat Completion`, `Text Embedding Generation`, and `Image Embedding Generation`, with `TextToImage Generation` planned.

There are no current plans to support the `Text Generation` modality.

## AI Connector

### Namespace options

- `Microsoft.SemanticKernel.Connectors.AzureAI`
- `Microsoft.SemanticKernel.Connectors.AzureAIInference`
- `Microsoft.SemanticKernel.Connectors.AzureAIModelInference`

Decision: `Microsoft.SemanticKernel.Connectors.AzureAIInference`

### Support for model-specific parameters

Models can expose supplementary parameters that are not part of the default API. The service API and the client SDK allow these model-specific parameters to be provided via a dedicated argument, alongside common settings such as `temperature` and `top_p`.

A specialized Azure AI Inference `PromptExecutionSettings` class will support these customizable parameters.
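
For illustration only, such a settings type might pair the common knobs with a pass-through bag for model-specific values; the class and property names below are assumptions, since the connector is still at the proposal stage.

```csharp
// Hypothetical sketch of a specialized execution-settings class; not the shipped API.
using System.Collections.Generic;
using System.Text.Json.Serialization;
using Microsoft.SemanticKernel;

public sealed class AzureAIInferencePromptExecutionSettings : PromptExecutionSettings
{
    [JsonPropertyName("temperature")]
    public float? Temperature { get; set; }

    [JsonPropertyName("top_p")]
    public float? TopP { get; set; }

    // Model-specific parameters forwarded to the service unchanged,
    // e.g. { "safe_prompt": true } for models that accept it.
    [JsonPropertyName("extra_parameters")]
    public IDictionary<string, object>? ExtraParameters { get; set; }
}
```

A caller would then pass an instance of this class as the `PromptExecutionSettings` argument, as with the other connectors.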

### Feature Branch

The development of the Azure AI Inference connector will be done in a feature branch named `feature-connectors-azureaiinference`.
12 changes: 6 additions & 6 deletions docs/decisions/0052-python-ai-connector-new-abstract-methods.md
@@ -30,24 +30,24 @@ Auto function invocation can cause a side effect where a single call to get_chat

### Two new abstract methods

> Revision: To avoid breaking existing customers who have implemented their own AI connectors, these two methods are not decorated with `@abstractmethod`; instead, their base implementations raise an exception, and the built-in AI connectors override them.
```python
@abstractmethod
async def _send_chat_request(
async def _inner_get_chat_message_content(
self,
chat_history: ChatHistory,
settings: PromptExecutionSettings
) -> list[ChatMessageContent]:
pass
raise NotImplementedError
```

```python
@abstractmethod
async def _send_streaming_chat_request(
async def _inner_get_streaming_chat_message_content(
self,
chat_history: ChatHistory,
settings: PromptExecutionSettings
) -> AsyncGenerator[list[StreamingChatMessageContent], Any]:
pass
raise NotImplementedError
```

### A new `ClassVar[bool]` variable in `ChatCompletionClientBase` to indicate whether a connector supports function calling
7 changes: 6 additions & 1 deletion dotnet/Directory.Build.props
@@ -11,6 +11,11 @@
<ImplicitUsings>disable</ImplicitUsings>
</PropertyGroup>

<PropertyGroup>
<!-- In "main" branch this flag should be always "false". -->
<IsReleaseCandidate>false</IsReleaseCandidate>
</PropertyGroup>

<PropertyGroup>
<!-- Disable NuGet packaging by default. Projects can override. -->
<IsPackable>disable</IsPackable>
@@ -30,4 +35,4 @@
<_Parameter1>false</_Parameter1>
</AssemblyAttribute>
</ItemGroup>
</Project>
</Project>
33 changes: 19 additions & 14 deletions dotnet/Directory.Packages.props
@@ -5,9 +5,11 @@
<ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
</PropertyGroup>
<ItemGroup>
<PackageVersion Include="Azure.AI.Inference" Version="1.0.0-beta.1" />
<PackageVersion Include="OpenAI" Version="2.0.0-beta.11" />
<PackageVersion Include="System.ClientModel" Version="1.1.0-beta.7" />
<PackageVersion Include="Azure.AI.ContentSafety" Version="1.0.0" />
<PackageVersion Include="Azure.AI.OpenAI" Version="1.0.0-beta.17" />
<PackageVersion Include="Azure.AI.OpenAI.Assistants" Version="1.0.0-beta.4" />
<PackageVersion Include="Azure.AI.OpenAI" Version="2.0.0-beta.5" />
<PackageVersion Include="Azure.Identity" Version="1.12.0" />
<PackageVersion Include="Azure.Monitor.OpenTelemetry.Exporter" Version="1.3.0" />
<PackageVersion Include="Azure.Search.Documents" Version="11.6.0" />
@@ -28,7 +30,7 @@
<PackageVersion Include="Microsoft.Bcl.TimeProvider" Version="8.0.1" />
<PackageVersion Include="Microsoft.Extensions.Logging.Debug" Version="8.0.0" />
<PackageVersion Include="Microsoft.Identity.Client" Version="4.64.0" />
<PackageVersion Include="Microsoft.ML.OnnxRuntime" Version="1.18.1" />
<PackageVersion Include="Microsoft.ML.OnnxRuntime" Version="1.19.2" />
<PackageVersion Include="FastBertTokenizer" Version="1.0.28" />
<PackageVersion Include="Pinecone.NET" Version="2.1.1" />
<PackageVersion Include="System.Diagnostics.DiagnosticSource" Version="8.0.1" />
@@ -38,6 +40,7 @@
<PackageVersion Include="System.Text.Json" Version="8.0.4" />
<PackageVersion Include="System.Threading.Tasks.Extensions" Version="4.5.4" />
<PackageVersion Include="System.ValueTuple" Version="4.5.0" />
<PackageVersion Include="OllamaSharp" Version="3.0.1" />
<!-- Tokenizers -->
<PackageVersion Include="Microsoft.ML.Tokenizers" Version="0.22.0-preview.24378.1" />
<PackageVersion Include="Microsoft.DeepDev.TokenizerLib" Version="1.3.3" />
@@ -57,14 +60,16 @@
<PackageVersion Include="Microsoft.Extensions.Logging.Abstractions" Version="8.0.1" />
<PackageVersion Include="Microsoft.Extensions.Logging.Console" Version="8.0.0" />
<PackageVersion Include="Microsoft.Extensions.Options.DataAnnotations" Version="8.0.0" />
<PackageVersion Include="Microsoft.Extensions.TimeProvider.Testing" Version="8.7.0" />
<PackageVersion Include="Microsoft.Extensions.TimeProvider.Testing" Version="8.9.1" />
<PackageVersion Include="Microsoft.Extensions.FileProviders.Physical" Version="[8.0.0, 9.0.0)" />
<PackageVersion Include="Microsoft.Extensions.FileProviders.Embedded" Version="[8.0.0, 9.0.0)" />
<!-- Test -->
<PackageVersion Include="Microsoft.NET.Test.Sdk" Version="17.11.0" />
<PackageVersion Include="Moq" Version="[4.18.4]" />
<PackageVersion Include="System.Threading.Channels" Version="8.0.0" />
<PackageVersion Include="System.Threading.Tasks.Dataflow" Version="8.0.0" />
<PackageVersion Include="Verify.Xunit" Version="23.5.2" />
<PackageVersion Include="xunit" Version="2.7.0" />
<PackageVersion Include="xunit" Version="2.9.0" />
<PackageVersion Include="xunit.abstractions" Version="2.0.3" />
<PackageVersion Include="xunit.runner.visualstudio" Version="2.5.7" />
<PackageVersion Include="xretry" Version="1.9.0" />
@@ -78,7 +83,7 @@
<PackageVersion Include="MongoDB.Driver" Version="2.27.0" />
<PackageVersion Include="Microsoft.Graph" Version="[4.51.0, 5)" />
<PackageVersion Include="Microsoft.Identity.Client.Extensions.Msal" Version="[2.28.0, )" />
<PackageVersion Include="Microsoft.OpenApi" Version="1.6.16" />
<PackageVersion Include="Microsoft.OpenApi" Version="1.6.21" />
<PackageVersion Include="Microsoft.OpenApi.Readers" Version="1.6.16" />
<PackageVersion Include="Microsoft.OpenApi.ApiManifest" Version="0.5.4-preview" />
<PackageVersion Include="Google.Apis.CustomSearchAPI.v1" Version="[1.60.0.3001, )" />
@@ -88,11 +93,11 @@
<PackageVersion Include="YamlDotNet" Version="15.3.0" />
<PackageVersion Include="Fluid.Core" Version="2.11.1" />
<!-- Memory stores -->
<PackageVersion Include="Microsoft.Azure.Cosmos" Version="3.41.0-preview.0" />
<PackageVersion Include="Microsoft.Azure.Cosmos" Version="3.44.0-preview.0" />
<PackageVersion Include="Pgvector" Version="0.2.0" />
<PackageVersion Include="NRedisStack" Version="0.12.0" />
<PackageVersion Include="Milvus.Client" Version="2.3.0-preview.1" />
<PackageVersion Include="Testcontainers.Milvus" Version="3.8.0" />
<PackageVersion Include="Testcontainers.Milvus" Version="3.10.0" />
<PackageVersion Include="Microsoft.Data.SqlClient" Version="5.2.1" />
<PackageVersion Include="Qdrant.Client" Version="1.9.0" />
<!-- Symbols -->
@@ -103,12 +108,12 @@
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageVersion Include="Microsoft.VisualStudio.Threading.Analyzers" Version="17.10.48" />
<PackageVersion Include="Microsoft.VisualStudio.Threading.Analyzers" Version="17.11.20" />
<PackageReference Include="Microsoft.VisualStudio.Threading.Analyzers">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<PackageVersion Include="xunit.analyzers" Version="1.15.0" />
<PackageVersion Include="xunit.analyzers" Version="1.16.0" />
<PackageReference Include="xunit.analyzers">
<PrivateAssets>all</PrivateAssets>
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
@@ -134,8 +139,8 @@
<IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
</PackageReference>
<!-- OnnxRuntimeGenAI -->
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.3.0"/>
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI.Cuda" Version="0.3.0"/>
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI.DirectML" Version="0.3.0"/>
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.4.0" />
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI.Cuda" Version="0.3.0" />
<PackageVersion Include="Microsoft.ML.OnnxRuntimeGenAI.DirectML" Version="0.4.0"/>
</ItemGroup>
</Project>
</Project>