From eb3eaeda006a20cce3991ab6000239031b9c8110 Mon Sep 17 00:00:00 2001
From: Sergei Grebnov
Date: Mon, 16 Dec 2024 12:31:28 -0800
Subject: [PATCH] Document Azure OpenAI Models support (#683)

---
 .../docs/components/embeddings/azure.md       | 34 ++++++++++++++++++
 .../docs/components/embeddings/huggingface.md |  2 +-
 .../docs/components/models/anthropic.md       |  4 +--
 spiceaidocs/docs/components/models/azure.md   | 35 +++++++++++++++++++
 .../docs/components/models/filesystem.md      |  2 +-
 .../docs/components/models/huggingface.md     |  2 +-
 spiceaidocs/docs/components/models/index.md   |  1 +
 spiceaidocs/docs/components/models/openai.md  | 10 ++++--
 spiceaidocs/docs/components/models/spiceai.md |  2 +-
 9 files changed, 83 insertions(+), 9 deletions(-)
 create mode 100644 spiceaidocs/docs/components/embeddings/azure.md
 create mode 100644 spiceaidocs/docs/components/models/azure.md

diff --git a/spiceaidocs/docs/components/embeddings/azure.md b/spiceaidocs/docs/components/embeddings/azure.md
new file mode 100644
index 00000000..4faad95a
--- /dev/null
+++ b/spiceaidocs/docs/components/embeddings/azure.md
@@ -0,0 +1,34 @@
+---
+title: 'Azure OpenAI Embedding Models'
+sidebar_label: 'Azure OpenAI'
+sidebar_position: 2
+---
+
+To use an embedding model hosted on Azure OpenAI, specify the `azure` path in the `from` field and the following parameters from the [Azure OpenAI Model Deployment](https://ai.azure.com/resource/deployments) page:
+
+| Param                   | Description                                                                          | Default    |
+| ----------------------- | ------------------------------------------------------------------------------------ | ---------- |
+| `azure_api_key`         | The Azure OpenAI API key from the model deployment page.                             | -          |
+| `azure_api_version`     | The API version used for the Azure OpenAI service.                                   | -          |
+| `azure_deployment_name` | The name of the model deployment.                                                    | Model name |
+| `endpoint`              | The Azure OpenAI resource endpoint, e.g., `https://resource-name.openai.azure.com`.  | -          |
+| `azure_entra_token`     | The Azure Entra token for authentication.                                            | -          |
+
+Only one of `azure_api_key` or `azure_entra_token` can be provided for model configuration.
+
+Example:
+
+```yaml
+embeddings:
+  - name: embeddings-model
+    from: azure:text-embedding-3-small
+    params:
+      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
+      azure_deployment_name: text-embedding-3-small
+      azure_api_version: 2023-05-15
+      azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
+```
+
+Refer to the [Azure OpenAI Service models](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) documentation for more details on available models and configurations.
+
+Follow the [Quickstart: Using Azure OpenAI models](https://github.com/spiceai/quickstarts/tree/trunk/azure_openai) to try Azure OpenAI models for vector-based search and chat over structured (taxi trips) and unstructured (GitHub files) data.
diff --git a/spiceaidocs/docs/components/embeddings/huggingface.md b/spiceaidocs/docs/components/embeddings/huggingface.md
index 0955fdb1..1072b16f 100644
--- a/spiceaidocs/docs/components/embeddings/huggingface.md
+++ b/spiceaidocs/docs/components/embeddings/huggingface.md
@@ -1,7 +1,7 @@
 ---
 title: 'HuggingFace Text Embedding Models'
 sidebar_label: 'HuggingFace'
-sidebar_position: 2
+sidebar_position: 3
 ---
 
 To use an embedding model from HuggingFace with Spice, specify the `huggingface` path in the `from` field of your configuration. The model and its related files will be automatically downloaded, loaded, and served locally by Spice.
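The new `embeddings/azure.md` page above lists `azure_entra_token` as an alternative credential to `azure_api_key`, with only one of the two allowed. A minimal sketch of that Entra ID variant of the page's example; the `SPICE_AZURE_ENTRA_TOKEN` secret name is a hypothetical placeholder, not taken from the patch:

```yaml
embeddings:
  - name: embeddings-model
    from: azure:text-embedding-3-small
    params:
      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
      azure_deployment_name: text-embedding-3-small
      azure_api_version: 2023-05-15
      # Entra ID authentication; mutually exclusive with azure_api_key per the table above.
      # The secret name below is illustrative.
      azure_entra_token: ${ secrets:SPICE_AZURE_ENTRA_TOKEN }
```

Everything other than the credential parameter is unchanged from the API-key example.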
diff --git a/spiceaidocs/docs/components/models/anthropic.md b/spiceaidocs/docs/components/models/anthropic.md
index dbed221a..35a0be97 100644
--- a/spiceaidocs/docs/components/models/anthropic.md
+++ b/spiceaidocs/docs/components/models/anthropic.md
@@ -2,12 +2,12 @@
 title: 'Anthropic Models'
 description: 'Instructions for using language models hosted on Anthropic with Spice.'
 sidebar_label: 'Anthropic'
-sidebar_position: 5
+sidebar_position: 3
 ---
 
 To use a language model hosted on Anthropic, specify `anthropic` in the `from` field.
 
-To use a specific model, include its model ID in the `from` field (see example below). If not specified, the default model is `"claude-3-5-sonnet-latest"`.
+To use a specific model, include its model ID in the `from` field (see example below). If not specified, the default model is `claude-3-5-sonnet-latest`.
 
 The following parameters are specific to Anthropic models:
 
diff --git a/spiceaidocs/docs/components/models/azure.md b/spiceaidocs/docs/components/models/azure.md
new file mode 100644
index 00000000..0ec82862
--- /dev/null
+++ b/spiceaidocs/docs/components/models/azure.md
@@ -0,0 +1,35 @@
+---
+title: 'Azure OpenAI Models'
+description: 'Instructions for using language models hosted on Azure OpenAI with Spice.'
+sidebar_label: 'Azure OpenAI'
+sidebar_position: 2
+---
+
+To use a language model hosted on Azure OpenAI, specify the `azure` path in the `from` field and the following parameters from the [Azure OpenAI Model Deployment](https://ai.azure.com/resource/deployments) page:
+
+| Param                   | Description                                                                          | Default    |
+| ----------------------- | ------------------------------------------------------------------------------------ | ---------- |
+| `azure_api_key`         | The Azure OpenAI API key from the model deployment page.                             | -          |
+| `azure_api_version`     | The API version used for the Azure OpenAI service.                                   | -          |
+| `azure_deployment_name` | The name of the model deployment.                                                    | Model name |
+| `endpoint`              | The Azure OpenAI resource endpoint, e.g., `https://resource-name.openai.azure.com`.  | -          |
+| `azure_entra_token`     | The Azure Entra token for authentication.                                            | -          |
+
+Only one of `azure_api_key` or `azure_entra_token` can be provided for model configuration.
+
+Example:
+
+```yaml
+models:
+  - from: azure:gpt-4o-mini
+    name: gpt-4o-mini
+    params:
+      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
+      azure_api_version: 2024-08-01-preview
+      azure_deployment_name: gpt-4o-mini
+      azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
+```
+
+Refer to the [Azure OpenAI Service models](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) documentation for more details on available models and configurations.
+
+Follow the [Quickstart: Using Azure OpenAI models](https://github.com/spiceai/quickstarts/tree/trunk/azure_openai) to try Azure OpenAI models for vector-based search and chat over structured (taxi trips) and unstructured (GitHub files) data.
diff --git a/spiceaidocs/docs/components/models/filesystem.md b/spiceaidocs/docs/components/models/filesystem.md
index 11d155aa..e057197f 100644
--- a/spiceaidocs/docs/components/models/filesystem.md
+++ b/spiceaidocs/docs/components/models/filesystem.md
@@ -2,7 +2,7 @@
 title: 'Filesystem'
 description: 'Instructions for using models hosted on a filesystem with Spice.'
 sidebar_label: 'Filesystem'
-sidebar_position: 3
+sidebar_position: 5
 ---
 
 To use a model hosted on a filesystem, specify the path to the model file in the `from` field.
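The parameter table in the new `models/azure.md` page above gives `azure_deployment_name` a default of the model name. A minimal sketch that relies on that default, assuming the Azure deployment is actually named `gpt-4o-mini` to match the model:

```yaml
models:
  - from: azure:gpt-4o-mini
    name: gpt-4o-mini
    params:
      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
      azure_api_version: 2024-08-01-preview
      # azure_deployment_name is omitted and falls back to the model name (gpt-4o-mini);
      # set it explicitly, as in the page's example, when the deployment name differs.
      azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
```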
diff --git a/spiceaidocs/docs/components/models/huggingface.md b/spiceaidocs/docs/components/models/huggingface.md
index bfb6e446..21e7b527 100644
--- a/spiceaidocs/docs/components/models/huggingface.md
+++ b/spiceaidocs/docs/components/models/huggingface.md
@@ -2,7 +2,7 @@
 title: 'HuggingFace'
 description: 'Instructions for using machine learning models hosted on HuggingFace with Spice.'
 sidebar_label: 'HuggingFace'
-sidebar_position: 1
+sidebar_position: 4
 ---
 
 To use a model hosted on HuggingFace, specify the `huggingface.co` path in the `from` field along with the files to include.
diff --git a/spiceaidocs/docs/components/models/index.md b/spiceaidocs/docs/components/models/index.md
index 6fab14c2..aed16db4 100644
--- a/spiceaidocs/docs/components/models/index.md
+++ b/spiceaidocs/docs/components/models/index.md
@@ -13,6 +13,7 @@ Spice supports various model providers for traditional machine learning (ML) mod
 | `huggingface` | Models hosted on [HuggingFace](https://huggingface.co) | ONNX | GGUF, GGML, SafeTensor |
 | `spice.ai` | Models hosted on the [Spice Cloud Platform](https://docs.spice.ai/building-blocks/spice-models) | ONNX | - |
 | `openai` | OpenAI (or compatible) LLM endpoint | - | Remote HTTP endpoint |
+| `azure` | Models hosted on Azure OpenAI | - | Remote HTTP endpoint |
 | `anthropic` | Models hosted on [Anthropic](https://www.anthropic.com) | - | Remote HTTP endpoint |
 | `grok` | Coming soon | - | Remote HTTP endpoint |
 
diff --git a/spiceaidocs/docs/components/models/openai.md b/spiceaidocs/docs/components/models/openai.md
index e71769dc..da2bd3da 100644
--- a/spiceaidocs/docs/components/models/openai.md
+++ b/spiceaidocs/docs/components/models/openai.md
@@ -2,12 +2,12 @@
 title: 'OpenAI (or Compatible) Language Models'
 description: 'Instructions for using language models hosted on OpenAI or compatible services with Spice.'
 sidebar_label: 'OpenAI'
-sidebar_position: 4
+sidebar_position: 1
 ---
 
 To use a language model hosted on OpenAI (or compatible), specify the `openai` path in the `from` field.
 
-For a specific model, include it as the model ID in the `from` field (see example below). The default model is `"gpt-3.5-turbo"`.
+For a specific model, include it as the model ID in the `from` field (see example below). The default model is `gpt-3.5-turbo`.
 
 These parameters are specific to OpenAI models:
 
@@ -23,7 +23,7 @@ Example:
 ```yaml
 models:
   - from: openai:gpt-4o
-    name: local_fs_model
+    name: openai_model
     params:
       openai_api_key: ${ secrets:SPICE_OPENAI_API_KEY }
 ```
@@ -32,6 +32,10 @@ models:
 
 Spice supports several OpenAI compatible providers. Specify the appropriate endpoint in the params section.
 
+### Azure OpenAI
+
+Follow the [Azure OpenAI Models](./azure) instructions.
+
 ### Groq
 
 Groq provides OpenAI compatible endpoints. Use the following configuration:
diff --git a/spiceaidocs/docs/components/models/spiceai.md b/spiceaidocs/docs/components/models/spiceai.md
index f886e3f3..b5b98712 100644
--- a/spiceaidocs/docs/components/models/spiceai.md
+++ b/spiceaidocs/docs/components/models/spiceai.md
@@ -2,7 +2,7 @@
 title: 'Spice Cloud Platform'
 description: 'Instructions for using models hosted on the Spice Cloud Platform with Spice.'
 sidebar_label: 'Spice Cloud Platform'
-sidebar_position: 2
+sidebar_position: 5
 ---
 
 To use a model hosted on the [Spice Cloud Platform](https://docs.spice.ai/building-blocks/spice-models), specify the `spice.ai` path in the `from` field.
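The `spiceai.md` page above points to the `spice.ai` path in the `from` field, but its own example falls outside this hunk. A hedged sketch of that pattern; the `spice.ai/<org>/<app>/models/<model>` path shape and every name in it are illustrative assumptions, not taken from this patch:

```yaml
models:
  - from: spice.ai/my_org/my_app/models/my_model  # hypothetical org/app/model path
    name: my_model
```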