Document Azure OpenAI Models support (#683)
sgrebnov authored Dec 16, 2024
1 parent 588b281 commit eb3eaed
Showing 9 changed files with 83 additions and 9 deletions.
34 changes: 34 additions & 0 deletions spiceaidocs/docs/components/embeddings/azure.md
@@ -0,0 +1,34 @@
---
title: 'Azure OpenAI Embedding Models'
sidebar_label: 'Azure OpenAI'
sidebar_position: 2
---

To use an embedding model hosted on Azure OpenAI, specify the `azure` path in the `from` field and provide the following parameters from the [Azure OpenAI Model Deployment](https://ai.azure.com/resource/deployments) page:

| Param | Description | Default |
| ----------------------- | --------------------------------------------------------------------------- | --------------------------- |
| `azure_api_key` | The Azure OpenAI API key from the model deployment page. | - |
| `azure_api_version` | The API version used for the Azure OpenAI service. | - |
| `azure_deployment_name` | The name of the model deployment. | Model name |
| `endpoint` | The Azure OpenAI resource endpoint, e.g., `https://resource-name.openai.azure.com`. | - |
| `azure_entra_token` | The Azure Entra token for authentication. | - |

Only one of `azure_api_key` or `azure_entra_token` can be provided for model configuration.

Example:

```yaml
embeddings:
- name: embeddings-model
from: azure:text-embedding-3-small
params:
endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
azure_deployment_name: text-embedding-3-small
azure_api_version: 2023-05-15
azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
```
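
Since only one of `azure_api_key` or `azure_entra_token` may be set, the same model can instead be configured with Microsoft Entra ID authentication. A sketch under that assumption (the `SPICE_AZURE_TOKEN` secret name is illustrative):

```yaml
embeddings:
  - name: embeddings-model
    from: azure:text-embedding-3-small
    params:
      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
      azure_deployment_name: text-embedding-3-small
      azure_api_version: 2023-05-15
      # Entra ID token in place of azure_api_key; the secret name below is illustrative
      azure_entra_token: ${ secrets:SPICE_AZURE_TOKEN }
```
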
Refer to the [Azure OpenAI Service models](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) documentation for more details on available models and configurations.
Follow the [Quickstart: Using Azure OpenAI models](https://github.com/spiceai/quickstarts/tree/trunk/azure_openai) to try Azure OpenAI models for vector-based search and chat over structured (taxi trips) and unstructured (GitHub files) data.
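
For vector-based search, such an embedding model is typically referenced by name from a dataset column. A rough sketch of that wiring, with a hypothetical dataset source and column name:

```yaml
datasets:
  - from: github:github.com/spiceai/quickstarts/files/trunk # hypothetical data source
    name: files
    columns:
      - name: content # hypothetical column whose values are embedded
        embeddings:
          - from: embeddings-model # the Azure OpenAI embedding model defined above
```
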
2 changes: 1 addition & 1 deletion spiceaidocs/docs/components/embeddings/huggingface.md
@@ -1,7 +1,7 @@
---
title: 'HuggingFace Text Embedding Models'
sidebar_label: 'HuggingFace'
sidebar_position: 2
sidebar_position: 3
---

To use an embedding model from HuggingFace with Spice, specify the `huggingface` path in the `from` field of your configuration. The model and its related files will be automatically downloaded, loaded, and served locally by Spice.
4 changes: 2 additions & 2 deletions spiceaidocs/docs/components/models/anthropic.md
@@ -2,12 +2,12 @@
title: 'Anthropic Models'
description: 'Instructions for using language models hosted on Anthropic with Spice.'
sidebar_label: 'Anthropic'
sidebar_position: 5
sidebar_position: 3
---

To use a language model hosted on Anthropic, specify `anthropic` in the `from` field.

To use a specific model, include its model ID in the `from` field (see example below). If not specified, the default model is `"claude-3-5-sonnet-latest"`.
To use a specific model, include its model ID in the `from` field (see example below). If not specified, the default model is `claude-3-5-sonnet-latest`.

The following parameters are specific to Anthropic models:

35 changes: 35 additions & 0 deletions spiceaidocs/docs/components/models/azure.md
@@ -0,0 +1,35 @@
---
title: 'Azure OpenAI Models'
description: 'Instructions for using Azure OpenAI models'
sidebar_label: 'Azure OpenAI'
sidebar_position: 2
---

To use a language model hosted on Azure OpenAI, specify the `azure` path in the `from` field and provide the following parameters from the [Azure OpenAI Model Deployment](https://ai.azure.com/resource/deployments) page:

| Param | Description | Default |
| ----------------------- | --------------------------------------------------------------------------- | --------------------------- |
| `azure_api_key` | The Azure OpenAI API key from the model deployment page. | - |
| `azure_api_version` | The API version used for the Azure OpenAI service. | - |
| `azure_deployment_name` | The name of the model deployment. | Model name |
| `endpoint` | The Azure OpenAI resource endpoint, e.g., `https://resource-name.openai.azure.com`. | - |
| `azure_entra_token` | The Azure Entra token for authentication. | - |

Only one of `azure_api_key` or `azure_entra_token` can be provided for model configuration.

Example:

```yaml
models:
- from: azure:gpt-4o-mini
name: gpt-4o-mini
params:
endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
azure_api_version: 2024-08-01-preview
azure_deployment_name: gpt-4o-mini
azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
```
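
Because `azure_deployment_name` defaults to the model name, a deployment that is named exactly `gpt-4o-mini` could omit that parameter. A minimal sketch under that assumption:

```yaml
models:
  - from: azure:gpt-4o-mini
    name: gpt-4o-mini
    params:
      endpoint: ${ secrets:SPICE_AZURE_AI_ENDPOINT }
      azure_api_version: 2024-08-01-preview
      # azure_deployment_name is omitted; it falls back to the model name (gpt-4o-mini)
      azure_api_key: ${ secrets:SPICE_AZURE_API_KEY }
```
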
Refer to the [Azure OpenAI Service models](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) documentation for more details on available models and configurations.
Follow the [Quickstart: Using Azure OpenAI models](https://github.com/spiceai/quickstarts/tree/trunk/azure_openai) to try Azure OpenAI models for vector-based search and chat over structured (taxi trips) and unstructured (GitHub files) data.
2 changes: 1 addition & 1 deletion spiceaidocs/docs/components/models/filesystem.md
@@ -2,7 +2,7 @@
title: 'Filesystem'
description: 'Instructions for using models hosted on a filesystem with Spice.'
sidebar_label: 'Filesystem'
sidebar_position: 3
sidebar_position: 5
---

To use a model hosted on a filesystem, specify the path to the model file in the `from` field.
2 changes: 1 addition & 1 deletion spiceaidocs/docs/components/models/huggingface.md
@@ -2,7 +2,7 @@
title: 'HuggingFace'
description: 'Instructions for using machine learning models hosted on HuggingFace with Spice.'
sidebar_label: 'HuggingFace'
sidebar_position: 1
sidebar_position: 4
---

To use a model hosted on HuggingFace, specify the `huggingface.co` path in the `from` field along with the files to include.
1 change: 1 addition & 0 deletions spiceaidocs/docs/components/models/index.md
@@ -13,6 +13,7 @@ Spice supports various model providers for traditional machine learning (ML) mod
| `huggingface` | Models hosted on [HuggingFace](https://huggingface.co) | ONNX | GGUF, GGML, SafeTensor |
| `spice.ai` | Models hosted on the [Spice Cloud Platform](https://docs.spice.ai/building-blocks/spice-models) | ONNX | - |
| `openai` | OpenAI (or compatible) LLM endpoint | - | Remote HTTP endpoint |
| `azure` | Azure OpenAI | - | Remote HTTP endpoint |
| `anthropic` | Models hosted on [Anthropic](https://www.anthropic.com) | - | Remote HTTP endpoint |
| `grok` | Coming soon | - | Remote HTTP endpoint |

10 changes: 7 additions & 3 deletions spiceaidocs/docs/components/models/openai.md
@@ -2,12 +2,12 @@
title: 'OpenAI (or Compatible) Language Models'
description: 'Instructions for using language models hosted on OpenAI or compatible services with Spice.'
sidebar_label: 'OpenAI'
sidebar_position: 4
sidebar_position: 1
---

To use a language model hosted on OpenAI (or compatible), specify the `openai` path in the `from` field.

For a specific model, include it as the model ID in the `from` field (see example below). The default model is `"gpt-3.5-turbo"`.
For a specific model, include it as the model ID in the `from` field (see example below). The default model is `gpt-3.5-turbo`.

These parameters are specific to OpenAI models:

@@ -23,7 +23,7 @@ Example:
```yaml
models:
- from: openai:gpt-4o
name: local_fs_model
name: openai_model
params:
openai_api_key: ${ secrets:SPICE_OPENAI_API_KEY }
```
@@ -32,6 +32,10 @@ models:
Spice supports several OpenAI-compatible providers. Specify the appropriate endpoint in the `params` section.
### Azure OpenAI
Follow the [Azure OpenAI Models](./azure) instructions.
### Groq
Groq provides OpenAI-compatible endpoints. Use the following configuration:
2 changes: 1 addition & 1 deletion spiceaidocs/docs/components/models/spiceai.md
@@ -2,7 +2,7 @@
title: 'Spice Cloud Platform'
description: 'Instructions for using models hosted on the Spice Cloud Platform with Spice.'
sidebar_label: 'Spice Cloud Platform'
sidebar_position: 2
sidebar_position: 5
---

To use a model hosted on the [Spice Cloud Platform](https://docs.spice.ai/building-blocks/spice-models), specify the `spice.ai` path in the `from` field.
