From ad8adcc8e87d00fa87a92459ad6b3dbc834ce3b3 Mon Sep 17 00:00:00 2001
From: Donnie Adams
Date: Tue, 26 Nov 2024 13:46:36 -0500
Subject: [PATCH] chore: add configuration info for Voyage model provider

Signed-off-by: Donnie Adams
---
 docs/docs/05-configuration/02-model-providers.md | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/docs/docs/05-configuration/02-model-providers.md b/docs/docs/05-configuration/02-model-providers.md
index 31d32dc4..ebf6d1c9 100644
--- a/docs/docs/05-configuration/02-model-providers.md
+++ b/docs/docs/05-configuration/02-model-providers.md
@@ -70,6 +70,10 @@ When configuring models with the Azure OpenAI provider in Otto8, the "Target Mod
 
 The Anthropic model provider requires setting the `OTTO8_ANTHROPIC_MODEL_PROVIDER_API_KEY` environment variable. You can get an API key for your Anthropic account [here](https://console.anthropic.com/settings/keys).
 
+## Voyage AI
+
+Voyage is Anthropic's recommended text-embedding provider. The Voyage model provider requires setting the `OTTO8_VOYAGE_MODEL_PROVIDER_API_KEY` environment variable. You can get an API key for your Voyage account [here](https://dash.voyageai.com/api-keys).
+
 ## Ollama
 
 The Ollama model provider requires the `OTTO8_OLLAMA_MODEL_PROVIDER_HOST` environment variable. This host must point to a running instance of Ollama. For your reference, the default host and port for Ollama is `127.0.0.1:11434`. Otto8 doesn't set this by default.
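
For anyone trying the documented settings locally: these are ordinary environment variables. A minimal shell sketch, assuming Otto8 reads provider settings from its process environment, might look like the following; the placeholder values are illustrative and not part of this patch.

```bash
# Illustrative only -- not part of this patch. Replace the placeholders with
# real keys from the dashboards linked in the docs.
export OTTO8_VOYAGE_MODEL_PROVIDER_API_KEY="<your-voyage-api-key>"         # https://dash.voyageai.com/api-keys
export OTTO8_ANTHROPIC_MODEL_PROVIDER_API_KEY="<your-anthropic-api-key>"   # https://console.anthropic.com/settings/keys
export OTTO8_OLLAMA_MODEL_PROVIDER_HOST="127.0.0.1:11434"                  # Ollama's default host:port; Otto8 does not set this for you
```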