From fe1e09706f69a4934430d2f2dadc354ffc4aa4dc Mon Sep 17 00:00:00 2001
From: Simon Willison
Date: Mon, 4 Nov 2024 10:26:02 -0800
Subject: [PATCH] llm-lambda-labs !stable-docs

---
 docs/plugins/directory.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/plugins/directory.md b/docs/plugins/directory.md
index 22438d6a..c94795e4 100644
--- a/docs/plugins/directory.md
+++ b/docs/plugins/directory.md
@@ -36,6 +36,7 @@ These plugins can be used to interact with remotely hosted models via their API:
 - **[llm-bedrock-anthropic](https://github.com/sblakey/llm-bedrock-anthropic)** by Sean Blakey adds support for Claude and Claude Instant by Anthropic via Amazon Bedrock.
 - **[llm-bedrock-meta](https://github.com/flabat/llm-bedrock-meta)** by Fabian Labat adds support for Llama 2 and Llama 3 by Meta via Amazon Bedrock.
 - **[llm-together](https://github.com/wearedevx/llm-together)** adds support for the [Together AI](https://www.together.ai/) extensive family of hosted openly licensed models.
+- **[llm-lambda-labs](https://github.com/simonw/llm-lambda-labs)** provides access to models hosted by [Lambda Labs](https://docs.lambdalabs.com/public-cloud/lambda-chat-api/), including the Nous Hermes 3 series.
 If an API model host provides an OpenAI-compatible API you can also [configure LLM to talk to it](https://llm.datasette.io/en/stable/other-models.html#openai-compatible-models) without needing an extra plugin.
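
Not part of the patch above: a minimal sketch of how the newly listed plugin might be used from LLM's Python API once installed with `llm install llm-lambda-labs`. The model ID shown here is an assumption for illustration; check the plugin's README for the identifiers it actually registers.

```python
# Illustrative sketch only -- not part of the patch.
# Assumes the llm-lambda-labs plugin is installed (llm install llm-lambda-labs)
# and registers a model under this ID; the real ID may differ.
import llm

model = llm.get_model("hermes3-405b")      # hypothetical Lambda Labs model ID
model.key = "YOUR_LAMBDA_LABS_API_KEY"     # or configure the key via `llm keys set`
response = model.prompt("Say hello from Lambda Labs")
print(response.text())
```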