
Extend Theia AI docu that specific models might require customization #665

Merged 5 commits on Nov 29, 2024
11 changes: 8 additions & 3 deletions src/docs/theia_ai.md
@@ -370,13 +370,18 @@ This approach enables users to interact with the AI-powered Chat UI more efficie

## Custom LLM Provider

Theia AI currently provides out-of-the-box support for any OpenAI-compatible LLM services (including the OpenAI models hosted by OpenAI or self-hosted compatible LLMs) and Ollama LLMs. We are also working on integrating support for LLamaFile. As a tool provider, you can easily add support for other LLM APIs too. Please consider contributing any further LLM providers to the open source project, so that the community can benefit from increased compatibility and share the maintenance cost.
Theia AI currently provides out-of-the-box support for any OpenAI-compatible LLM service (including the OpenAI models hosted by OpenAI or self-hosted compatible LLMs), Hugging Face, Ollama, and Llamafile.

To support a specific LLM, you basically need to provide an implementation of the interface ‘LanguageModel’, which provides an abstraction between agents and the underlying LLM. This Language Model is then registered at the ‘LanguageModelRegistry’, to make it retrievable for agents, e.g. like this:
*Note: Theia AI enables connections to various models. However, be aware that some models may require specific customizations and optimizations to perform well in all scenarios. If you encounter issues implementing those customizations, please [provide feedback](https://github.com/eclipse-theia/theia/issues/new/choose).*

Learn more about which LLM providers are supported out of the box using the example of the [Theia IDE](user_ai), which allows end users to configure the LLM used per agent. As a tool provider, you can of course choose whether to offer this configurability to end users.

As a tool provider, you can also easily add support for additional LLM APIs. Please consider contributing further LLM providers to the open source project, so that the community can benefit from increased compatibility and share the maintenance cost.

To support a specific LLM API, you essentially need to provide an implementation of the interface ‘LanguageModel’, which provides an abstraction between agents and the underlying LLM. This language model implementation then needs to be registered with the ‘LanguageModelRegistry’ to make it retrievable for agents, as follows:

```typescript
this.languageModelRegistry.addLanguageModels([new OllamaModel());
this.languageModelRegistry.addLanguageModels([new OllamaModel()]);
```
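
For illustration, a minimal custom provider could look roughly like the sketch below. This is not the exact interface: check the ‘LanguageModel’ definition in the Theia AI packages for the real members; the names ‘id’ and ‘request’ used here are assumptions.

```typescript
// Hedged sketch of a custom provider. The real LanguageModel interface in the
// Theia AI packages defines the exact members; `id` and `request` here are
// assumptions for illustration only.
import { injectable } from '@theia/core/shared/inversify';

@injectable()
export class MyCustomLanguageModel {
    // Identifier under which agents can retrieve this model from the registry.
    readonly id = 'my-provider/my-model';

    // Translate the request coming from an agent into a call against your LLM
    // API and map the answer back into the response format expected by Theia AI.
    async request(request: unknown): Promise<unknown> {
        // Call your LLM backend here (HTTP client, vendor SDK, local runtime, ...).
        throw new Error('Not implemented in this sketch');
    }
}
```

Such an implementation is then registered just like the Ollama example above, e.g. ‘this.languageModelRegistry.addLanguageModels([new MyCustomLanguageModel()]);’.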

If you want to allow the user to configure the LLM provider, e.g. to change the URL or the available models, consider integrating your LLM provider with Theia's preference system.
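
A minimal sketch of such a preference contribution is shown below, assuming Theia's standard ‘PreferenceSchema’/‘PreferenceContribution’ mechanism; the preference id ‘ai-features.myLlm.url’ and its default value are made up for illustration.

```typescript
import { interfaces } from '@theia/core/shared/inversify';
import { PreferenceContribution, PreferenceSchema } from '@theia/core/lib/browser';

// Hypothetical preference that lets users change the endpoint URL of a custom
// LLM provider; the preference id and default are assumptions for illustration.
export const myLlmPreferenceSchema: PreferenceSchema = {
    type: 'object',
    properties: {
        'ai-features.myLlm.url': {
            type: 'string',
            default: 'http://localhost:11434',
            description: 'Endpoint URL of the custom LLM provider.'
        }
    }
};

// Bind the schema in your frontend module so the preference shows up in the
// settings UI and can be read via the PreferenceService.
export function bindMyLlmPreferences(bind: interfaces.Bind): void {
    bind(PreferenceContribution).toConstantValue({ schema: myLlmPreferenceSchema });
}
```

Your provider implementation can then read the value through the ‘PreferenceService’ and reconfigure itself when the preference changes.
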
4 changes: 3 additions & 1 deletion src/docs/user_ai.md
@@ -38,10 +38,12 @@ To activate AI support in the Theia IDE, go to Preferences and enable the settin

To use Theia AI within the Theia IDE, you need to provide access to at least one LLM. Theia IDE comes with preinstalled support for OpenAI API-compatible models, either hosted by OpenAI or self-hosted via VLLM. Additionally, Theia IDE supports connecting to models via Ollama. See the corresponding sections below on how to configure these providers.

Other LLM providers, including local models, can be added easily. We are also working on enabling LLamaFile support in the near future. If you would like to see support for a specific LLM, please provide feedback or consider contributing.
Other LLM providers, including local models, can be added easily. If you would like to see support for a specific LLM, please provide feedback or consider contributing.

Each LLM provider offers a configurable list of available models (see the screenshot below for Hugging Face models). To use a model in your IDE, configure it on a per-agent basis in the AI Configuration view.

*Note: The Theia IDE enables connections to various models (e.g., Hugging Face, custom OpenAI-compatible models, Llamafile). However, not all models may work out of the box, as they may require specific customizations or optimizations. If you encounter issues, please [provide feedback](https://github.com/eclipse-theia/theia/issues/new/choose), keeping in mind this is an early-phase feature.*

### OpenAI (Hosted by OpenAI)

To enable the use of OpenAI, you need to create an API key in your OpenAI account and enter it in the settings under AI-features => OpenAiOfficial (see the screenshot below).