
Using other models, is it possible? #447

Open
CrashAngelArts opened this issue Jan 3, 2024 · 3 comments

Comments

@CrashAngelArts

Hi, is it possible to configure mentat to use Mixtral models?

@PCSwingle
Member

PCSwingle commented Jan 4, 2024

Right now, we use OpenAI's SDK, so only models that expose an OpenAI-compatible API can be used (you can set the OPENAI_API_BASE environment variable to point to wherever your other model is running; check out our README for more information). However, once PR #450, which adds litellm, is merged, it should allow you to use any model!
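A minimal sketch of the environment-variable route described above, assuming an OpenAI-compatible server is already running locally (the URL and key value are placeholders, not mentat defaults; see the README for the exact setup):

```shell
# Point the OpenAI SDK at an OpenAI-compatible server instead of api.openai.com.
# The URL below is a placeholder; substitute your own server's address.
export OPENAI_API_BASE="http://localhost:8000/v1"

# Some OpenAI-compatible servers still expect a key to be set, even if unused.
export OPENAI_API_KEY="placeholder-key"

mentat
```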

Edit: Due to a few issues with litellm, we decided not to integrate it with mentat after all; however, if you check the README now, I just updated it with an example of how to use a litellm proxy server with mentat. I haven't used Mixtral personally, but I assume it should work with litellm fairly easily.
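The proxy-server route mentioned in the edit can be sketched roughly as follows. The backend model identifier and proxy port here are assumptions for illustration; consult the litellm docs and the mentat README for the exact invocation:

```shell
# Start a litellm proxy that accepts OpenAI-style requests and forwards
# them to the backend model (model name below is an assumption).
litellm --model ollama/mixtral

# In another shell, point mentat at the proxy
# (the port is an assumption; use whatever the proxy reports on startup).
export OPENAI_API_BASE="http://0.0.0.0:4000"
mentat
```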

@ishaan-jaff

@PCSwingle what issues did you run into with litellm? (litellm maintainer)

@vnnivas

vnnivas commented May 30, 2024

Tried Mistral with litellm as per the instructions. I added OPEN_API_BASE to my .env file, but when I start mentat I still get the message "No openai API key detected".


4 participants