Use an LLM different from OpenAI #1

Open
NeuroinformaticaFBF opened this issue Oct 24, 2024 · 1 comment

Comments

@NeuroinformaticaFBF

Hello there!

I was trying to use your code to run some tests.
Deprecations aside, I see that it is all coded to run with OpenAI models.

Is there a quick way to switch to other models, for example Llama?

@iseesaw
Contributor

iseesaw commented Oct 28, 2024

Yes, you can serve a Llama model with vLLM's OpenAI-compatible server. That lets you call the API exactly as our code already does, so it works with Llama without further changes.
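
For reference, a minimal sketch of that setup (not part of this repo; the model name, port, and prompt are placeholders). Start the server first, e.g. `python -m vllm.entrypoints.openai.api_server --model meta-llama/Meta-Llama-3-8B-Instruct`, then point the OpenAI client at it:

```python
# Minimal sketch: reuse the OpenAI client against a local vLLM server.
# base_url, api_key, model name, and prompt are assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM ignores the key unless --api-key is set
)

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # must match the model vLLM is serving
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Any Llama (or other) checkpoint that vLLM supports can be swapped in the same way, since the request/response format stays OpenAI-compatible.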
