
LiteLLM project support should be added to give users more power to easily choose different LLMs instead of just GPT #59

Open
Greatz08 opened this issue Mar 19, 2024 · 0 comments

Comments

@Greatz08

LiteLLM is a great FOSS project that lets users run a local API proxy server which accepts requests in the OpenAI API format and routes them to many different LLM providers. It is worth reading up on; many projects have already integrated it, and this project needs easy support for a variety of LLMs, so LiteLLM is the way to go.
Please implement it so that users running different LLMs can also test this project and give feedback for improvements wherever needed.
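For reference, here is a rough sketch of what this could look like, assuming the project talks to an OpenAI-compatible endpoint through the official Python SDK. The proxy URL, API key, and model name below are placeholders, not part of this project:

```python
# Minimal sketch: point an OpenAI-style client at a LiteLLM proxy instead of
# api.openai.com, so any model the proxy routes to can be used.
from openai import OpenAI

# LiteLLM's proxy serves an OpenAI-compatible API on localhost (port 4000 by
# default when started with e.g. `litellm --model ollama/llama2`).
client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy endpoint (placeholder)
    api_key="not-needed-locally",      # placeholder; the proxy can skip key checks
)

response = client.chat.completions.create(
    model="ollama/llama2",  # placeholder: any model the proxy is configured for
    messages=[{"role": "user", "content": "Hello from a non-GPT model!"}],
)
print(response.choices[0].message.content)
```

If the project already uses the OpenAI SDK, supporting LiteLLM could be as small as making the base URL and model name configurable.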
