Local LLM with function calling #406
Comments
After all that, which may or may not be worth reading, I found this by doing some more searching: https://microsoft.github.io/autogen/blog/2023/07/14/Local-LLMs/
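For anyone landing here, the linked blog post boils down to serving a local model behind an OpenAI-compatible endpoint and pointing AutoGen's config at it. Below is a minimal sketch of that idea; the model name, port, and `"NULL"` API key are placeholders, the server is assumed to be something like FastChat's OpenAI API server, and the exact config key (`base_url` vs. the older `api_base`) depends on your pyautogen version.

```python
import autogen

# Assumed: a local OpenAI-compatible server is already running at this URL
# and has registered a model under this name (both are placeholders).
config_list = [
    {
        "model": "chatglm2-6b",
        "base_url": "http://localhost:8000/v1",
        "api_key": "NULL",  # local servers typically ignore the key
    }
]

assistant = autogen.AssistantAgent(
    "assistant", llm_config={"config_list": config_list}
)
user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)

# Quick smoke test that the local endpoint responds.
user_proxy.initiate_chat(assistant, message="Say hello.")
```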
I just had a similar problem.
I am also looking for ways to use local models, but I haven't found one yet.
Right now I am using the glaive-function-calling model from Hugging Face as the function-calling agent LLM. It's not perfect, but it works at least, and it's compatible with the OpenAI format (see the sketch below).
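To illustrate how that setup might be wired together: a rough sketch of declaring a function in `llm_config` and mapping it to a Python callable, against a local OpenAI-compatible endpoint serving a function-calling model. The endpoint URL, model name, and the `get_weather` helper are illustrative placeholders; whether the local model actually emits well-formed `function_call` payloads depends on the model and the serving stack, not on this config.

```python
import autogen

config_list = [
    {
        "model": "glaive-function-calling",      # placeholder local model name
        "base_url": "http://localhost:8000/v1",  # placeholder local endpoint
        "api_key": "NULL",
    }
]

def get_weather(city: str) -> str:
    """Dummy tool used only to exercise the function-calling path."""
    return f"It is sunny in {city}."

llm_config = {
    "config_list": config_list,
    # Function schema advertised to the model, in the OpenAI function format.
    "functions": [
        {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
}

assistant = autogen.AssistantAgent("assistant", llm_config=llm_config)
user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)

# Map the declared function name to the callable that executes it.
user_proxy.register_function(function_map={"get_weather": get_weather})

user_proxy.initiate_chat(assistant, message="What's the weather in Paris?")
```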
Has anyone used AutoGen in conjunction with a local LLM to create an agent that manages function calls? The function-calling feature seems to integrate seamlessly with OpenAI, but I'm facing challenges with local LLMs. Does anyone have suggestions or experience to share? I was using CodeLlama.