React Agent Workflow #12
Correct. Although, if I'm reading this right, vLLM does support function calling, but only for certain LLMs (like Llama 3.1). I'd suggest using Llama 3.2 + the OpenAI-like server. There's a small example here that you can follow for setup (note: I haven't actually tried this yet with vLLM + OpenAI-like, but in theory it should work). The other option is re-implementing the workflow as a ReAct loop instead of a function-calling loop, but it will be less reliable imo.
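To make the second option concrete, here is a minimal, hypothetical sketch of a ReAct-style loop that avoids native function calling entirely: the model is prompted to emit `Action: tool[input]` lines, which the loop parses and executes itself. `fake_llm`, `TOOLS`, and the prompt format are all illustrative stand-ins; a real version would call your vLLM/OpenAI-compatible endpoint instead.

```python
# Sketch of a ReAct loop without native function calling.
# `fake_llm` is a stub standing in for a chat-completion call
# (e.g. against vLLM's OpenAI-compatible server).
import re

# Hypothetical tool registry: tool name -> callable taking a string arg.
TOOLS = {
    "add": lambda x: str(sum(int(n) for n in x.split(","))),
}

def fake_llm(history: str) -> str:
    # Stub: a real call would send `history` (with a ReAct system prompt)
    # to the model server and return its reply.
    if "Observation:" not in history:
        return "Thought: I should add the numbers.\nAction: add[2,3]"
    return "Thought: I have the result.\nFinal Answer: 5"

def react_loop(question: str, max_steps: int = 5) -> str:
    history = f"Question: {question}"
    for _ in range(max_steps):
        reply = fake_llm(history)
        # Stop if the model declares a final answer.
        final = re.search(r"Final Answer:\s*(.+)", reply)
        if final:
            return final.group(1).strip()
        # Otherwise parse an Action line, run the tool, feed back the result.
        action = re.search(r"Action:\s*(\w+)\[(.*?)\]", reply)
        if action:
            tool, arg = action.groups()
            observation = TOOLS[tool](arg)
            history += f"\n{reply}\nObservation: {observation}"
    return "no answer"

print(react_loop("What is 2 + 3?"))  # -> 5
```

As noted above, this is less reliable than native function calling because it depends on the model following the `Action:`/`Final Answer:` text format, but it works with any chat model vLLM can serve.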
Thanks for your help.
vLLM is still giving a bit of a hard time. Tried with Meta Llama 3.1 70B Instruct, and variants of Phi and Mistral. I will close the issue.
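For anyone hitting the same wall: the function-calling path generally needs vLLM's OpenAI-compatible server started with tool support enabled. A sketch along the lines of vLLM's docs (the model name is only an example, and the right `--tool-call-parser` value depends on the model family):

```shell
# Serve a Llama 3.x model behind vLLM's OpenAI-compatible API
# with tool/function calling enabled.
vllm serve meta-llama/Llama-3.2-3B-Instruct \
  --enable-auto-tool-choice \
  --tool-call-parser llama3_json
```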
Having trouble implementing this with vLLM for all the LLM-calling functions. Getting this specific error:

```
raise ValueError("LLM must be a function calling model!")
ValueError: LLM must be a function calling model!
```
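The error suggests the workflow guards on the LLM's advertised capabilities before building the agent. A hypothetical reconstruction of such a guard (stub classes, not the library's actual code) shows why an OpenAI-like wrapper around vLLM trips it unless the function-calling flag is set:

```python
# Sketch: why "LLM must be a function calling model!" is raised.
# `LLMMetadata`, `OpenAILike`, and `build_agent` are illustrative stand-ins.
from dataclasses import dataclass

@dataclass
class LLMMetadata:
    is_function_calling_model: bool = False

@dataclass
class OpenAILike:
    model: str
    is_function_calling_model: bool = False  # defaults to False

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(self.is_function_calling_model)

def build_agent(llm) -> str:
    # The workflow refuses to start unless the LLM claims
    # function-calling support.
    if not llm.metadata.is_function_calling_model:
        raise ValueError("LLM must be a function calling model!")
    return f"agent using {llm.model}"

# With the flag left at its default, build_agent raises the error above;
# explicitly marking the model as function-calling gets past the check
# (assuming the served model actually supports it).
llm = OpenAILike(model="meta-llama/Llama-3.1-70B-Instruct",
                 is_function_calling_model=True)
print(build_agent(llm))
```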