
Bug: function call gets errors when using open source llm #453

Closed
wuyongyi opened this issue Oct 27, 2023 · 3 comments
Labels
models Pertains to using alternate, non-GPT, models (e.g., local models, llama, etc.)

Comments


wuyongyi commented Oct 27, 2023

I'm using autogen with a local LLM hosted via litellm. In most cases the notebooks work fine, but when I try:
"""
import autogen

response = autogen.oai.Completion.create(
config_list=[
{
"model": "ollama/codellama",
"api_base": "http://127.0.0.1:8000/v1",
"api_type": "open_ai",
"api_key": "testkey", # just a placeholder
"functions": [
{
"name": "ask_planner",
"description": "ask planner to: 1. get a plan for finishing a task, 2. verify the execution result of the plan and potentially suggest new plan.",
"parameters": {
"type": "object",
"properties": {
"message": {
"type": "string",
"description": "question to ask planner. Make sure the question include enough context, such as the code and the execution result. The planner does not know the conversation between you and the user, unless you share the conversation with the planner.",
},
},
"required": ["message"],
},
},
],
}
],
prompt="hi",
)
print(response)
"""
I get errors like:

```
...
File ~/miniconda3/lib/python3.11/site-packages/autogen/oai/completion.py:1033, in Completion.cost(cls, response)
   1023 @classmethod
   1024 def cost(cls, response: dict):
   1025     """Compute the cost of an API call.
   1026
   1027     Args:
   (...)
   1031         The cost in USD. 0 if the model is not supported.
   1032     """
-> 1033 model = response.get("model")
   1034 if model not in cls.price1K:
   1035     return 0

AttributeError: 'list' object has no attribute 'get'
```
if I remove the "function" part in oai.Completion.create call, everything works fine.

Does anyone have any suggestions about this issue?
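For context, here is a minimal, self-contained reproduction of the failure mode visible in the traceback. The payload shape is an assumption: `Completion.cost` expects the response to be a dict, but the proxy apparently returned a list (likely an error payload), so calling `.get` on it raises `AttributeError`.

```python
# Simplified stand-in for autogen's Completion.cost (not the real implementation):
# it assumes the API response is a dict and calls .get() on it.
def cost(response):
    model = response.get("model")  # raises AttributeError if response is a list
    return 0  # simplified; the real method looks up a per-model price table


# Hypothetical shape of what the proxy returned instead of a dict:
bad_response = [{"error": "function calling not supported"}]

try:
    cost(bad_response)
except AttributeError as exc:
    print(exc)  # → 'list' object has no attribute 'get'
```

This only demonstrates why the error message says `'list' object has no attribute 'get'`; the underlying cause is still that the backend did not return a normal completion dict.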

@wuyongyi (Author)

I dug into the issues and just found this:

https://docs.litellm.ai/docs/completion/function_call#adding-function-to-prompt

LiteLLM only supports OpenAI gpt-4-0613 and gpt-3.5-turbo-0613 for function calling.

Is that the point here?

@ishaan-jaff

@wuyongyi we allow you to do function calling for non-OpenAI LLMs too: https://docs.litellm.ai/docs/completion/function_call#function-calling-for-non-openai-llms
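For readers landing here: the linked docs describe setting `litellm.add_function_to_prompt = True`, which makes litellm inject the function schema into the prompt for models without native function-calling support. A sketch under that assumption (the actual `litellm.completion` call is commented out since it requires a running model server):

```python
# The function schema itself is plain JSON Schema, independent of litellm:
functions = [
    {
        "name": "ask_planner",
        "description": "ask planner for a plan, or to verify an execution result",
        "parameters": {
            "type": "object",
            "properties": {
                "message": {"type": "string", "description": "question for the planner"},
            },
            "required": ["message"],
        },
    }
]

try:
    import litellm

    # Per the docs linked above: tell litellm to add the function
    # definitions to the prompt for non-OpenAI models.
    litellm.add_function_to_prompt = True

    # response = litellm.completion(
    #     model="ollama/codellama",
    #     messages=[{"role": "user", "content": "hi"}],
    #     functions=functions,
    # )
except ImportError:
    pass  # litellm not installed; the schema above is still valid on its own
```

Whether this resolves the original `AttributeError` depends on what the proxy returns to autogen; it only addresses the "function calling not supported for this model" side.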

@thinkall (Collaborator)

Closing this issue due to inactivity. If you have further questions, please open a new issue or join the discussion in the AutoGen Discord server: https://discord.com/invite/Yb5gwGVkE5

@thinkall thinkall added the models Pertains to using alternate, non-GPT, models (e.g., local models, llama, etc.) label Jun 18, 2024