
[Issue]: Function calling is not working properly for gemini model. #1198

Closed
Haripritamreddy opened this issue Jan 10, 2024 · 7 comments
Labels
0.2 Issues which are related to the pre 0.4 codebase

@Haripritamreddy

Haripritamreddy commented Jan 10, 2024

Describe the issue

I used LiteLLM to expose an OpenAI-compatible API for Google's Gemini Pro model and pointed `base_url` at it. When I try function calling, the model returns a JSON object, but not in the expected function-call format. I have tried different prompts, but it still doesn't work.

Steps to reproduce

```python
import yfinance as yf
import autogen

# Configuration for the OpenAI-compatible API (LiteLLM proxy in front of Gemini)
config_list = [
    {
        'model': 'gpt-3.5-turbo',
        'api_key': 'anything',
        'base_url': 'https://6ad3-35-245-36-6.ngrok-free.app',
    }
]

# Autogen configuration
llm_config = {
    "functions": [
        {
            "name": "get_stock_price",
            "description": "Get the current stock price for a given stock symbol.",
            "parameters": {
                "type": "object",
                "properties": {
                    "symbol": {
                        "type": "string",
                        "description": "Stock symbol to get the price for.",
                    }
                },
                "required": ["symbol"],
            },
        },
    ],
    "config_list": config_list,
    "timeout": 120,
}

# Autogen agent initialization
chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For coding tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

# Function to get stock price
def get_stock_price(symbol):
    try:
        # Use yfinance to get the latest closing price
        stock_data = yf.Ticker(symbol)
        current_price = stock_data.history(period="1d")['Close'].iloc[-1]
        return f"The current price of {symbol} is ${current_price:.2f}"
    except Exception as e:
        return f"Error fetching stock price: {str(e)}"

# Autogen user proxy initialization
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "coding"},
)

# Register the get_stock_price function
user_proxy.register_function(
    function_map={
        "get_stock_price": get_stock_price,
    }
)

# Start the conversation
user_proxy.initiate_chat(
    chatbot,
    message="Get the current stock price for symbol apple.",
)
```

Screenshots and logs

(screenshot attached in the original issue)

Additional Information

No response
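For context on what "not in the expected format" means here: with the legacy `functions` config used above, AutoGen's user proxy only executes a function when the assistant message carries an OpenAI-style `function_call` field. A minimal sketch of that expected shape (the `AAPL` argument value is illustrative, not from the issue):

```python
import json

# Sketch of the OpenAI-style assistant message AutoGen expects when the
# model decides to call a function (legacy "function_call" format).
# A plain JSON blob inside `content`, which Gemini tends to emit through
# generic proxies, will not trigger function execution.
expected_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_stock_price",
        # Note: arguments are a JSON-encoded *string*, not a dict.
        "arguments": json.dumps({"symbol": "AAPL"}),
    },
}

args = json.loads(expected_message["function_call"]["arguments"])
print(args["symbol"])  # AAPL
```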

@davorrunje
Collaborator

We are migrating code from function calls to tool calls. Do you have an option to use tool calls with Gemini?
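For reference, the migration mentioned here is largely mechanical on the schema side: each legacy `functions` entry gets nested under `{"type": "function", "function": ...}` in the newer `tools` list. A small sketch (the `functions_to_tools` helper is hypothetical, not part of AutoGen):

```python
def functions_to_tools(functions):
    """Wrap legacy `functions` entries in the newer `tools` schema.

    Sketch only: mirrors the documented OpenAI mapping, where each
    function spec is nested under {"type": "function", "function": ...}.
    """
    return [{"type": "function", "function": f} for f in functions]


# The function spec from the issue's llm_config above
functions = [
    {
        "name": "get_stock_price",
        "description": "Get the current stock price for a given stock symbol.",
        "parameters": {
            "type": "object",
            "properties": {"symbol": {"type": "string"}},
            "required": ["symbol"],
        },
    }
]

tools = functions_to_tools(functions)
print(tools[0]["type"])              # function
print(tools[0]["function"]["name"])  # get_stock_price
```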

@Haripritamreddy
Author

No, it is not working. Function calling at least partially works, in that it returns JSON output; but when I use tool calls with LiteLLM, it throws errors.

```python
import autogen
from typing import Literal
from typing_extensions import Annotated

config_list = [
    {
        'model': 'gpt-3.5-turbo',
        'api_key': 'anything',
        'base_url': 'http://abce-35-231-22-211.ngrok-free.app',
    }
]

llm_config = {
    "config_list": config_list,
    "timeout": 120,
}

currency_bot = autogen.AssistantAgent(
    name="currency_bot",
    system_message="For currency exchange tasks, only use the functions you have been provided with. Reply TERMINATE "
    "when the task is done.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
)

CurrencySymbol = Literal["USD", "EUR"]

def exchange_rate(base_currency: CurrencySymbol, quote_currency: CurrencySymbol) -> float:
    if base_currency == quote_currency:
        return 1.0
    elif base_currency == "USD" and quote_currency == "EUR":
        return 1 / 1.09
    elif base_currency == "EUR" and quote_currency == "USD":
        return 1.1
    else:
        raise ValueError(f"Unknown currencies {base_currency}, {quote_currency}")

@user_proxy.register_for_execution()
@currency_bot.register_for_llm(description="Currency exchange calculator.")
def currency_calculator(
    base_amount: Annotated[float, "Amount of currency in base_currency"],
    base_currency: Annotated[CurrencySymbol, "Base currency"] = "USD",
    quote_currency: Annotated[CurrencySymbol, "Quote currency"] = "EUR",
) -> str:
    quote_amount = exchange_rate(base_currency, quote_currency) * base_amount
    return f"{quote_amount} {quote_currency}"

# Start the conversation
user_proxy.initiate_chat(
    currency_bot,
    message="How much is 123.45 USD in EUR?",
)
```

Error:

```
How much is 123.45 USD in EUR?

Traceback (most recent call last):
  File "c:\Users\Asus\Documents\autogen\function_test.py", line 60, in <module>
    user_proxy.initiate_chat(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 621, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 398, in send
    recipient.receive(message, self, request_reply, silent)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 551, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 1191, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\agentchat\conversable_agent.py", line 708, in generate_oai_reply
    response = client.create(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\oai\client.py", line 261, in create
    response = self._completions_create(client, params)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\autogen\oai\client.py", line 359, in _completions_create
    response = completions.create(**params)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_utils\_utils.py", line 272, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\resources\chat\completions.py", line 645, in create
    return self._post(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 1088, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 853, in request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 916, in _request
    return self._retry_request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 958, in _retry_request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 916, in _request
    return self._retry_request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 958, in _retry_request
    return self._request(
  File "C:\Users\Asus\anaconda3\envs\pyautogen\lib\site-packages\openai\_base_client.py", line 930, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'detail': "'functions'

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 556, in completion
    optional_params = get_optional_params(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 3198, in get_optional_params
    "tools", non_default_params.pop("functions")
KeyError: 'functions'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 215, in acompletion
    response = await loop.run_in_executor(None, func_with_context)
  File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2130, in wrapper
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2037, in wrapper
    result = original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 1746, in completion
    raise exception_type(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6628, in exception_type
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6603, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: 'functions'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/litellm/proxy/proxy_server.py", line 1464, in chat_completion
    response = await litellm.acompletion(**data)
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2366, in wrapper_async
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 2258, in wrapper_async
    result = await original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/litellm/main.py", line 227, in acompletion
    raise exception_type(
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6628, in exception_type
    raise e
  File "/usr/local/lib/python3.10/dist-packages/litellm/utils.py", line 6596, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: 'functions'
"}
```
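The root cause visible in the innermost server-side frame is an unconditional dictionary `pop`: the proxy pops `"functions"` from the request parameters even when the client sent `"tools"` instead. A minimal stdlib reproduction (variable names mirror the traceback frame, but this is an illustration, not LiteLLM's actual code):

```python
# The request built from register_for_llm carries "tools", not "functions".
non_default_params = {"tools": [{"type": "function", "function": {"name": "f"}}]}

try:
    # Popping a missing key without a default raises, as in the traceback.
    non_default_params.pop("functions")
except KeyError as e:
    print(f"KeyError: {e}")  # KeyError: 'functions'

# A defensive version falls back to unwrapping the "tools" entries instead.
functions = non_default_params.pop("functions", None) or [
    t["function"] for t in non_default_params.get("tools", [])
]
print(functions[0]["name"])  # f
```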

@davorrunje
Collaborator

It looks like we broke it with all of the latest changes :/ I opened an issue for it and will fix it soon: #1206

@nileshtrivedi

Is function calling working now? It seems to fail with gemini-1.5-flash-latest for me.
Here are Gemini's docs: https://ai.google.dev/gemini-api/docs/function-calling
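For comparison with the OpenAI-style schema AutoGen sends, the Gemini API linked above declares callable functions under a `function_declarations` list inside `tools`. A rough sketch as plain Python dicts (field names should be double-checked against the current Gemini API reference, which in some versions uses uppercase type names such as `"OBJECT"`):

```python
# Sketch of a Gemini-style request body for function calling, per the
# linked docs. The currency_calculator declaration reuses the spec from
# the tool-call example earlier in this thread.
currency_tool = {
    "function_declarations": [
        {
            "name": "currency_calculator",
            "description": "Currency exchange calculator.",
            "parameters": {
                "type": "object",
                "properties": {
                    "base_amount": {"type": "number"},
                    "base_currency": {"type": "string", "enum": ["USD", "EUR"]},
                    "quote_currency": {"type": "string", "enum": ["USD", "EUR"]},
                },
                "required": ["base_amount"],
            },
        }
    ]
}

request_body = {
    "contents": [{"role": "user", "parts": [{"text": "How much is 123.45 USD in EUR?"}]}],
    "tools": [currency_tool],
}
print(request_body["tools"][0]["function_declarations"][0]["name"])  # currency_calculator
```

A proxy layer such as LiteLLM is responsible for translating between this shape and the OpenAI `tools` shape, which is where the mismatches in this thread arise.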

@davorrunje
Collaborator

@nileshtrivedi function calling is working, but it might be broken in your use case. Can you please create a detailed issue with a code example and mention me in it? I'll take it over from there.

@rysweet rysweet added 0.2 Issues which are related to the pre 0.4 codebase needs-triage labels Oct 2, 2024
@rysweet
Collaborator

rysweet commented Oct 18, 2024

Closing as resolved.

@rysweet rysweet closed this as completed Oct 18, 2024
@jckw

jckw commented Oct 21, 2024

@rysweet was this resolved as fixed, or closed as not needed? I still haven't been able to figure out how to use function calling with Gemini. It would be great to see some docs on how to enable it.
