[Issue]: Function calling is not working properly for gemini model. #1198
Comments
We are migrating code from function calls to tool calls. Do you have an option to use tool calls with Gemini?
No, it is not working. Unlike function calling, which is partially working (it at least returns JSON output), tool calls through litellm throw errors. Here is the (truncated) snippet and error:

```
import autogen
config_list = [
llm_config = {
currency_bot = autogen.AssistantAgent(
user_proxy = autogen.UserProxyAgent(
CurrencySymbol = Literal["USD", "EUR"]
def exchange_rate(base_currency: CurrencySymbol, quote_currency: CurrencySymbol) -> float:
@user_proxy.register_for_execution()
# start the conversation
user_proxy.initiate_chat(
```

Error:

```
How much is 123.45 USD in EUR?
Traceback (most recent call last):
```
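For reference, a runnable version of what the truncated snippet above appears to be attempting looks roughly like this. It is a minimal sketch, not the exact failing code: the config values (`base_url`, model name), the system message, and the toy exchange rates are assumptions; only the agent names, `CurrencySymbol`, and the `exchange_rate` signature come from the fragment.

```python
from typing import Annotated, Literal

import autogen

# Placeholder config pointing at an OpenAI-compatible litellm proxy for Gemini.
config_list = [
    {"model": "gemini-pro", "api_key": "anything", "base_url": "http://localhost:8000"}
]
llm_config = {"config_list": config_list, "timeout": 120}

currency_bot = autogen.AssistantAgent(
    name="currency_bot",
    system_message="For currency exchange tasks, only use the functions you have been "
    "provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config=False,
)

CurrencySymbol = Literal["USD", "EUR"]

# Registering with both agents makes AutoGen send an OpenAI-style tool call,
# which the litellm proxy then has to translate for Gemini.
@user_proxy.register_for_execution()
@currency_bot.register_for_llm(description="Exchange rate between two currencies.")
def exchange_rate(
    base_currency: Annotated[CurrencySymbol, "Currency to convert from"],
    quote_currency: Annotated[CurrencySymbol, "Currency to convert to"],
) -> float:
    # Toy rates, just for the sketch.
    if base_currency == quote_currency:
        return 1.0
    return 1.1 if base_currency == "EUR" else 1 / 1.1

# start the conversation
user_proxy.initiate_chat(currency_bot, message="How much is 123.45 USD in EUR?")
```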
It looks like we broke it with all of the latest changes :/ I opened an issue for it and will fix it soon #1206
Is function calling working now? It seems to fail with gemini-1.5-flash-latest for me.
@nileshtrivedi function calling is working, but it might be broken in your use case. Can you please create a detailed issue with a code example and mention me in it? I'll take it over from there.
closing as resolved.
@rysweet was this resolved as fixed or as not needed? I still haven't been able to figure out how to use function calling with Gemini. It would be great to see some docs on how to enable it.
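For anyone landing here with the same question: one hedged option (not what this issue used, and version-dependent) is AutoGen's native Gemini client, installed with `pip install "pyautogen[gemini]"` and selected via `api_type`, which avoids the litellm proxy entirely. A minimal sketch, assuming a Google AI Studio key:

```python
import autogen

# Hypothetical native-Gemini config; requires pyautogen[gemini] and a
# Google AI Studio API key. Supported model names depend on your version.
config_list = [
    {
        "model": "gemini-1.5-flash-latest",
        "api_key": "<GOOGLE_AI_STUDIO_API_KEY>",
        "api_type": "google",
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
```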
Describe the issue
I have used litellm to get an OpenAI-compatible API for Google's Gemini Pro model and used it as the base_url. When I try function calling, it returns a JSON object, but not in the expected format. I have tried different prompts, but it still isn't working.
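For context, the OpenAI-compatible endpoint used as `base_url` here is typically a litellm proxy started with something like `litellm --model gemini/gemini-pro` (the exact flags and port are assumptions, not taken from this report). A quick way to confirm the Gemini route itself works, assuming litellm is installed and `GEMINI_API_KEY` is set, is a direct completion call:

```python
import litellm

# Sanity check of litellm's Gemini route before putting the proxy behind base_url.
response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```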
Steps to reproduce
1. Step 1:

```python
import yfinance as yf
import autogen

# Configuration for OpenAI API
config_list = [
    {
        'model': 'gpt-3.5-turbo',
        'api_key': 'anything',
        'base_url': 'https://6ad3-35-245-36-6.ngrok-free.app',
    }
]

# Autogen configuration
llm_config = {
    "functions": [
        {
            "name": "get_stock_price",
            "description": "Get the current stock price for a given stock symbol.",
            "parameters": {
                "type": "object",
                "properties": {
                    "symbol": {
                        "type": "string",
                        "description": "Stock symbol to get the price for.",
                    }
                },
                "required": ["symbol"],
            },
        },
    ],
    "config_list": config_list,
    "timeout": 120,
}

# Autogen agent initialization
chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For coding tasks, only use the functions you have been provided with. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

# Function to get stock price
def get_stock_price(symbol):
    try:
        # Use yfinance to get stock price
        stock_data = yf.Ticker(symbol)
        current_price = stock_data.history(period="1d")['Close'].iloc[-1]
        return f"The current price of {symbol} is ${current_price:.2f}"
    except Exception as e:
        return f"Error fetching stock price: {str(e)}"

# Autogen user proxy initialization
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    code_execution_config={"work_dir": "coding"},
)

# Register the get_stock_price function
user_proxy.register_function(
    function_map={
        "get_stock_price": get_stock_price,
    }
)

# Start the conversation
user_proxy.initiate_chat(
    chatbot,
    message="Get the current stock price for symbol apple.",
)
```
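One thing worth trying (a sketch of an alternative, not a confirmed fix): register the tool with AutoGen's decorator API instead of a hand-written `functions` schema plus `register_function`. AutoGen then generates the schema from the type hints, which removes one place where the format sent to the proxy can drift. This assumes the `chatbot`, `user_proxy`, and `yf` from the script above; the `Annotated` description string is added for the sketch.

```python
from typing import Annotated

# Drop the "functions" entry from llm_config and the register_function call,
# then register the tool on both agents instead:
@user_proxy.register_for_execution()
@chatbot.register_for_llm(description="Get the current stock price for a given stock symbol.")
def get_stock_price(symbol: Annotated[str, "Stock symbol to get the price for."]) -> str:
    try:
        stock_data = yf.Ticker(symbol)
        current_price = stock_data.history(period="1d")['Close'].iloc[-1]
        return f"The current price of {symbol} is ${current_price:.2f}"
    except Exception as e:
        return f"Error fetching stock price: {str(e)}"
```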
Screenshots and logs
Additional Information
No response