[Bug]: Vertex LLM does not Handle FunctionCall tools #16678
Comments
Hey @stfines-clgx! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together! The error you're encountering, AttributeError: 'FunctionCall' object has no attribute '_pb', suggests the tool-call handling is making assumptions about the input data types. If checking those assumptions does not resolve the issue, further investigation into the specific implementation details of the Vertex integration may be needed.
Probably could be related to the package version for the generative-ai package? I don't have access to Vertex to test, but the code is here: llama_index/llama-index-integrations/llms/llama-index-llms-vertex/llama_index/llms/vertex/base.py (line 498 at commit bfd2514).
Happy to accept a PR to fix this.
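Not the integration's actual code, but a minimal sketch of the kind of defensive conversion that would sidestep the missing `_pb` attribute, assuming the failure happens where the returned `FunctionCall` is turned into a plain dict (`function_call_to_dict` is a hypothetical helper name):

```python
from google.protobuf.json_format import MessageToDict


def function_call_to_dict(function_call):
    """Hypothetical helper: convert a Vertex FunctionCall into a plain dict.

    Older proto-plus wrappers expose the raw protobuf message as `._pb`;
    newer SDK objects may be raw messages or expose `.name`/`.args` directly.
    """
    raw = getattr(function_call, "_pb", function_call)
    try:
        return MessageToDict(raw)
    except Exception:
        # Fall back to the public attributes exposed by the SDK wrapper.
        return {"name": function_call.name, "args": dict(function_call.args)}
```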
That does seem possible, especially since the pyproject.toml lists an aiplatform dependency of 1.39.0 and the current version is 1.70.0. I'll pull locally and see if I can build this lib with a newer dependency.
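For anyone reproducing this, a quick way to confirm which versions are actually installed in the environment (assuming the published distribution names):

```python
from importlib.metadata import version

# The integration pins google-cloud-aiplatform >= 1.39.0, while the
# current release at the time of this issue was 1.70.0.
print(version("google-cloud-aiplatform"))
print(version("llama-index-llms-vertex"))
```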
That would be amazing, thank you!
Taking a look at this, the entire set of interfaces has changed significantly. I've forked the repo and will rework how the integration interacts with Gemini so that it uses the actual released interface and doesn't depend on unreleased/private features, which should address the problem. Once I can verify that it functions, I'll submit a PR; it'll probably require a version bump.
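For context, this is roughly the released `vertexai.generative_models` surface such a rework could target; the project ID, model name, and tool schema below are illustrative and not taken from the branch:

```python
import vertexai
from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholder project

# Declare a tool using the public FunctionDeclaration/Tool classes.
get_weather = FunctionDeclaration(
    name="get_weather",
    description="Get the current weather for a city.",
    parameters={
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
)
weather_tool = Tool(function_declarations=[get_weather])

model = GenerativeModel("gemini-1.5-pro", tools=[weather_tool])
response = model.generate_content("What's the weather in Paris?")

# Read the function call back through public attributes only (no ._pb access).
function_call = response.candidates[0].content.parts[0].function_call
print(function_call.name, dict(function_call.args))
```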
OK, my fork has a branch, dev/1_70_protobuf_only, that seems to resolve this issue. My testing with several different types of functions and non-function invocations has it working well. It does not resolve … @logan-markewich, if you would take a look prior to my creating a PR and let me know what else needs doing to make a good PR for this project, I would appreciate it.
Bug Description
When using a FunctionTool, I receive the following error:
AttributeError: 'FunctionCall' object has no attribute '_pb'
It seems that this tool makes some assumptions about the input data types.
Version
0.11.7, llama-index-llms-vertex==0.3.7
Steps to Reproduce
Update the code in the multi-agent concierge example to use Vertex, then try to run it.
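A stripped-down repro along these lines should hit the same code path (the project ID and model name are placeholders, and `predict_and_call` is used here instead of the full concierge workflow):

```python
from llama_index.core.tools import FunctionTool
from llama_index.llms.vertex import Vertex


def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b


tool = FunctionTool.from_defaults(fn=multiply)
llm = Vertex(model="gemini-1.5-pro", project="my-gcp-project")  # placeholder project

# Any prompt that makes the model emit a tool call exercises the FunctionCall
# handling, raising: AttributeError: 'FunctionCall' object has no attribute '_pb'
response = llm.predict_and_call([tool], "What is 6 times 7?")
print(response)
```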
Relevant Logs/Tracebacks