
[Bug]: JSON mode with Ollama assumes Function Calling #7355

Open
sidjha1 opened this issue Dec 21, 2024 · 1 comment
Labels
bug Something isn't working

Comments

sidjha1 commented Dec 21, 2024

What happened?

It seems that specifying the JSON response format leads to an error, since the post-processing done by litellm assumes that the JSON response corresponds to a function call.

LiteLLM version: 1.55.9

from litellm import completion

response = completion(
    model="ollama/phi3:latest",
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    response_format={"type": "json_object"}
)

The model actually ends up generating the JSON below, but it errors out in the post-processing.

{"answer": "Paris"}

It seems to be due to this transformation.py code, which expects name and arguments fields. Is this the intended behavior?
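To illustrate the failure mode, here is a minimal, hypothetical simplification of the parsing (not the actual litellm code): the raw model output is unconditionally treated as a function-call payload with `name` and `arguments` keys, so a plain JSON answer has no `name` key and raises the `KeyError` seen below.

```python
import json

# Hypothetical simplification of the post-processing: the raw JSON output
# is assumed to be a function call with "name" and "arguments" fields.
def parse_as_function_call(raw: str) -> dict:
    function_call = json.loads(raw)
    return {
        "name": function_call["name"],  # KeyError when the model returns plain JSON
        "arguments": json.dumps(function_call.get("arguments", {})),
    }

# A plain JSON answer like {"answer": "Paris"} has no "name" key:
try:
    parse_as_function_call('{"answer": "Paris"}')
    failed_key = None
except KeyError as exc:
    failed_key = exc.args[0]
print(failed_key)  # name
```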

Relevant log output

File "/<...>/litellm/llms/ollama/completion/transformation.py", line 263, in transform_response
    "name": function_call["name"],
KeyError: 'name'

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.55.9

Twitter / LinkedIn details

No response

krrishdholakia (Contributor) commented Dec 24, 2024

File "/<...>/litellm/llms/ollama/completion/transformation.py", line 263, in transform_response
"name": function_call["name"],
KeyError: 'name'

I believe we're expecting the model to return the function call name.

This is for scenarios where tool calling occurs and we try to handle/output-parse it correctly.

Seems like this is overly aggressive and misses the base case of the user passing in response_format: json.

A PR + test here is welcome @sidjha1
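A sketch of the kind of guard such a PR might add (hypothetical; the function and parameter names here are assumptions, not litellm's actual API): only attempt function-call parsing when the request involved tools, and otherwise return the model's JSON as plain message content.

```python
import json

# Hypothetical sketch of a fix: parse the output as a function call only
# when the request actually used tool/function calling; otherwise pass the
# JSON through as ordinary content (the response_format="json_object" case).
def transform_response(raw: str, used_function_calling: bool) -> dict:
    data = json.loads(raw)
    if used_function_calling and "name" in data:
        return {
            "tool_calls": [{
                "function": {
                    "name": data["name"],
                    "arguments": json.dumps(data.get("arguments", {})),
                }
            }]
        }
    # Base case: plain JSON mode with no tools involved.
    return {"content": raw}

print(transform_response('{"answer": "Paris"}', used_function_calling=False))
```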
