
Support for OpenAI's structured output via json_schema in the response_format #1084

Open
yaylinda opened this issue Dec 6, 2024 · 3 comments

Comments

@yaylinda

yaylinda commented Dec 6, 2024

I would like to use OpenAI's structured output and pass my json_schema in the response_format when I call GenerateResponse (not using tools or function calls).

I came across this unit test that accomplishes this, using the ResponseFormatJSONSchema and ResponseFormatJSONSchemaProperty types within the ResponseFormat object.

However, the publicly exposed ResponseFormat struct only has the Type field, and the ResponseFormatJSONSchema and ResponseFormatJSONSchemaProperty types are not exposed at all.

Could those structs be exposed so that we can construct a ResponseFormat like the one in your unit test?

Thank you!
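For reference, OpenAI documents the structured-output request as a "response_format" object of type "json_schema" wrapping a name, a strict flag, and the schema itself. The struct shapes below are a hypothetical illustration of the kind of construction being asked for — they mirror the documented JSON payload, not the library's actual unexported types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical shapes mirroring OpenAI's documented response_format
// payload; the library's real (unexported) types may differ.
type ResponseFormat struct {
	Type       string             `json:"type"`
	JSONSchema *JSONSchemaWrapper `json:"json_schema,omitempty"`
}

type JSONSchemaWrapper struct {
	Name   string                 `json:"name"`
	Strict bool                   `json:"strict"`
	Schema map[string]interface{} `json:"schema"`
}

// structuredOutputFormat builds a response_format asking the model to
// reply with a single required string field "answer".
func structuredOutputFormat() ResponseFormat {
	return ResponseFormat{
		Type: "json_schema",
		JSONSchema: &JSONSchemaWrapper{
			Name:   "math_response",
			Strict: true,
			Schema: map[string]interface{}{
				"type": "object",
				"properties": map[string]interface{}{
					"answer": map[string]interface{}{"type": "string"},
				},
				"required":             []string{"answer"},
				"additionalProperties": false,
			},
		},
	}
}

func main() {
	b, _ := json.Marshal(structuredOutputFormat())
	fmt.Println(string(b))
}
```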

@DarkCaster

DarkCaster commented Dec 8, 2024

I'm having a similar problem, but with the latest Ollama (0.5.1).
I managed to work around it by using the ollama.WithHTTPClient option with my own "MITM" HTTP client that injects the needed fields directly into the request body. I used httputil/debug_transport.go as an example to write my own request interceptor. I think you can try implementing the same for your project with OpenAI (or any other provider). I can provide a link to my example if you want.

@yaylinda

@DarkCaster ohh! Please share the link if you don't mind! I'd love to see how you did that. Thank you so much!

@DarkCaster

@yaylinda sure. Here is my MITM HTTP client implementation: https://github.com/DarkCaster/Perpetual/blob/main/llm/mitmHTTPClient.go
I registered it with the Ollama client like this:

...
ollamaOptions := append([]ollama.Option{}, ollama.WithModel(p.Model))
ollamaOptions = append(ollamaOptions, ollama.WithHTTPClient(NewMitmHTTPClient(SchemaToInject)))
model, err := ollama.New(ollamaOptions...)
...

SchemaToInject is a map[string]interface{} object that produces the JSON schema when serialized, like this:
"format": { <schema fields> }
(A full schema format example for Ollama is provided here: https://github.com/ollama/ollama/releases/tag/v0.5.0)
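For illustration, a SchemaToInject map like the one described could look like the sketch below (the schema contents themselves are a made-up example; only the top-level "format" key is what Ollama >= 0.5.0 expects):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// exampleSchemaToInject returns a map that serializes into the
// top-level "format" field Ollama >= 0.5.0 accepts for structured
// output; the schema body here is an arbitrary example.
func exampleSchemaToInject() map[string]interface{} {
	return map[string]interface{}{
		"format": map[string]interface{}{
			"type": "object",
			"properties": map[string]interface{}{
				"answer": map[string]interface{}{"type": "string"},
			},
			"required": []string{"answer"},
		},
	}
}

func main() {
	b, _ := json.MarshalIndent(exampleSchemaToInject(), "", "  ")
	fmt.Println(string(b))
}
```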

The logic of mitmHTTPClient::RoundTrip is the following:

  • extract the body from the intercepted request
  • deserialize it into a map[string]interface{} object
  • merge that object with the provided SchemaToInject object
  • serialize it back to JSON and replace the old request body with the new one
  • send the new request
  • ...
  • PROFIT!

I think you can use a similar approach with the OpenAI provider; according to the documentation it also takes the schema from the request body (the "response_format" field): https://openai.com/index/introducing-structured-outputs-in-the-api , https://platform.openai.com/docs/guides/structured-outputs
