Support Fireworks AI endpoint #883
Comments
Oh yes, for Fireworks, can you try the "ollama" provider? That one doesn't include "stream_options". If that works, then I'll update the docs.
Great idea! I tried it, but got: … You can set it manually using: …
Yes, our current workaround is to add the header with an environment variable like that. We'll see if we just add a quick provider for Fireworks that works out of the box in a cleaner way.
@arunbahl, to unblock you I think we can have an env variable that's something like … But we'll look into a way to prevent, and likely remove, the stream_options parameter. Oops, just realized you noted that already.
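To make the direction discussed above concrete, here is a minimal sketch in Python (not BAML's actual internals) of what "removing the stream_options parameter" amounts to: dropping that one field from the outgoing OpenAI-style request body before it reaches an endpoint like Fireworks that rejects it. The model ID and payload are illustrative.

```python
# Illustrative only: not BAML's implementation, just the shape of the fix
# being discussed (omit stream_options from the outgoing request body).
from typing import Any, Dict


def strip_stream_options(body: Dict[str, Any]) -> Dict[str, Any]:
    """Return a copy of an OpenAI-style request body without 'stream_options'."""
    return {k: v for k, v in body.items() if k != "stream_options"}


request_body = {
    "model": "accounts/fireworks/models/llama-v3p1-8b-instruct",  # illustrative model ID
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},  # the field Fireworks rejects
}

safe_body = strip_stream_options(request_body)
assert "stream_options" not in safe_body
```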
Makes sense, and the workaround is great for now. Thanks for your responsiveness and all your work on this project, we're fans :)
Fireworks.ai clients previously worked using an openai-type provider, but have since stopped working with a 400 Bad Request error:
Request failed: {"error":{"object":"error","type":"invalid_request_error","message":"Extra inputs are not permitted, field: 'stream_options'"}}
Is it possible to:
1. prevent stream_options from being sent by setting a field in options? or
2. configure stream_options?

Fireworks generally has excellent token rates; it would be great to be able to use BAML with them.
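For reference, a minimal repro sketch of the behaviour described above, assuming Fireworks' OpenAI-compatible endpoint and the official openai Python SDK; the model ID and environment variable name are illustrative. Sending stream_options (as an openai-type provider does) is rejected, while omitting it (as the ollama provider does) streams normally.

```python
# Repro sketch: Fireworks' OpenAI-compatible endpoint rejects stream_options.
import os

from openai import OpenAI, BadRequestError

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # Fireworks' OpenAI-compatible base URL
    api_key=os.environ["FIREWORKS_API_KEY"],           # illustrative env var name
)

model = "accounts/fireworks/models/llama-v3p1-8b-instruct"  # illustrative model ID
messages = [{"role": "user", "content": "Say hello"}]

# With stream_options (what the openai-type provider sends): 400 Bad Request.
try:
    client.chat.completions.create(
        model=model,
        messages=messages,
        stream=True,
        stream_options={"include_usage": True},
    )
except BadRequestError as err:
    print("rejected:", err)  # "Extra inputs are not permitted, field: 'stream_options'"

# Without stream_options (what the ollama provider sends): streams fine.
for chunk in client.chat.completions.create(model=model, messages=messages, stream=True):
    if chunk.choices:
        print(chunk.choices[0].delta.content or "", end="")
```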