Dropped chunks while streaming #6

Open
enzo2 opened this issue Nov 14, 2024 · 0 comments

enzo2 commented Nov 14, 2024

I noticed that chunks seemed to go missing while streaming, especially during tool calls. Looking at http.rb, each `data` object is extracted with `chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|`. So I added logging and found that a `data` payload can be split across network chunks. For example:
Chunk 1: `data: {\"id\":\"gen-123\",\"provider\":\"Anthropic\",\"model\":\"anthropic/claude-3.5-so"`
Chunk 2: `"nnet\",\"object\":\"chat.completion.chunk\",\"created\":1721502907,\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":null,\"tool_calls\":[{\"index\":0,\"type\":\"function\",\"function\":{\"arguments\":\"v0F2\"}}]},\"index\":0}]}......`

The split `data` never even reaches `JSON.parse`, since it doesn't match the regex.
So it would seem that chunks should be buffered until the complete `data` is present, as in the sketch below.
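
A minimal sketch of that buffering (assuming chunks arrive one at a time; `handle_event` is just a placeholder): only complete, newline-terminated lines are scanned, and a trailing partial line is carried over into the next chunk.

```ruby
require "json"

class SSEBuffer
  def initialize
    @buffer = +""
  end

  # Feed a raw network chunk; yields each complete JSON payload.
  def feed(chunk)
    @buffer << chunk
    lines = @buffer.split("\n", -1)
    # Whatever follows the last "\n" may be a partial `data:` line
    # (as in the chunks above), so keep it buffered for the next call.
    @buffer = lines.pop || +""

    lines.each do |line|
      next unless (m = line.match(/(?:data|error): (\{.*\})/i))
      yield JSON.parse(m[1])
    end
  end
end

# Usage:
#   sse = SSEBuffer.new
#   on_chunk = ->(chunk) { sse.feed(chunk) { |data| handle_event(data) } }
```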

Edit: On second thought, since buffering this correctly is fiddly, it might be better to just use the `event_stream_parser` gem for this task, like ruby-openai does.
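
Something like this, going by the gem's documented `EventStreamParser::Parser#feed` interface (`on_chunk` and `handle_event` are placeholders):

```ruby
require "event_stream_parser"
require "json"

parser = EventStreamParser::Parser.new

# The parser buffers partial SSE events internally and only yields once a
# complete event has arrived, so split chunks are handled for free.
on_chunk = lambda do |chunk|
  parser.feed(chunk) do |_type, data, _id, _reconnection_time|
    handle_event(JSON.parse(data))
  end
end
```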
