I noticed that when streaming, chunks seemed to be missing, especially during tool calls. Looking at http.rb, each `data` object is extracted with:

```ruby
chunk.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
```
So I added logging and found that the `data` payload can be split across chunks. For example:

Chunk 1:

```
data: {"id":"gen-123","provider":"Anthropic","model":"anthropic/claude-3.5-so
```

Chunk 2:

```
nnet","object":"chat.completion.chunk","created":1721502907,"choices":[{"delta":{"role":"assistant","content":null,"tool_calls":[{"index":0,"type":"function","function":{"arguments":"v0F2"}}]},"index":0}]}...
```

Split data like this never even reaches `JSON.parse`, because neither fragment matches the regex.
So it would seem that incoming chunks should be buffered until a complete `data` payload is present.
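For reference, a minimal sketch of what that buffering could look like, reusing the existing regex (the `handle` callback is hypothetical, and this assumes events are delimited by a blank line per the SSE spec):

```ruby
require "json"

buffer = +""

on_chunk = proc do |chunk|
  buffer << chunk
  # SSE events end with a blank line; only parse once a full event has arrived.
  while (boundary = buffer.index("\n\n"))
    event = buffer.slice!(0, boundary + 2)
    event.scan(/(?:data|error): (\{.*\})/i).flatten.each do |data|
      handle(JSON.parse(data)) # hypothetical downstream handler
    end
  end
end
```

Even this glosses over CRLF delimiters and multi-line `data:` fields, which is part of why a dedicated parser seems safer.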
Edit: On second thought, since it's complicated, maybe we should just use the event_stream_parser gem for this, like ruby-openai does.
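If we went that route, a rough sketch of how it could slot in, assuming that gem's documented `feed` API (the `handle` callback is again hypothetical):

```ruby
require "json"
require "event_stream_parser"

parser = EventStreamParser::Parser.new

on_chunk = proc do |chunk|
  # The parser buffers partial events internally and yields only complete ones.
  parser.feed(chunk) do |_type, data|
    handle(JSON.parse(data)) unless data == "[DONE]" # hypothetical handler
  end
end
```

This is essentially what ruby-openai does in its streaming middleware: feed raw chunks to the parser and skip the `[DONE]` sentinel before parsing JSON.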