
Bedrock streaming breaks #3850

Closed
Yofial opened this issue Jan 1, 2024 · 2 comments · Fixed by #3854
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments


Yofial commented Jan 1, 2024

Streaming with Bedrock breaks with an exception.
It does not always occur; it depends on the behaviour of the Bedrock service, which cannot be controlled by the client.

From my observations it typically happens after many chunks have already arrived successfully.
I was able to reproduce it many times with the following setup:
Bedrock + Anthropic + Claude v2 + long Hebrew completions.
This is not the only way to reproduce it, but it happens very often with this setup.

Here is an example of a Hebrew prompt that reproduces it ("write a long article about the climate crisis"):
כתוב מאמר ארוך על משבר האקלים
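
A repro sketch along these lines (a minimal sketch; the constructor options and region are assumptions about a typical setup, not taken from my project):

```ts
import { BedrockChat } from "@langchain/community/chat_models/bedrock";

// Assumed setup: Claude v2 on Bedrock, credentials resolved from the environment.
const model = new BedrockChat({
  model: "anthropic.claude-v2",
  region: "us-east-1",
});

// Stream a long Hebrew completion; the failure appears mid-stream.
const stream = await model.stream("כתוב מאמר ארוך על משבר האקלים");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```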

This is the exception:

ERROR bedrock-chat-web Logger AIQuestionStream: runtime error: Offset is outside the bounds of the DataView / stack: RangeError: Offset is outside the bounds of the DataView
    at DataView.getUint32 (<anonymous>)
    at getMessageLength (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\langchain\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:335:25)
    at [Symbol.asyncIterator] (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\langchain\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:344:41)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async BedrockChat._streamResponseChunks (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\langchain\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:289:30)
    at async BedrockChat._call (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\langchain\node_modules\@langchain\community\dist\chat_models\bedrock\web.cjs:210:30)
    at async BedrockChat._generate (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\@langchain\core\dist\language_models\chat_models.cjs:344:22)
    at async Promise.allSettled (index 0)
    at async BedrockChat._generateUncached (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\@langchain\core\dist\language_models\chat_models.cjs:114:25)
    at async LLMChain._call (C:\Users\username\Desktop\bedrock-chat-web\server\node_modules\langchain\dist\chains\llm_chain.cjs:157:37)
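
For context, the RangeError above is exactly what DataView.getUint32 throws when the view holds fewer than 4 bytes; a minimal standalone illustration, unrelated to Bedrock itself:

```ts
// A DataView over fewer than 4 bytes cannot satisfy getUint32(0).
const partial = new Uint8Array([0x00, 0x01]); // only 2 bytes buffered so far
const view = new DataView(partial.buffer, partial.byteOffset, partial.byteLength);
view.getUint32(0); // RangeError: Offset is outside the bounds of the DataView
```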

--
I've looked into the source code.
There is a structure to the event stream: the code reads the message length from the first 4 bytes of the stream and waits for the buffer to fill before it tries to decode the message and pass it on.

The code in getMessageLength does not handle a buffer.byteLength between 1 and 3, and it fails when it tries to read a 4-byte uint32 from a buffer that is only 1 to 3 bytes long.

It looks like a new bug introduced in #3364
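
For reference, here is a minimal sketch of the guard described above. The function name getMessageLength comes from the stack trace; the body is illustrative, assuming the event stream's 4-byte big-endian length prefix, and is not the actual langchainjs source.

```ts
// Sketch only: read the 4-byte big-endian length prefix of the next
// event stream message, returning 0 until enough bytes have accumulated.
function getMessageLength(buffer: Uint8Array): number {
  // Fewer than 4 bytes buffered: the length prefix is incomplete, so report
  // "not ready yet" instead of calling getUint32 and hitting the RangeError.
  if (buffer.byteLength < 4) return 0;
  const view = new DataView(buffer.buffer, buffer.byteOffset, buffer.byteLength);
  return view.getUint32(0, false); // big-endian total message length
}
```

The caller can then keep appending chunks and retrying until the reported length is non-zero and the buffer actually contains that many bytes.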


dosubot bot commented Jan 1, 2024

🤖

It's great that you've taken the time to investigate the issue and identify the specific code that's causing the problem. Your understanding of the event stream structure and the code's behavior is valuable. Your proposed fix and the context you've provided will be very helpful for the maintainers to understand the issue and evaluate the solution. I encourage you to go ahead and create a pull request with your proposed changes. Your contribution will be appreciated by the langchainjs community. Keep up the good work!


This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot bot added the auto:bug label Jan 1, 2024

Yofial commented Jan 1, 2024

I have a working fix. I'll have to set up an environment in order to submit a PR.

Yofial added a commit to Yofial/langchainjs that referenced this issue Jan 2, 2024
jacoblee93 added a commit that referenced this issue Jan 3, 2024
* correctly access the message length from the buffer. closes #3850

* Format and add to LLM class

* Style

---------

Co-authored-by: Brace Sproul <[email protected]>
Co-authored-by: jacoblee93 <[email protected]>