
Reported message length does not match received message length with langchain/chat_models/bedrock/web #3236

Closed
Manouchehri opened this issue Nov 11, 2023 · 2 comments · Fixed by #3364
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@Manouchehri
Contributor

When using BedrockChat with streaming, I always get Reported message length does not match received message length.

Might be related to #3187.


dosubot bot commented Nov 11, 2023

🤖

Hello again, @Manouchehri! It's good to see you're continuing to utilize langchainjs. I'm going to dive into this issue reported with BedrockChat and the mismatch in message length. I'll get back to you with a detailed response very soon.

@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Nov 11, 2023

dosubot bot commented Nov 11, 2023

🤖

The error "Reported message length does not match received message length" typically occurs when there is a mismatch between the expected length of a message and the actual length of the message received. In the context of the BedrockChat class in the langchain/chat_models/bedrock/web.ts file, this error could be related to the streaming functionality.

The _streamResponseChunks method in the BedrockChat class is responsible for handling the streaming of responses. It reads chunks of data from the response body, decodes them, and processes them. If the length of the chunk received does not match the expected length, it could lead to the error you're experiencing.
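One common cause of this kind of mismatch is that raw network chunks do not align with event-stream frame boundaries, so a decoder handed an unbuffered chunk sees a truncated (or concatenated) frame. As a rough sketch of the usual remedy, here is an illustrative buffer that accumulates bytes and only emits complete frames, assuming each frame declares its total length in its first four big-endian bytes (the class and method names here are hypothetical, not langchainjs's actual implementation):

```typescript
// Illustrative sketch: accumulate streamed bytes and emit only complete
// frames. Assumes the first 4 bytes of each frame (big-endian uint32)
// declare that frame's total length, as in AWS event-stream framing.
class FrameBuffer {
  private buffer = new Uint8Array(0);

  push(chunk: Uint8Array): Uint8Array[] {
    // Append the new chunk to whatever bytes are left over.
    const merged = new Uint8Array(this.buffer.byteLength + chunk.byteLength);
    merged.set(this.buffer, 0);
    merged.set(chunk, this.buffer.byteLength);
    this.buffer = merged;

    const frames: Uint8Array[] = [];
    while (this.buffer.byteLength >= 4) {
      const view = new DataView(
        this.buffer.buffer,
        this.buffer.byteOffset,
        this.buffer.byteLength
      );
      const frameLength = view.getUint32(0, false);
      if (frameLength < 4) {
        throw new Error(`invalid declared frame length: ${frameLength}`);
      }
      if (this.buffer.byteLength < frameLength) break; // wait for more bytes
      frames.push(this.buffer.slice(0, frameLength));
      this.buffer = this.buffer.slice(frameLength);
    }
    return frames;
  }
}
```

With this approach, a frame split across two network reads produces no output on the first `push` and one complete frame on the second, instead of a decode error.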

To debug this issue, you could add logging statements in the _streamResponseChunks method to log the length of the chunks received and compare them with the expected lengths. This could help you identify where the mismatch is occurring.
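A minimal sketch of that kind of instrumentation, written against a generic `AsyncIterable<Uint8Array>` for illustration (the real code reads the fetch response body with a stream reader; the function name and `expectedLength` parameter here are hypothetical):

```typescript
// Illustrative debugging helper: count the bytes actually received from
// a streamed body and compare the total against a declared length.
async function countStreamedBytes(
  chunks: AsyncIterable<Uint8Array>,
  expectedLength?: number
): Promise<number> {
  let received = 0;
  for await (const chunk of chunks) {
    received += chunk.byteLength;
    console.log(`chunk: ${chunk.byteLength} bytes (total ${received})`);
  }
  if (expectedLength !== undefined && received !== expectedLength) {
    console.warn(
      `length mismatch: expected ${expectedLength}, got ${received}`
    );
  }
  return received;
}
```

Logging each chunk boundary this way makes it easy to spot whether the mismatch comes from a single short read or from frames being split across reads.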

The expected message length when using the BedrockChat with streaming in the langchain/chat_models/bedrock/web is not explicitly defined in the provided context. The length of the message would depend on the response from the Bedrock service model and the specific language model used. However, there is a maxTokens parameter in the BedrockChat class, which could potentially limit the length of the generated text.

The exact implementation of the _streamResponseChunks method in the BedrockChat class can be found in the langchain/chat_models/bedrock/web.ts file in the repository. This method is an asynchronous generator that streams response chunks from the Bedrock service. It first determines the provider and the method to use for the Bedrock service. Then, it sends a signed fetch request to the service. If the response status is not successful, it throws an error.

If the provider is either "anthropic" or "cohere", it reads the response body in chunks. For each chunk, it decodes the event and checks whether it is a valid event chunk. If it is, it decodes the body and checks for a message field; if one is present, it throws that message as an error. If there are bytes, it decodes them, prepares the output, yields a new ChatGenerationChunk, and handles the new token.
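For context, Bedrock streams these events in the binary application/vnd.amazon.eventstream framing, where each message begins with a 12-byte prelude whose first four bytes declare the total message length. An error like the one in the issue title is characteristic of a decoder being handed a buffer whose actual size disagrees with that declared length. A minimal sketch of the check, assuming that framing (the function name is illustrative, not the actual decoder code):

```typescript
// Illustrative sketch of the length check in event-stream decoding.
// Assumed framing: 12-byte prelude (total length, headers length,
// prelude CRC) + headers + payload + 4-byte message CRC, so the
// minimum possible message is 16 bytes.
function checkEventFrameLength(frame: Uint8Array): void {
  if (frame.byteLength < 16) {
    throw new Error("frame shorter than minimum event-stream message");
  }
  const view = new DataView(frame.buffer, frame.byteOffset, frame.byteLength);
  const reportedLength = view.getUint32(0, false); // big-endian total length
  if (reportedLength !== frame.byteLength) {
    // This is the kind of condition behind "Reported message length does
    // not match received message length": the decoder received a partial
    // (or concatenated) frame rather than exactly one whole message.
    throw new Error(
      `Reported message length (${reportedLength}) does not match ` +
        `received message length (${frame.byteLength})`
    );
  }
}
```

Seen this way, the error is less about the model's output length and more about how the raw response bytes are sliced before decoding.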

If the provider is not "anthropic" or "cohere", it simply prepares the output from the response json and yields a new ChatGenerationChunk, and handles the new token.

I hope this helps! If you need further assistance, please provide more specific information about the exact circumstances under which the error occurs.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.
