release-21.1: changefeedccl: Increase message size limits for kafka sink. #76322
Backport 1/1 commits from #76265.
/cc @cockroachdb/release
The Sarama library, used by the Kafka sink, enforces maximum message
sizes locally. When those limits are exceeded, Sarama returns a
confusing error message that seems to imply the remote Kafka server
rejected the message, even though the rejection happened locally:
kafka server: Message was too large, server rejected it to avoid allocation error.
This PR addresses the problem by increasing the Sarama limits to 2GB
(max int32).
An alternative approach was to extend
kafka_sink_config
to specify the maximum message size. However, this alternative is less desirable.
For one, a user-supplied configuration can run afoul of other limits
imposed by the Sarama library (e.g.
MaxRequestSize
), so more configuration options would have to be added. In addition, this
would expose very low-level implementation details of the Sarama
library -- something that we probably should not do.
Fixes #76258
Release Notes (enterprise change): Kafka sink supports larger messages,
up to 2GB in size.