changefeedccl: Bump default per changefeed limit #96340

Merged 1 commit on Feb 2, 2023

Conversation

miretskiy
Contributor

Increase the default per-changefeed memory limit to 512MiB.
The limit was previously reduced from 1GiB due to concerns about changefeed impact on foreground latency. That impact was in turn due to how changefeeds allocated and managed memory. The issues that led to lowering this setting's default have since been addressed (via
improvements in memory management that reduce changefeed pressure on the Go runtime, as well as the addition of CPU pacing).

The 128MiB default limit is too low for file-based sinks. By default, those sinks buffer 16MB of data per file, and this is often combined with compression. If the data is highly compressible (5-10x, which is not uncommon), changefeeds exhaust the entire memory budget before reaching their target file size, resulting in many small files being produced.
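
To make the arithmetic concrete, here is a rough back-of-envelope sketch in Go. The numbers are illustrative only and not code from this PR: it assumes the 16MB figure is the sink's default per-file buffer target applied to the compressed output, and the compression ratio is hypothetical.

```go
package main

import "fmt"

func main() {
	const (
		mib              = 1 << 20
		targetFileSize   = 16 * mib  // default file size for file-based sinks (see above)
		compressionRatio = 10        // 5-10x compression is not uncommon
		oldDefault       = 128 * mib // previous per-changefeed memory budget
		newDefault       = 512 * mib // budget after this PR
	)
	// Uncompressed bytes that must be buffered to fill a single file of
	// targetFileSize after compression.
	needed := targetFileSize * compressionRatio
	fmt.Printf("buffered per file: %d MiB (old budget: %d MiB, new budget: %d MiB)\n",
		needed/mib, oldDefault/mib, newDefault/mib)
}
```

Under these assumptions, a single file already needs more uncompressed data in memory than the old 128MiB budget allows, so files get flushed well short of their target size.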

This PR splits the difference between the original 1GiB setting and the current 128MiB setting by increasing the default to 512MiB (1/2GiB).

Epic: None

Release note (enterprise change): Bump the default for the
`changefeed.memory.per_changefeed_limit` setting to 512MiB (1/2GiB). This should result in changefeeds being able to produce larger files.
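
For context, here is a minimal sketch of what the corresponding change likely looks like, assuming the limit is registered as a byte-size cluster setting via `settings.RegisterByteSizeSetting`; the variable name, setting class, and description text below are approximations, not copied from the PR.

```go
// Sketch only (not necessarily the exact diff in this PR): the
// per-changefeed memory limit is a byte-size cluster setting, so the
// change amounts to bumping its default from 128MiB to 512MiB.
package changefeedbase

import "github.com/cockroachdb/cockroach/pkg/settings"

// PerChangefeedMemLimit is an approximation of the real variable name;
// the setting class and description may also differ.
var PerChangefeedMemLimit = settings.RegisterByteSizeSetting(
	settings.TenantWritable,
	"changefeed.memory.per_changefeed_limit",
	"controls amount of data that can be buffered per changefeed",
	512<<20, // previously 128<<20 (128MiB); now 512MiB
)
```

Since this is a cluster setting, only the default changes: clusters that have explicitly overridden the setting keep their configured value.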

@miretskiy miretskiy requested a review from shermanCRL February 1, 2023 13:04
@miretskiy miretskiy requested a review from a team as a code owner February 1, 2023 13:04
@blathers-crl

blathers-crl bot commented Feb 1, 2023

It looks like your PR touches production code but doesn't add or edit any test code. Did you consider adding tests to your PR?

🦉 Hoot! I am a Blathers, a bot for CockroachDB. My owner is dev-inf.

@cockroach-teamcity
Member

This change is Reviewable

@miretskiy
Contributor Author

bors r+

@craig
Contributor

craig bot commented Feb 1, 2023

This PR was included in a batch that was canceled, it will be automatically retried

@craig
Contributor

craig bot commented Feb 2, 2023

Build failed (retrying...):

@craig
Contributor

craig bot commented Feb 2, 2023

Build succeeded:
