compact/downsampling/rewrite: Read chunks directly from object storage. #3416
Can I work on this?
Hello 👋 Looks like there was no activity on this issue for the last two months.
Still valid and help wanted. Please @goku321, go ahead! We can assign this to you if you are still interested.
Thanks @kakkoyun. Yes, I'm still interested. Currently, I'm trying to finish up the exemplars API feature, and then I'll move on to this one. Please feel free to assign it to me.
Has anyone considered using UploadPartCopy to do server-side concatenation of blocks during the compaction phase? This is exposed via the S3 multipart upload API. From the docs:
Hi @bwplotka, can you explain more about the implementation details? I couldn't find an existing implementation that solves this problem, so I'm curious about your idea for the implementation.
I've started looking into this. Thanks everyone for your patience.
Hello 👋 Looks like there was no activity on this issue for the last two months.
Closing for now as promised, let us know if you need this to be reopened! 🤗
Not stale 🥱
Hello 👋 Looks like there was no activity on this issue for the last two months. |
Hi, sounds like a great idea!
Instead of downloading all the chunk bytes for the blocks we want to process, we could simply read and stream them through, using a constant amount of memory/disk. We can do that because all of these operations go through chunks sequentially: series are sorted, and chunks are placed per series, ordered from oldest. See the Prometheus compaction tests for confirmation:
With this println:
We have: