
sidecar: fix issue #396: split response into chunks no bigger than 2^16 samples #718

Merged
3 commits merged on Jan 9, 2019

Conversation

robertjsullivan
Contributor

split response into chunks no bigger than 2^16 to avoid overflowing XOR compression

Signed-off-by: Robert Sullivan [email protected]

Changes

In prometheus.go, when building the response, we create multiple AggrChunk structs, each holding a slice of at most 2^16 samples. This is needed so the XOR encoding does not overflow.
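
For illustration, here is a minimal sketch of such a helper (assuming the math and prompb packages already imported in prometheus.go; the exact body in the PR may differ):

// chunkSamples splits a series' samples into slices of at most
// math.MaxUint16 (2^16 - 1) samples each, so every XOR-encoded chunk
// stays within the encoding's sample-count limit.
func chunkSamples(series prompb.TimeSeries) [][]prompb.Sample {
	var chunks [][]prompb.Sample
	samples := series.Samples
	for len(samples) > 0 {
		n := len(samples)
		if n > math.MaxUint16 {
			n = math.MaxUint16
		}
		chunks = append(chunks, samples[:n]) // sub-slices share the backing array
		samples = samples[n:]
	}
	return chunks
}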

Verification

We added a test to prometheus_test.go, and we also deployed the fix to the environment where we were seeing the bug described in issue #396. After pushing the changes, the same query that had previously been cutting off the most recent week of data returned a full 2 weeks of data.

Member

@bwplotka bwplotka left a comment


Nice! Thanks for this. Some suggestions.

@@ -176,6 +182,23 @@ func (p *PrometheusStore) Series(r *storepb.SeriesRequest, s storepb.Store_Serie
return nil
}

// XoR encoding supports a max size of 2^16 - 1 samples, so we need
// to chunk all samples into groups of no more than 2^16 - 1
func chunkSamples(series prompb.TimeSeries) [][]prompb.Sample {
Member

@bwplotka bwplotka Jan 9, 2019


This name could be misleading, as proper chunking should use at most 120 samples (:

I think we should name it a bit differently, so the reader can see explicitly that biggest-possible chunks are fine in our query case. Alternatively, keep this name but add a maxSamples int64 argument.

Also, since we have lots of raw samples here, we could consider reusing e.Samples instead of recreating multiple arrays of prompb.Sample, to decrease memory consumption for the sidecar. Each prompb.Sample array of length 2^16 is worth about 1 MB of memory. Maybe not much to optimize, not sure; it depends on how many samples you end up with. Plus, with lazy GC and multiple series, it might add up to quite a large number.

Up to you. We can start with something readable like this and move to micro-optimizations later, or move to slicing and operate on indexes only, e.g. (I wrote this on the bus, so I did not test it):

var aggr []storepb.AggrChunk
arr := e.Samples // arr shares the same underlying array, so no samples are copied
for len(arr) > 0 {
	n := len(arr)
	if n > math.MaxUint16 {
		n = math.MaxUint16
	}

	enc, cb, err := p.encodeChunk(arr[:n])
	if err != nil {
		return status.Error(codes.Unknown, err.Error())
	}
	aggr = append(aggr, storepb.AggrChunk{
		MinTime: int64(arr[0].Timestamp),
		MaxTime: int64(arr[n-1].Timestamp),
		Raw:     &storepb.Chunk{Type: enc, Data: cb},
	})
	arr = arr[n:]
}

What do you think?

Member

@bwplotka bwplotka left a comment


This is really good, LGTM and thanks.

3 minor comments, feel free to ignore.

// We generally expect all samples of the requested range to be traversed
// so we just encode all samples into one big chunk regardless of size.
enc, cb, err := p.encodeChunk(e.Samples)
// XoR encoding supports a max size of 2^16 - 1 samples, so we need
Member


Suggested change
// XoR encoding supports a max size of 2^16 - 1 samples, so we need
// XOR encoding supports a max size of 2^16 - 1 samples, so we need

testutil.Ok(t, err)
srv := newStoreSeriesServer(ctx)

err = proxy.Series(&storepb.SeriesRequest{
Member


You can inline testutil.Ok(t, proxy.Series..., but it's not a blocker.
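
For illustration, the inlined form would look roughly like this (the request fields are elided in the review, so this is only a sketch):

testutil.Ok(t, proxy.Series(&storepb.SeriesRequest{
	// ... same request fields as in the original call ...
}, srv))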


// Regression test for https://github.com/improbable-eng/thanos/issues/396.
func TestPrometheusStore_Series_SplitSamplesIntoChunksWithMaxSizeOfUint16_e2e(t *testing.T) {
defer leaktest.CheckTimeout(t, 10*time.Second)()
Member


👍


a := p.Appender()

offset := int64(math.MaxUint16 + 5)
Member


Maybe worth making it 3 chunks, for a more solid test? (:
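
For example (a sketch, not necessarily the final test code), an offset spanning two full chunks plus a small tail would exercise three chunks:

offset := int64(2*math.MaxUint16 + 5) // two full chunks of 2^16 - 1 samples plus a 5-sample third chunk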

@kitemongerer

@bwplotka I made those 3 changes you suggested
