Deserialization of MergingDigest BufferUnderflowException in 3.1 #90
Are you using the master branch, or a release? Master should be clean. A release is coming Real Soon Now (day job permitting).
On Fri, Jun 9, 2017 at 3:35 AM, Chris Larsen wrote:

Heya, I'm writing an adaptor and noticed that if I create and serialize a MergingDigest, I can't deserialize it. Instead I have to switch to the AVLTreeDigest implementation. Is that intentional? An example test:
import static org.junit.Assert.assertEquals;

import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;

import org.junit.Test;

import com.tdunning.math.stats.AVLTreeDigest;
import com.tdunning.math.stats.MergingDigest;
import com.tdunning.math.stats.TDigest;

@Test
public void mergingDigestSerDes() throws Exception {
    final TDigest out = MergingDigest.createDigest(100);
    out.add(42.5);
    out.add(1);
    out.add(24.0);
    assertEquals(40.649, out.quantile(0.95), 0.001);

    // Serialize with the compact encoding.
    final ByteBuffer output = ByteBuffer.allocate(out.smallByteSize());
    out.asSmallBytes(output);

    // Reading the bytes back as a MergingDigest underflows the buffer...
    ByteBuffer input = ByteBuffer.wrap(output.array());
    try {
        MergingDigest.fromBytes(input);
    } catch (BufferUnderflowException e) {
        System.out.println("WTF?");
    }

    // ...but the same bytes deserialize fine as an AVLTreeDigest.
    input = ByteBuffer.wrap(output.array());
    final TDigest in = AVLTreeDigest.fromBytes(input);
    assertEquals(40.649, in.quantile(0.95), 0.001);
}
And from #52, I'd like solution number 2, where the type of tree is also encoded in the byte array.
I'm using master, but I'm still getting this BufferUnderflowException when trying to do MergingDigest.fromBytes as well:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 4.0 failed 4 times, most recent failure: Lost task 2.3 in stage 4.0 (TID 83, 10.0.0.16): java.nio.BufferUnderflowException
    at java.nio.Buffer.nextGetIndex(Buffer.java:506)
    at java.nio.HeapByteBuffer.getDouble(HeapByteBuffer.java:514)
    at com.tdunning.math.stats.MergingDigest.fromBytes(MergingDigest.java:678)
That means there is a bug. Serialization without compression should work for you; try that (a sketch follows this list). And if you have a handy test case, I will get this fixed before the next release. There are currently three known blockers:
1) your bug
2) the repeated-number bug in AVLTreeDigest
3) the paper is only half updated
(and I have very little time to work on this)
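For reference, the uncompressed round trip suggested above would look roughly like this. This is a minimal sketch against the t-digest API, where asBytes/byteSize are the verbose counterparts of asSmallBytes/smallByteSize; it is not code from this issue:

import java.nio.ByteBuffer;
import com.tdunning.math.stats.MergingDigest;
import com.tdunning.math.stats.TDigest;

// Sketch of the suggested workaround: use the uncompressed (verbose)
// encoding, which MergingDigest.fromBytes should read back correctly.
TDigest digest = MergingDigest.createDigest(100);
digest.add(42.5);
ByteBuffer bytes = ByteBuffer.allocate(digest.byteSize());  // verbose size
digest.asBytes(bytes);                                      // verbose encoding
TDigest restored = MergingDigest.fromBytes(ByteBuffer.wrap(bytes.array()));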
Ah... I just looked at your code. The problem is that the serialization format isn't the same for the different t-digest implementations. We are looking at how to deal with that now under the topic of universal serialization. For now, this isn't intended to work (but it will work before long).
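For illustration, the universal-serialization idea (solution 2 from #52) amounts to prefixing the digest bytes with a type tag and dispatching on it when reading. The tag values and the readAny helper below are hypothetical names for this sketch, not the library's actual format:

import java.nio.ByteBuffer;
import com.tdunning.math.stats.AVLTreeDigest;
import com.tdunning.math.stats.MergingDigest;
import com.tdunning.math.stats.TDigest;

// Hypothetical type-tagged deserialization: the writer would put one of
// these tags before the digest's own bytes, and the reader dispatches on it.
static final int DIGEST_AVL = 1;
static final int DIGEST_MERGING = 2;

static TDigest readAny(ByteBuffer buf) {
    switch (buf.getInt()) {  // consume the leading type tag
        case DIGEST_AVL:     return AVLTreeDigest.fromBytes(buf);
        case DIGEST_MERGING: return MergingDigest.fromBytes(buf);
        default: throw new IllegalStateException("unknown digest type");
    }
}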
OK. There was also a bug in the test case for this that was preventing any testing of the large format. This should be all better now, and will be included in the 3.2 release.
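A round-trip test of the large format, mirroring the test at the top of this issue (same imports), might look like the following; this is a sketch, not the fix that actually landed:

// Sketch: round-trip a MergingDigest through the large (verbose) format.
@Test
public void mergingDigestLargeFormatRoundTrip() throws Exception {
    final TDigest out = MergingDigest.createDigest(100);
    out.add(42.5);
    out.add(1);
    out.add(24.0);
    final ByteBuffer bytes = ByteBuffer.allocate(out.byteSize());
    out.asBytes(bytes);  // large (verbose) encoding
    final TDigest in = MergingDigest.fromBytes(ByteBuffer.wrap(bytes.array()));
    assertEquals(out.quantile(0.95), in.quantile(0.95), 0.001);
}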