SPYT needs to support writing large strings and lists, and to provide a setting for configuring the value length limit. Writing a string longer than 16777216 bytes currently fails:
Caused by: java.util.concurrent.CompletionException: tech.ytsaurus.core.common.YTsaurusError: 'Value of type "string" is too long: length 25093876, limit 16777216'; full error: {"code"=1;"message"="Value of type "string" is too long: length 25093876, limit 16777216";"attributes"={"host"="...";"pid"=1183;"tid"=1025820748059212498u;"thread"="Worker:6";"fid"=18443790158341985566u;"datetime"="2024-10-21T14:01:02.311714Z";"trace_id"="a1bda6cf-4f47070b-4cd5d931-e3cb1602";"span_id"=10358239722602468708u;};}
at java.base/java.util.concurrent.CompletableFuture.reportJoin(CompletableFuture.java:412)
at java.base/java.util.concurrent.CompletableFuture.join(CompletableFuture.java:2044)
at tech.ytsaurus.client.StreamWriterImpl.push(StreamImpls.java:337)
at tech.ytsaurus.client.RawTableWriterImpl.write(StreamImpls.java:427)
at tech.ytsaurus.client.TableWriterBaseImpl.write(TableWriterImpl.java:89)
at tech.ytsaurus.client.TableWriterImpl.write(TableWriterImpl.java:119)
at tech.ytsaurus.client.TableWriter.write(TableWriter.java:24)
at tech.ytsaurus.spyt.serializers.InternalRowSerializer$.writeRowsRecursive(InternalRowSerializer.scala:156)
at tech.ytsaurus.spyt.serializers.InternalRowSerializer$.$anonfun$writeRows$1(InternalRowSerializer.scala:148)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
... 3 more
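A minimal repro sketch for the string case, assuming a SPYT build that exposes the `.yt` writer extension through `tech.ytsaurus.spyt._`; the destination path and column name are hypothetical:

```scala
import org.apache.spark.sql.SparkSession
import tech.ytsaurus.spyt._ // assumed to provide the .yt read/write extensions

object LargeStringRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("large-string-repro").getOrCreate()
    import spark.implicits._

    // Build a single string value larger than the 16 MiB (16777216 bytes) limit
    // reported in the error above.
    val bigString = "x" * (17 * 1024 * 1024)
    val df = Seq(bigString).toDF("payload")

    // Expected to fail with:
    //   Value of type "string" is too long: length ..., limit 16777216
    df.write.yt("//tmp/large_string_repro") // hypothetical destination path
  }
}
```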
The same error also reproduces when trying to write large lists.
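A similar sketch for the list case, reusing the session and imports from the snippet above; here no single element exceeds the limit, but the composite value presumably does once serialized (path and column name again hypothetical):

```scala
// Continues from the session in the previous sketch (spark, spark.implicits._, SPYT extensions).
// A single array value built from 20 elements of 1 MiB each, so each element is
// individually within the limit while the whole list value is not.
val bigList = Seq(Array.fill(20)("y" * (1024 * 1024)))
val listDf = bigList.toDF("items")

// Expected to fail with the same "too long" error, per the report above.
listDf.write.yt("//tmp/large_list_repro") // hypothetical destination path
```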