The CPU run fails with a duplicate map key. Note that setting spark.sql.mapKeyDedupPolicy=LAST_WIN makes the test pass.
[2023-11-02T17:07:56.059Z] E : java.lang.RuntimeException: Duplicate map key NaN was found, please check the input data. If you want to remove the duplicated keys, you can set spark.sql.mapKeyDedupPolicy to LAST_WIN so that the key inserted at last takes precedence.
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.util.ArrayBasedMapBuilder.put(ArrayBasedMapBuilder.scala:72)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.expressions.CreateMap.eval(complexTypeCreator.scala:229)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.optimizer.ConstantFolding$$anonfun$apply$1$$anonfun$applyOrElse$1.applyOrElse(expressions.scala:66)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.optimizer.ConstantFolding$$anonfun$apply$1$$anonfun$applyOrElse$1.applyOrElse(expressions.scala:54)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:317)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:73)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:317)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$3(TreeNode.scala:322)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.mapChild$2(TreeNode.scala:376)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$4(TreeNode.scala:437)
[2023-11-02T17:07:56.059Z] E at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
[2023-11-02T17:07:56.059Z] E at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[2023-11-02T17:07:56.059Z] E at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[2023-11-02T17:07:56.059Z] E at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[2023-11-02T17:07:56.059Z] E at scala.collection.TraversableLike.map(TraversableLike.scala:238)
[2023-11-02T17:07:56.059Z] E at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
[2023-11-02T17:07:56.059Z] E at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:437)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:405)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:358)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:322)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$3(TreeNode.scala:322)
[2023-11-02T17:07:56.059Z] E at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:407)
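To illustrate what the exception above is about: Spark normalizes all NaN map keys to a single canonical NaN, so two NaN keys collide, and spark.sql.mapKeyDedupPolicy decides whether that collision raises (EXCEPTION, the default) or keeps the last value (LAST_WIN). Below is a minimal pure-Python sketch of that dedup logic — a hypothetical stand-in for ArrayBasedMapBuilder, not Spark's actual implementation:

```python
import math


def build_map(pairs, dedup_policy="EXCEPTION"):
    """Sketch of Spark's map-key dedup behavior (hypothetical helper,
    not the real ArrayBasedMapBuilder)."""
    result = {}
    for key, value in pairs:
        # Spark normalizes every NaN key to one canonical NaN, so all
        # NaN keys are treated as duplicates of each other.
        if isinstance(key, float) and math.isnan(key):
            key = "NaN"
        if key in result:
            if dedup_policy == "LAST_WIN":
                # The key inserted last takes precedence.
                result[key] = value
            else:
                # EXCEPTION is the default policy, matching the trace above.
                raise RuntimeError(f"Duplicate map key {key} was found")
        else:
            result[key] = value
    return result
```

With LAST_WIN, `build_map([(float("nan"), 1), (float("nan"), 2)], "LAST_WIN")` yields a single entry holding 2; with the default policy the same input raises, mirroring why the CPU test only passes once LAST_WIN is set.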
abellina changed the title from "[BUG] test_map_scalars_supported_key_types fails with DATAGEN_SEED=1698940723" to "[BUG] test_coalesce fails with DATAGEN_SEED=1698940723" on Nov 13, 2023.