[SPARK-50380][SQL][TESTS][FOLLOWUP] Enable ANSI for conditional branches with error expression test

### What changes were proposed in this pull request?

This is a follow-up to recover the non-ANSI CI.
- #48918

### Why are the changes needed?

The original PR broke the non-ANSI CI because the test case assumes that ANSI mode is enabled.

- https://github.com/apache/spark/actions/runs/11964792566
- https://github.com/apache/spark/actions/runs/11982859814
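
The assumption matters because integral division by zero only raises an error when ANSI mode is on; with ANSI off, `Literal(1).div(0)` evaluates to null, so the optimizer can legally rewrite the branch and the expected plans in the test no longer match. A minimal sketch of that behavioral difference (illustrative only, not part of this patch):

```scala
import org.apache.spark.sql.SparkSession

// Sketch: the same expression behaves differently depending on spark.sql.ansi.enabled.
val spark = SparkSession.builder().master("local[1]").getOrCreate()

spark.conf.set("spark.sql.ansi.enabled", "false")
spark.sql("SELECT 1 div 0").show()     // non-ANSI: division by zero yields NULL

spark.conf.set("spark.sql.ansi.enabled", "true")
try {
  spark.sql("SELECT 1 div 0").show()   // ANSI: raises a divide-by-zero arithmetic error
} catch {
  case e: Exception => println(s"ANSI mode raised: ${e.getClass.getSimpleName}")
}
```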

### Does this PR introduce _any_ user-facing change?

No, this is a test-only change.

### How was this patch tested?

Manual tests.

**BEFORE**

```
$ SPARK_ANSI_SQL_MODE=false build/sbt "catalyst/testOnly *.ReorderAssociativeOperatorSuite -- -z SPARK-50380"
...
[info] *** 1 TEST FAILED ***
[error] Failed tests:
[error] 	org.apache.spark.sql.catalyst.optimizer.ReorderAssociativeOperatorSuite
[error] (catalyst / Test / testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 8 s, completed Nov 23, 2024, 11:50:45 AM
```

**AFTER**

```
$ SPARK_ANSI_SQL_MODE=false build/sbt "catalyst/testOnly *.ReorderAssociativeOperatorSuite -- -z SPARK-50380"
...
[info] ReorderAssociativeOperatorSuite:
[info] - SPARK-50380: conditional branches with error expression (508 milliseconds)
[info] Run completed in 1 second, 413 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 11 s, completed Nov 23, 2024, 11:51:34 AM
```

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #48943 from dongjoon-hyun/SPARK-50380.

Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
dongjoon-hyun committed Nov 24, 2024
1 parent 656ece1 commit d356b17
Showing 1 changed file with 13 additions and 10 deletions.
```diff
@@ -24,6 +24,7 @@ import org.apache.spark.sql.catalyst.expressions.aggregate.Count
 import org.apache.spark.sql.catalyst.plans.{Inner, PlanTest}
 import org.apache.spark.sql.catalyst.plans.logical.{LocalRelation, LogicalPlan}
 import org.apache.spark.sql.catalyst.rules.RuleExecutor
+import org.apache.spark.sql.internal.SQLConf
 
 class ReorderAssociativeOperatorSuite extends PlanTest {
 
@@ -109,15 +110,17 @@ class ReorderAssociativeOperatorSuite extends PlanTest {
   }
 
   test("SPARK-50380: conditional branches with error expression") {
-    val originalQuery1 = testRelation.select(If($"a" === 1, 1L, Literal(1).div(0) + $"b")).analyze
-    val optimized1 = Optimize.execute(originalQuery1)
-    comparePlans(optimized1, originalQuery1)
-
-    val originalQuery2 = testRelation.select(
-      If($"a" === 1, 1, ($"b" + Literal(Int.MaxValue)) + 1).as("col")).analyze
-    val optimized2 = Optimize.execute(originalQuery2)
-    val correctAnswer2 = testRelation.select(
-      If($"a" === 1, 1, $"b" + (Literal(Int.MaxValue) + 1)).as("col")).analyze
-    comparePlans(optimized2, correctAnswer2)
+    withSQLConf(SQLConf.ANSI_ENABLED.key -> true.toString) {
+      val originalQuery1 = testRelation.select(If($"a" === 1, 1L, Literal(1).div(0) + $"b")).analyze
+      val optimized1 = Optimize.execute(originalQuery1)
+      comparePlans(optimized1, originalQuery1)
+
+      val originalQuery2 = testRelation.select(
+        If($"a" === 1, 1, ($"b" + Literal(Int.MaxValue)) + 1).as("col")).analyze
+      val optimized2 = Optimize.execute(originalQuery2)
+      val correctAnswer2 = testRelation.select(
+        If($"a" === 1, 1, $"b" + (Literal(Int.MaxValue) + 1)).as("col")).analyze
+      comparePlans(optimized2, correctAnswer2)
+    }
   }
 }
```
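
Wrapping the assertions in `withSQLConf` keeps the change local to this test: the previous value of `spark.sql.ansi.enabled` is restored when the block exits, so the rest of the suite still runs under whatever default the CI job set. Roughly, the helper behaves like the following sketch (not Spark's actual `SQLHelper` implementation; `withSQLConfSketch` is an illustrative name):

```scala
import org.apache.spark.sql.internal.SQLConf

// Rough sketch of withSQLConf-style semantics: set the given conf entries,
// run the body, then restore (or unset) the previous values in a finally block.
def withSQLConfSketch[T](pairs: (String, String)*)(body: => T): T = {
  val conf = SQLConf.get
  val previous = pairs.map { case (key, _) =>
    key -> (if (conf.contains(key)) Some(conf.getConfString(key)) else None)
  }
  pairs.foreach { case (key, value) => conf.setConfString(key, value) }
  try body
  finally previous.foreach {
    case (key, Some(old)) => conf.setConfString(key, old)
    case (key, None) => conf.unsetConf(key)
  }
}

// Usage mirroring the patch: ANSI is forced on only inside the block.
withSQLConfSketch(SQLConf.ANSI_ENABLED.key -> "true") {
  assert(SQLConf.get.ansiEnabled)
}
```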
