[SPARK-23852][SQL] Add withSQLConf(...) to test case
## What changes were proposed in this pull request?

Add a `withSQLConf(...)` wrapper to force Parquet filter pushdown for a test that relies on it.
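For context, `withSQLConf(...)` is a Spark test helper that temporarily overrides SQL configuration entries for the duration of a block and restores the previous values afterwards, so the test no longer depends on the suite's default pushdown setting. A minimal, self-contained sketch of that save-set-restore pattern (using a plain mutable `Map` as a stand-in for Spark's real `SQLConf`; names here are illustrative, not Spark's actual internals):

```scala
object ConfDemo {
  // Stand-in for the session's SQL conf: a mutable key -> value map.
  val conf = scala.collection.mutable.Map[String, String]()

  // Set the given pairs, run the body, then restore the prior values
  // (or remove keys that were previously unset), mirroring the shape
  // of Spark's withSQLConf test helper.
  def withConf(pairs: (String, String)*)(body: => Unit): Unit = {
    val saved = pairs.map { case (k, _) => k -> conf.get(k) }
    pairs.foreach { case (k, v) => conf(k) = v }
    try body finally saved.foreach {
      case (k, Some(v)) => conf(k) = v
      case (k, None)    => conf.remove(k)
    }
  }

  def main(args: Array[String]): Unit = {
    conf("spark.sql.parquet.filterPushdown") = "false"
    withConf("spark.sql.parquet.filterPushdown" -> "true") {
      // Overridden inside the block.
      println(conf("spark.sql.parquet.filterPushdown"))
    }
    // Restored after the block.
    println(conf("spark.sql.parquet.filterPushdown"))
  }
}
```

`spark.sql.parquet.filterPushdown` is the conf key behind `SQLConf.PARQUET_FILTER_PUSHDOWN_ENABLED` used in the patch below; the restore step is what keeps one test's conf override from leaking into other tests in the suite.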

## How was this patch tested?

The modified test passes.

Author: Henry Robinson <[email protected]>

Closes #21323 from henryr/spark-23582.

(cherry picked from commit 061e008)
Signed-off-by: Marcelo Vanzin <[email protected]>
henryr authored and Marcelo Vanzin committed May 14, 2018
1 parent a8ee570 commit 6dfb515
Showing 1 changed file with 8 additions and 6 deletions.
```diff
@@ -604,13 +604,15 @@ class ParquetFilterSuite extends QueryTest with ParquetTest with SharedSQLContext
   }

   test("SPARK-23852: Broken Parquet push-down for partially-written stats") {
-    // parquet-1217.parquet contains a single column with values -1, 0, 1, 2 and null.
-    // The row-group statistics include null counts, but not min and max values, which
-    // triggers PARQUET-1217.
-    val df = readResourceParquetFile("test-data/parquet-1217.parquet")
+    withSQLConf(SQLConf.PARQUET_FILTER_PUSHDOWN_ENABLED.key -> "true") {
+      // parquet-1217.parquet contains a single column with values -1, 0, 1, 2 and null.
+      // The row-group statistics include null counts, but not min and max values, which
+      // triggers PARQUET-1217.
+      val df = readResourceParquetFile("test-data/parquet-1217.parquet")

-    // Will return 0 rows if PARQUET-1217 is not fixed.
-    assert(df.where("col > 0").count() === 2)
+      // Will return 0 rows if PARQUET-1217 is not fixed.
+      assert(df.where("col > 0").count() === 2)
+    }
   }
 }
```
