[KYUUBI #5362] Remove Spark 3.0 support for Authz
### _Why are the changes needed?_
To close #5362.

Considering the maintenance burden on the Kyuubi community and the ease of cross-supporting data lake projects, we drop support for the EOL Spark 3 releases ahead of the coming Spark 4.x era, starting with Kyuubi v1.9. We will still publish bugfix releases for Spark 3.0.x users.
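
For users staying on a supported line, the Authz module can still be built and tested locally the same way the removed CI job did; a minimal sketch, with the Spark version purely illustrative:

```shell
# Run the Authz tests against a still-supported Spark version (3.3.3 is illustrative)
TEST_MODULES="extensions/spark/kyuubi-spark-authz"
./build/mvn clean test -pl ${TEST_MODULES} -am -Dspark.version=3.3.3
```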

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [x] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before making a pull request

### _Was this patch authored or co-authored using generative AI tooling?_
No

Closes #5363 from AngersZhuuuu/KYUUBI-5362.

Closes #5362

d34cd6e [Angerszhuuuu] Update build.md
99f414b [Angerszhuuuu] Update build.md
a5129e4 [Angerszhuuuu] Update build.md
6ee008c [Angerszhuuuu] Update README.md
af792cc [Angerszhuuuu] Update master.yml
69b3331 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5362
528554e [Angerszhuuuu] Update IcebergCatalogPrivilegesBuilderSuite.scala
427ebd4 [Angerszhuuuu] Update DataMaskingForJDBCV2Suite.scala
64809a5 [Angerszhuuuu] update
f7d89fd [Angerszhuuuu] [KYUUBI-5362] Kyuubi remove Authz test for spark3.0.3

Authored-by: Angerszhuuuu <[email protected]>
Signed-off-by: Cheng Pan <[email protected]>
AngersZhuuuu authored and pan3793 committed Oct 11, 2023
1 parent e1d213e commit 79b147a
Showing 15 changed files with 113 additions and 259 deletions.
43 changes: 0 additions & 43 deletions .github/workflows/master.yml
@@ -127,49 +127,6 @@ jobs:
          **/kyuubi-spark-sql-engine.log*
          **/kyuubi-spark-batch-submit.log*
  authz:
    name: Kyuubi-AuthZ and Spark Test
    runs-on: ubuntu-22.04
    strategy:
      fail-fast: false
      matrix:
        java:
          - 8
          - 11
        spark:
          - '3.0.3'
        comment: ["normal"]
    env:
      SPARK_LOCAL_IP: localhost
    steps:
      - uses: actions/checkout@v3
      - name: Tune Runner VM
        uses: ./.github/actions/tune-runner-vm
      - name: Setup JDK ${{ matrix.java }}
        uses: actions/setup-java@v3
        with:
          distribution: temurin
          java-version: ${{ matrix.java }}
          cache: 'maven'
          check-latest: false
      - name: Setup Maven
        uses: ./.github/actions/setup-maven
      - name: Cache Engine Archives
        uses: ./.github/actions/cache-engine-archives
      - name: Build and test Kyuubi AuthZ with supported Spark versions
        run: |
          TEST_MODULES="extensions/spark/kyuubi-spark-authz"
          ./build/mvn clean test ${MVN_OPT} -pl ${TEST_MODULES} -am \
            -Dspark.version=${{ matrix.spark }}
      - name: Upload test logs
        if: failure()
        uses: actions/upload-artifact@v3
        with:
          name: unit-tests-log-java-${{ matrix.java }}-spark-${{ matrix.spark }}-${{ matrix.comment }}
          path: |
            **/target/unit-tests.log
            **/kyuubi-spark-sql-engine.log*
  scala-test:
    name: Scala Test
    runs-on: ubuntu-22.04
2 changes: 1 addition & 1 deletion docs/security/authorization/spark/build.md
@@ -51,7 +51,7 @@ The available `spark.version`s are shown in the following table.
| 3.3.x | √ | - |
| 3.2.x | √ | - |
| 3.1.x | √ | - |
| 3.0.x | √ | - |
| 3.0.x | x | EOL since v1.9.0 |
| 2.4.x and earlier | × | [PR 2367](https://github.com/apache/kyuubi/pull/2367) is used to track how we work with older releases with scala 2.11 |

Currently, only Spark releases built with Scala 2.12 are supported.
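
A complete packaging command consuming the table above might look as follows; a sketch only, with the version illustrative (compare the README excerpt below):

```shell
# Package the Authz plugin against a chosen, still-supported Spark version
build/mvn clean package -DskipTests -pl :kyuubi-spark-authz_2.12 -am -Dspark.version=3.3.3
```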
2 changes: 1 addition & 1 deletion extensions/spark/kyuubi-spark-authz/README.md
@@ -38,7 +38,7 @@ build/mvn clean package -DskipTests -pl :kyuubi-spark-authz_2.12 -am -Dspark.ver
- [x] 3.3.x
- [x] 3.2.x
- [x] 3.1.x
- [x] 3.0.x
- [ ] 3.0.x
- [ ] 2.4.x and earlier

### Supported Apache Ranger Versions
@@ -22,17 +22,14 @@ import org.scalatest.Outcome
import org.apache.kyuubi.Utils
import org.apache.kyuubi.plugin.spark.authz.OperationType._
import org.apache.kyuubi.plugin.spark.authz.ranger.AccessType
import org.apache.kyuubi.plugin.spark.authz.util.AuthZUtils._
import org.apache.kyuubi.tags.IcebergTest
import org.apache.kyuubi.util.AssertionUtils._

@IcebergTest
class IcebergCatalogPrivilegesBuilderSuite extends V2CommandsPrivilegesSuite {
override protected val catalogImpl: String = "hive"
override protected val sqlExtensions: String =
if (isSparkV31OrGreater) {
"org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
} else ""
"org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
override protected def format = "iceberg"

override protected val supportsUpdateTable = false
@@ -42,20 +39,17 @@ class IcebergCatalogPrivilegesBuilderSuite extends V2CommandsPrivilegesSuite {
override protected val supportsPartitionManagement = false

override def beforeAll(): Unit = {
if (isSparkV31OrGreater) {
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.iceberg.spark.SparkCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.type", "hadoop")
spark.conf.set(
s"spark.sql.catalog.$catalogV2.warehouse",
Utils.createTempDir("iceberg-hadoop").toString)
}
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.iceberg.spark.SparkCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.type", "hadoop")
spark.conf.set(
s"spark.sql.catalog.$catalogV2.warehouse",
Utils.createTempDir("iceberg-hadoop").toString)
super.beforeAll()
}

override def withFixture(test: NoArgTest): Outcome = {
assume(isSparkV31OrGreater)
test()
}

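Outside the test harness, the Iceberg catalog wiring these suites now apply unconditionally maps to plain Spark session configuration; a minimal sketch, assuming the matching iceberg-spark-runtime jar is on the classpath, with an illustrative catalog name and warehouse path:

```shell
# spark-sql session configured the same way the suite's beforeAll configures itself
spark-sql \
  --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
  --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
  --conf spark.sql.catalog.local.type=hadoop \
  --conf spark.sql.catalog.local.warehouse=/tmp/iceberg-warehouse
```
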
@@ -662,7 +662,6 @@ abstract class PrivilegesBuilderSuite extends AnyFunSuite
}

test("RefreshFunctionCommand") {
assume(isSparkV31OrGreater)
sql(s"CREATE FUNCTION RefreshFunctionCommand AS '${getClass.getCanonicalName}'")
val plan = sql("REFRESH FUNCTION RefreshFunctionCommand")
.queryExecution.analyzed
@@ -23,7 +23,6 @@ import scala.util.Try
import org.scalatest.Outcome

import org.apache.kyuubi.plugin.spark.authz.serde._
import org.apache.kyuubi.plugin.spark.authz.util.AuthZUtils._
import org.apache.kyuubi.util.AssertionUtils._

class V2JdbcTableCatalogPrivilegesBuilderSuite extends V2CommandsPrivilegesSuite {
@@ -39,15 +38,13 @@ class V2JdbcTableCatalogPrivilegesBuilderSuite extends V2CommandsPrivilegesSuite
val jdbcUrl: String = s"$dbUrl;create=true"

override def beforeAll(): Unit = {
if (isSparkV31OrGreater) {
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.spark.sql.execution.datasources.v2.jdbc.JDBCTableCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.url", jdbcUrl)
spark.conf.set(
s"spark.sql.catalog.$catalogV2.driver",
"org.apache.derby.jdbc.AutoloadedDriver")
}
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.spark.sql.execution.datasources.v2.jdbc.JDBCTableCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.url", jdbcUrl)
spark.conf.set(
s"spark.sql.catalog.$catalogV2.driver",
"org.apache.derby.jdbc.AutoloadedDriver")
super.beforeAll()
}

@@ -61,7 +58,6 @@ class V2JdbcTableCatalogPrivilegesBuilderSuite extends V2CommandsPrivilegesSuite
}

override def withFixture(test: NoArgTest): Outcome = {
assume(isSparkV31OrGreater)
test()
}

@@ -37,47 +37,42 @@ import org.apache.kyuubi.util.AssertionUtils._
class IcebergCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
override protected val catalogImpl: String = "hive"
override protected val sqlExtensions: String =
if (isSparkV31OrGreater)
"org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
else ""
"org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"

val catalogV2 = "local"
val namespace1 = icebergNamespace
val table1 = "table1"
val outputTable1 = "outputTable1"

override def withFixture(test: NoArgTest): Outcome = {
assume(isSparkV31OrGreater)
test()
}

override def beforeAll(): Unit = {
if (isSparkV31OrGreater) {
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.iceberg.spark.SparkCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.type", "hadoop")
spark.conf.set(
s"spark.sql.catalog.$catalogV2.warehouse",
Utils.createTempDir("iceberg-hadoop").toString)

super.beforeAll()

doAs(admin, sql(s"CREATE DATABASE IF NOT EXISTS $catalogV2.$namespace1"))
doAs(
admin,
sql(s"CREATE TABLE IF NOT EXISTS $catalogV2.$namespace1.$table1" +
" (id int, name string, city string) USING iceberg"))
spark.conf.set(
s"spark.sql.catalog.$catalogV2",
"org.apache.iceberg.spark.SparkCatalog")
spark.conf.set(s"spark.sql.catalog.$catalogV2.type", "hadoop")
spark.conf.set(
s"spark.sql.catalog.$catalogV2.warehouse",
Utils.createTempDir("iceberg-hadoop").toString)

doAs(
admin,
sql(s"INSERT INTO $catalogV2.$namespace1.$table1" +
" (id , name , city ) VALUES (1, 'liangbowen','Guangzhou')"))
doAs(
admin,
sql(s"CREATE TABLE IF NOT EXISTS $catalogV2.$namespace1.$outputTable1" +
" (id int, name string, city string) USING iceberg"))
}
super.beforeAll()

doAs(admin, sql(s"CREATE DATABASE IF NOT EXISTS $catalogV2.$namespace1"))
doAs(
admin,
sql(s"CREATE TABLE IF NOT EXISTS $catalogV2.$namespace1.$table1" +
" (id int, name string, city string) USING iceberg"))

doAs(
admin,
sql(s"INSERT INTO $catalogV2.$namespace1.$table1" +
" (id , name , city ) VALUES (1, 'liangbowen','Guangzhou')"))
doAs(
admin,
sql(s"CREATE TABLE IF NOT EXISTS $catalogV2.$namespace1.$outputTable1" +
" (id int, name string, city string) USING iceberg"))
}

override def afterAll(): Unit = {
@@ -567,11 +567,7 @@ class HiveCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
someone, {
sql(s"select * from $db1.$permView").collect()
}))
if (isSparkV31OrGreater) {
assert(e1.getMessage.contains(s"does not have [select] privilege on [$db1/$permView/id]"))
} else {
assert(e1.getMessage.contains(s"does not have [select] privilege on [$db1/$table/id]"))
}
assert(e1.getMessage.contains(s"does not have [select] privilege on [$db1/$permView/id]"))
}
}

@@ -590,22 +586,12 @@ class HiveCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
// query all columns of the permanent view
// with access privileges to the permanent view but no privilege to the source table
val sql1 = s"SELECT * FROM $db1.$permView"
if (isSparkV31OrGreater) {
doAs(userPermViewOnly, { sql(sql1).collect() })
} else {
val e1 = intercept[AccessControlException](doAs(userPermViewOnly, { sql(sql1).collect() }))
assert(e1.getMessage.contains(s"does not have [select] privilege on [$db1/$table/id]"))
}
doAs(userPermViewOnly, { sql(sql1).collect() })

// query the second column of permanent view with multiple columns
// with access privileges to the permanent view but no privilege to the source table
val sql2 = s"SELECT name FROM $db1.$permView"
if (isSparkV31OrGreater) {
doAs(userPermViewOnly, { sql(sql2).collect() })
} else {
val e2 = intercept[AccessControlException](doAs(userPermViewOnly, { sql(sql2).collect() }))
assert(e2.getMessage.contains(s"does not have [select] privilege on [$db1/$table/name]"))
}
doAs(userPermViewOnly, { sql(sql2).collect() })
}
}
