Shim Layer to support multiple Spark versions #402

Closed
wants to merge 49 commits into from
Changes from 41 commits
Commits
49 commits
688c757
Working 30 and 31
tgravescs Jul 14, 2020
e9b3268
Build with both spark3.0 and spark 3.1
tgravescs Jul 14, 2020
ab09032
minor fixes
tgravescs Jul 14, 2020
e511110
Formatting
tgravescs Jul 14, 2020
0d97687
put back building configs
tgravescs Jul 16, 2020
773edfd
Move GpuFileSourceScanExec to spark specific dirs
Jul 16, 2020
8b6ec0a
Fix order of params
tgravescs Jul 16, 2020
ad9f2e0
remove logging
tgravescs Jul 17, 2020
c48e918
move spark30 and spark31 into shims modules
tgravescs Jul 17, 2020
c98b38e
Add missing files
tgravescs Jul 17, 2020
918cbd4
Move packages and use serviceloader
tgravescs Jul 20, 2020
bcd5eb0
Move GpuFirst to shim
tgravescs Jul 20, 2020
edf01fa
First and Last moved
tgravescs Jul 20, 2020
af27493
Allow multiple serviceloaders in dist jar
tgravescs Jul 20, 2020
06c403d
Cleanup
tgravescs Jul 20, 2020
5d45aeb
Fixes
tgravescs Jul 21, 2020
e138385
Cleanup
tgravescs Jul 21, 2020
b77bf34
pom fixes to generate docs
tgravescs Jul 21, 2020
261bcc7
Fix Suite for shim classes and cleanup
tgravescs Jul 21, 2020
5731bb9
shim layer for Rapids Shuffle Manager
tgravescs Jul 21, 2020
46db449
Shim for shuffle manager
tgravescs Jul 21, 2020
af1d79d
add in getRapidsShuffleManagerClass
tgravescs Jul 21, 2020
7952d9f
Cleanup shuffle manager
tgravescs Jul 21, 2020
e495820
Changes for shuffle manager
tgravescs Jul 21, 2020
50bad9d
Cleanup
tgravescs Jul 21, 2020
350c34b
Change spark3.1 getGpuBuildSide
tgravescs Jul 21, 2020
0a9aeed
MapOutputTracker api
tgravescs Jul 21, 2020
9b611f4
shim for mapoutputTracker api
tgravescs Jul 21, 2020
27d786c
explicitly set version in shims
tgravescs Jul 21, 2020
7b7e26e
Merge remote-tracking branch 'origin/branch-0.2' into shimBranch0.2
tgravescs Jul 21, 2020
df7916d
Move ScalaUDF to Shim
tgravescs Jul 22, 2020
4632228
Remove unneeded use of GPUBuildSide
tgravescs Jul 22, 2020
b798a05
Revert some changes to joins
tgravescs Jul 22, 2020
456b784
Fix spacing in pom
tgravescs Jul 22, 2020
fa1b463
More join changes
tgravescs Jul 22, 2020
5bb4f99
more cleanup
tgravescs Jul 22, 2020
f9efe33
more cleanup
tgravescs Jul 22, 2020
6661e4a
more cleanup
tgravescs Jul 22, 2020
507fbe5
Merge remote-tracking branch 'origin/branch-0.2' into shimBranch0.2
tgravescs Jul 22, 2020
acece84
Fix merge issue
tgravescs Jul 22, 2020
ba377d4
Add newline
tgravescs Jul 22, 2020
8f26f12
fix line length
tgravescs Jul 22, 2020
b1b6155
Fix import order
tgravescs Jul 22, 2020
e637012
Remove unneeded changes in GpuFirst
tgravescs Jul 22, 2020
4aed4d0
Cleanup poms and versions check for 3.1
tgravescs Jul 22, 2020
9bddb28
move slf4j dep up
tgravescs Jul 22, 2020
c89bdf9
Change parent pom path
tgravescs Jul 22, 2020
787e12b
move rat exclude check to shim poms since parent changed
tgravescs Jul 22, 2020
9540447
Switch to use parent pom instead of aggregator module
tgravescs Jul 22, 2020
37 changes: 37 additions & 0 deletions dist/pom.xml
@@ -41,6 +41,16 @@
<artifactId>rapids-4-spark-shuffle_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<scope>provided</scope>
</dependency>
</dependencies>

<build>
@@ -49,6 +59,9 @@
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
Collaborator Author:

This allows multiple service-loader registration files to be properly combined into the single jar.

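The comment above refers to the `META-INF/services` registration files that `ServicesResourceTransformer` concatenates while shading; without it, one shim's file would overwrite the other's. A minimal sketch of the discovery mechanism, where the `SparkShimServiceProvider` trait members are an assumption inferred from the service file name, not the plugin's actual interface:

```scala
import java.util.ServiceLoader
import scala.collection.JavaConverters._

// Stand-in for the plugin's provider interface; the member shown here
// is a hypothetical, inferred from the service file name in this PR.
trait SparkShimServiceProvider {
  def matchesVersion(version: String): Boolean
}

object ShimDiscovery {
  // ServiceLoader reads every META-INF/services registration on the
  // classpath. Because ServicesResourceTransformer concatenates the
  // per-shim registration files during shading, both the spark30 and
  // spark31 providers remain discoverable in the single dist jar.
  def findProvider(sparkVersion: String): Option[SparkShimServiceProvider] =
    ServiceLoader.load(classOf[SparkShimServiceProvider])
      .iterator().asScala
      .find(_.matchesVersion(sparkVersion))
}
```

In a bare environment with no registrations on the classpath, the lookup simply returns `None`, which is why a missing transformer silently drops one shim rather than failing the build.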
</transformers>
<shadedArtifactAttached>false</shadedArtifactAttached>
<createDependencyReducedPom>true</createDependencyReducedPom>
<relocations>
@@ -94,6 +107,30 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<executions>
<execution>
<id>update_config</id>
<phase>verify</phase>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<configuration>
<launchers>
<launcher>
<id>update_rapids_config</id>
<mainClass>com.nvidia.spark.rapids.RapidsConf</mainClass>
<args>
<arg>${project.basedir}/../docs/configs.md</arg>
Collaborator Author:

Doc generation had to move here because it requires shim-layer functions.

Member:

Is that why spark-sql was added as a dependency? I wasn't clear on why that was needed otherwise.

Collaborator Author:

yes

</args>
</launcher>
</launchers>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
7 changes: 5 additions & 2 deletions docs/get-started/getting-started.md
@@ -417,11 +417,14 @@ With `nv_peer_mem`, IB/RoCE-based transfers can perform zero-copy transfers dire
2) Install [UCX 1.8.1](https://github.com/openucx/ucx/releases/tag/v1.8.1).

3) You will need to configure your spark job with extra settings for UCX (we are looking to
simplify these settings in the near future):
simplify these settings in the near future). Choose the version of the shuffle manager
that matches your Spark version. Currently we support
Spark 3.0 (com.nvidia.spark.rapids.spark30.RapidsShuffleManager) and
Spark 3.1 (com.nvidia.spark.rapids.spark31.RapidsShuffleManager):
Member:

Apache Spark really broke the shuffle manager API in a minor version? 😞

Collaborator Author:

It's not a minor-version change, 3.0 to 3.1; plus, the ShuffleManager API is officially private.

Member:

Bummer. Wish we could set up the shuffle manager via the main plugin so users don't have to get out the decoder ring when configuring the shuffle.

Collaborator Author:

Sorry, I guess you mean FEATURE version; I was thinking maintenance release. But it's private right now anyway.

Collaborator:

The main issue, I believe, is whether the executors need the shuffle manager defined before the SQL plugin gets instantiated. I'd need to do some testing, but it seems like this should be a follow-up issue.

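The "decoder ring" concern above boils down to mapping a Spark version onto one of the two documented class names. As an illustration only, a hypothetical helper (not part of the plugin; only the two class names in this section are real) could derive the setting:

```scala
object ShuffleManagerConfig {
  // Maps a Spark version string to the matching RapidsShuffleManager
  // class name. Only the 3.0 and 3.1 shims exist in this PR, so any
  // other version is rejected.
  def shuffleManagerClass(sparkVersion: String): String = {
    val shim = sparkVersion.split('.').take(2).mkString match {
      case "30" => "spark30"
      case "31" => "spark31"
      case other =>
        throw new IllegalArgumentException(
          s"Unsupported Spark version: $sparkVersion")
    }
    s"com.nvidia.spark.rapids.$shim.RapidsShuffleManager"
  }
}
```

For example, `shuffleManagerClass("3.0.0")` yields the `spark30` class shown in the shell snippet below; as noted in the thread, doing this automatically from the main plugin is left as a follow-up.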

```shell
...
--conf spark.shuffle.manager=com.nvidia.spark.RapidsShuffleManager \
--conf spark.shuffle.manager=com.nvidia.spark.rapids.spark30.RapidsShuffleManager \
--conf spark.shuffle.service.enabled=false \
--conf spark.rapids.shuffle.transport.enabled=true \
--conf spark.executorEnv.UCX_TLS=cuda_copy,cuda_ipc,rc,tcp \
25 changes: 25 additions & 0 deletions integration_tests/pom.xml
@@ -28,14 +28,39 @@
<artifactId>rapids-4-spark-integration-tests_2.12</artifactId>
<version>0.2.0-SNAPSHOT</version>

<properties>
<slf4j.version>1.7.30</slf4j.version>
tgravescs marked this conversation as resolved.
<spark.test.version>3.0.0</spark.test.version>
</properties>
<profiles>
<profile>
<id>spark31tests</id>
<properties>
<spark.test.version>3.1.0-SNAPSHOT</spark.test.version>
</properties>
</profile>
</profiles>

<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jul-to-slf4j</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${slf4j.version}</version>
<!-- runtime scope is appropriate, but causes SBT build problems -->
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.test.version}</version>
tgravescs marked this conversation as resolved.
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
@@ -16,7 +16,7 @@

package com.nvidia.spark.rapids.tests.mortgage

import com.nvidia.spark.RapidsShuffleManager
import com.nvidia.spark.rapids.ShimLoader
import org.scalatest.FunSuite

import org.apache.spark.sql.SparkSession
@@ -34,7 +34,7 @@ class MortgageSparkSuite extends FunSuite {
.config("spark.rapids.sql.test.enabled", false)
.config("spark.rapids.sql.incompatibleOps.enabled", true)
.config("spark.rapids.sql.hasNans", false)
val rapidsShuffle = classOf[RapidsShuffleManager].getCanonicalName
val rapidsShuffle = ShimLoader.getSparkShims.getRapidsShuffleManagerClass
val prop = System.getProperty("rapids.shuffle.manager.override", "false")
if (prop.equalsIgnoreCase("true")) {
println("RAPIDS SHUFFLE MANAGER ACTIVE")
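The test diff above replaces a compile-time reference to `RapidsShuffleManager` with a shim lookup. A hedged sketch of the shape that API might take, where only the method name comes from the diff and the trait and implementation bodies are assumptions:

```scala
// Hypothetical slice of the shim API exercised by the suites in this PR;
// the real SparkShims trait has many more members.
trait SparkShims {
  def getRapidsShuffleManagerClass: String
}

// What a Spark 3.0 shim implementation could plausibly look like: each
// shim returns its own version-specific shuffle manager class name.
class Spark30Shims extends SparkShims {
  override def getRapidsShuffleManagerClass: String =
    "com.nvidia.spark.rapids.spark30.RapidsShuffleManager"
}
```

The design benefit is that the test code no longer links against any one shim at compile time, so a single test jar works against whichever Spark version is on the classpath.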
@@ -16,7 +16,7 @@

package com.nvidia.spark.rapids.tests.tpch

import com.nvidia.spark.RapidsShuffleManager
import com.nvidia.spark.rapids.ShimLoader
import com.nvidia.spark.rapids.{ColumnarRdd, ExecutionPlanCaptureCallback}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

@@ -44,7 +44,7 @@ class TpchLikeSparkSuite extends FunSuite with BeforeAndAfterAll {
.config("spark.rapids.sql.explain", true)
.config("spark.rapids.sql.incompatibleOps.enabled", true)
.config("spark.rapids.sql.hasNans", false)
val rapidsShuffle = classOf[RapidsShuffleManager].getCanonicalName
val rapidsShuffle = ShimLoader.getSparkShims.getRapidsShuffleManagerClass
val prop = System.getProperty("rapids.shuffle.manager.override", "false")
if (prop.equalsIgnoreCase("true")) {
println("RAPIDS SHUFFLE MANAGER ACTIVE")
15 changes: 15 additions & 0 deletions pom.xml
@@ -76,6 +76,7 @@
<module>sql-plugin</module>
<module>tests</module>
<module>integration_tests</module>
<module>shims</module>
<module>api_validation</module>
</modules>

@@ -128,6 +129,9 @@
<rat.consoleOutput>true</rat.consoleOutput>
</properties>
</profile>
<profile>
<id>spark31tests</id>
</profile>
</profiles>

<properties>
@@ -485,6 +489,7 @@
<exclude>*.jar</exclude>
<exclude>docs/demo/**/*.ipynb</exclude>
<exclude>docs/demo/**/*.zpln</exclude>
<exclude>**/META-INF/services/com.nvidia.spark.rapids.SparkShimServiceProvider</exclude>
</excludes>
</configuration>
</plugin>
@@ -547,5 +552,15 @@
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>apache-snapshots-repo</id>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
</project>
49 changes: 49 additions & 0 deletions shims/aggregator/pom.xml
@@ -0,0 +1,49 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2020, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
tgravescs marked this conversation as resolved.
<version>0.2.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_2.12</artifactId>
<packaging>jar</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shim Aggregator</name>
<description>The RAPIDS SQL plugin for Apache Spark Shim Aggregator</description>
<version>0.2.0-SNAPSHOT</version>

<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark31_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark30_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>compile</scope>
</dependency>
</dependencies>
</project>
40 changes: 40 additions & 0 deletions shims/pom.xml
@@ -0,0 +1,40 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2020, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>0.2.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims_aggregator_2.12</artifactId>
<packaging>pom</packaging>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Shims</name>
<description>The RAPIDS SQL plugin for Apache Spark Shims</description>
<version>0.2.0-SNAPSHOT</version>

<modules>
<module>spark30</module>
Collaborator:

Just FYI, First and Last changed recently between 3.0.0 and 3.0.1, so they are no longer compatible. Do we want to name these so it is clearer? Perhaps spark3_0_0. Or is there going to be some kind of hierarchy so a 3.0.1 shim can depend on spark30?

Collaborator Author:

So I made it 30 as I wasn't sure what would change in a minor release, and figured we could create a spark301 if we needed to. We can certainly make it 300 right now if you want; I haven't tested with 3.0.1 to see what all is different there. There is no hierarchy at this point; I think that could get confusing pretty quickly.

Collaborator:

I agree that it could get confusing quickly. I am also concerned about a lot of copy/paste. The changes that went into 3.0.1 for First/Last are the same that went into 3.1, so does that mean we copy/paste these twice? What about version 3.0.2, when nothing changed between 3.0.1 and 3.0.2 except one minor thing? Do we have to copy and paste code changes for each minor release from here on out?

Collaborator Author (tgravescs, Jul 22, 2020):

Right now, yes. Actually, I take back my first statement: I would prefer to leave this as 30 and 31 until we pull in stuff from 3.0.1 and so forth; we can add modules and rearrange at that point. My problem now is that it takes a long time to upmerge to Spark 3.1 every day while other RAPIDS changes keep moving, and then I have to constantly retest all versions. I would like to get a base set of changes in and iterate on it. We could look at making other hierarchies, but you have to watch for circular dependencies and such.

<module>spark31</module>
<module>aggregator</module>
</modules>
</project>
109 changes: 109 additions & 0 deletions shims/spark30/pom.xml
@@ -0,0 +1,109 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2020, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>0.2.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-spark30_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark SQL Plugin Spark 3.0 Shim</name>
<description>The RAPIDS SQL plugin for Apache Spark 3.0 Shim</description>
<version>0.2.0-SNAPSHOT</version>

<properties>
<spark30.version>3.0.0</spark30.version>
</properties>

<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>ai.rapids</groupId>
<artifactId>cudf</artifactId>
<classifier>${cuda.version}</classifier>
</dependency>
<dependency>
<groupId>com.google.flatbuffers</groupId>
<artifactId>flatbuffers-java</artifactId>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark30.version}</version>
tgravescs marked this conversation as resolved.
</dependency>
<dependency>
<groupId>org.apache.orc</groupId>
tgravescs marked this conversation as resolved.
<artifactId>orc-core</artifactId>
<classifier>${orc.classifier}</classifier>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.orc</groupId>
<artifactId>orc-mapreduce</artifactId>
<classifier>${orc.classifier}</classifier>
<exclusions>
<exclusion>
<groupId>com.google.code.findbugs</groupId>
<artifactId>jsr305</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-storage-api</artifactId>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
</dependency>
</dependencies>

<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
@@ -0,0 +1 @@
com.nvidia.spark.rapids.shims.spark30.Spark30ShimServiceProvider