# [SPARK-18136] Fix SPARK_JARS_DIR for Python pip install on Windows
## What changes were proposed in this pull request?

Fixes the setup of `SPARK_JARS_DIR` on Windows: `spark-class2.cmd` checks for the `%SPARK_HOME%\RELEASE` file when it should check for the `%SPARK_HOME%\jars` directory, since the `RELEASE` file is not included in the `pip` build of PySpark.
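
For context, a minimal sketch of the old and new checks under an assumed pip-install layout; the echo messages are illustrative, but per the description above a pip build ships a `jars` directory and no `RELEASE` file:

```bat
rem Minimal sketch (assumed pip-install layout): %SPARK_HOME% points at the
rem pip-installed pyspark package, which ships jars\ but no RELEASE file.

rem Old check: keys off the RELEASE marker file, which pip builds omit.
if exist "%SPARK_HOME%\RELEASE" (
  echo old check matches: RELEASE file present
) else (
  echo old check fails: falls through to the assembly\target source-build path
)

rem New check: keys off the jars directory itself.
if exist "%SPARK_HOME%\jars" (
  echo new check matches: SPARK_JARS_DIR will be set to %SPARK_HOME%\jars
) else (
  echo new check fails: no jars directory
)
```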

## How was this patch tested?

Local install of PySpark on Anaconda 4.4.0 (Python 3.6.1).

Author: Jakub Nowacki <[email protected]>

Closes #19310 from jsnowacki/master.

(cherry picked from commit c11f24a)
Signed-off-by: hyukjinkwon <[email protected]>
jsnowacki authored and HyukjinKwon committed Sep 23, 2017
1 parent de6274a commit c0a34a9
Showing 1 changed file with 1 addition and 1 deletion.
`bin/spark-class2.cmd`: 1 addition, 1 deletion
```diff
@@ -29,7 +29,7 @@ if "x%1"=="x" (
 )
 
 rem Find Spark jars.
-if exist "%SPARK_HOME%\RELEASE" (
+if exist "%SPARK_HOME%\jars" (
   set SPARK_JARS_DIR="%SPARK_HOME%\jars"
 ) else (
   set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
```
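The fix detects a release or pip layout by the presence of the `jars` directory itself rather than by the `RELEASE` marker file: pip installs, which omit `RELEASE`, now resolve `SPARK_JARS_DIR` correctly, while source builds without a top-level `jars` directory still fall through to the `assembly\target` path.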
