[SPARK-1089] fix the regression problem on ADD_JARS in 0.9
https://spark-project.atlassian.net/browse/SPARK-1089

Copied from JIRA, reported by @ash211:

"Using the ADD_JARS environment variable with spark-shell used to add the jar to both the shell and the various workers. Now it only adds to the workers and importing a custom class in the shell is broken.
The workaround is to add custom jars to both ADD_JARS and SPARK_CLASSPATH.
We should fix ADD_JARS so it works properly again.
See various threads on the user list:
https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201402.mbox/%3CCAJbo4neMLiTrnm1XbyqomWmp0m+EUcg4yE-txuRGSVKOb5KLeA@mail.gmail.com%3E
(another one that doesn't appear in the archives yet titled "ADD_JARS not working on 0.9")"

The reason for this bug is two-fold.

First, in the current implementation of SparkILoop.scala, settings.classpath is not set properly when the process() method is invoked.

Second, there is the weird behaviour of Scala 2.10 (I personally think it is a bug):

if we simply set the value of a PathSettings object (like settings.classpath), the flag that records whether the setting has been modified is not updated, so the PathResolver loads the default CLASSPATH environment variable value to calculate the path (see https://github.com/scala/scala/blob/2.10.x/src/compiler/scala/tools/util/PathResolver.scala#L215).
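For reference, the check referred to above amounts to roughly the following. This is a paraphrase of the 2.10 PathResolver logic linked above, not a verbatim copy:

    import scala.tools.nsc.Settings

    // If the classpath setting still looks like the default (i.e. it was never
    // marked as modified), the resolver ignores it and falls back to the
    // CLASSPATH environment variable.
    def userClassPath(settings: Settings): String =
      if (!settings.classpath.isDefault) settings.classpath.value
      else sys.env.getOrElse("CLASSPATH", ".")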

What we have to do is manually mark this flag as set (see https://github.com/CodingCat/incubator-spark/blob/e3991d97ddc33e77645e4559b13bf78b9e68239a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L884), as sketched below.
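A minimal sketch of that approach (using scala.tools.nsc.Settings and scala.tools.nsc.util.ClassPath from the 2.10 compiler; addedJars here is a hypothetical stand-in for SparkILoop.getAddedJars); the actual change is in the diff below:

    import scala.tools.nsc.Settings
    import scala.tools.nsc.util.ClassPath

    val settings  = new Settings()
    val addedJars = Seq("/path/to/custom.jar")  // stand-in for SparkILoop.getAddedJars

    // Join every added jar onto the existing classpath and assign the result back,
    // so PathResolver sees the added jars instead of falling back to CLASSPATH.
    settings.classpath.value =
      addedJars.foldLeft(settings.classpath.value)((l, r) => ClassPath.join(l, r))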

Author: CodingCat <[email protected]>

Closes #13 from CodingCat/SPARK-1089 and squashes the following commits:

8af81e7 [CodingCat] impose non-null settings
9aa2125 [CodingCat] code cleaning
ce36676 [CodingCat] code cleaning
e045582 [CodingCat] fix the regression problem on ADD_JARS in 0.9
(cherry picked from commit 345df5f)

Signed-off-by: Patrick Wendell <[email protected]>
CodingCat authored and pwendell committed Feb 27, 2014
1 parent 349764d commit bc5e7d7
Showing 1 changed file with 7 additions and 2 deletions.
repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala (7 additions, 2 deletions)
@@ -180,8 +180,13 @@ class SparkILoop(in0: Option[BufferedReader], protected val out: JPrintWriter,
 
   /** Create a new interpreter. */
   def createInterpreter() {
-    if (addedClasspath != "")
-      settings.classpath append addedClasspath
+    require(settings != null)
+
+    if (addedClasspath != "") settings.classpath.append(addedClasspath)
+    // work around for Scala bug
+    val totalClassPath = SparkILoop.getAddedJars.foldLeft(
+      settings.classpath.value)((l, r) => ClassPath.join(l, r))
+    this.settings.classpath.value = totalClassPath
 
     intp = new SparkILoopInterpreter
   }
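As an illustration of the intended behaviour after this patch (a hypothetical sanity check, assuming SparkILoop.getAddedJars reads the ADD_JARS environment variable as the diff suggests; this check is not part of the commit):

    // Run inside SparkILoop after createInterpreter(): every jar supplied via
    // ADD_JARS should now be on settings.classpath, so importing a custom class
    // in the shell works again.
    val missing = SparkILoop.getAddedJars.filterNot(jar => settings.classpath.value.contains(jar))
    assert(missing.isEmpty, s"jars missing from REPL classpath: ${missing.mkString(", ")}")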
