Set spark.executor.uri from environment variable (needed by Mesos)
The Mesos backend uses this property when setting up a slave process. It is similarly set in the Scala REPL (org.apache.spark.repl.SparkILoop), but I couldn't find an analogous mechanism for pyspark.

Author: Ivan Wick <[email protected]>

This patch had conflicts when merged, resolved by
Committer: Matei Zaharia <[email protected]>

Closes apache#311 from ivanwick/master and squashes the following commits:

da0c3e4 [Ivan Wick] Set spark.executor.uri from environment variable (needed by Mesos)
ivanwick authored and mateiz committed Apr 11, 2014
1 parent 2c55783 commit 5cd11d5
Showing 1 changed file with 3 additions and 0 deletions.
python/pyspark/shell.py

```diff
@@ -29,6 +29,9 @@
 # this is the equivalent of ADD_JARS
 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
 
+if os.environ.get("SPARK_EXECUTOR_URI"):
+    SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
+
 sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
 
 print """Welcome to
```
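The pattern the patch adds, promoting an environment variable to a configuration property only when it is actually set, can be sketched without a running Spark cluster. This is a minimal illustration, not Spark code: `set_property_from_env` and the plain dict standing in for `SparkContext.setSystemProperty` are hypothetical.

```python
import os

def set_property_from_env(props, prop_name, env_name):
    """Copy an environment variable into a property dict only when it is set.

    Mirrors the guard the patch places around SparkContext.setSystemProperty:
    an unset (or empty) variable leaves the property untouched.
    """
    value = os.environ.get(env_name)
    if value:
        props[prop_name] = value
    return props

# Simulate a Mesos deployment where the executor URI has been exported
# (the URI below is a made-up example value).
os.environ["SPARK_EXECUTOR_URI"] = "http://example.com/spark-dist.tgz"
props = set_property_from_env({}, "spark.executor.uri", "SPARK_EXECUTOR_URI")
```

With the variable unset, the dict stays empty, matching the behavior of the patched shell, which only calls `setSystemProperty` inside the `if`.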
