
[SPARK-21000][MESOS] Add Mesos labels support to the Spark Dispatcher #18220

Closed

Conversation

@mgummelt (Contributor) commented Jun 6, 2017

What changes were proposed in this pull request?

Add Mesos labels support to the Spark Dispatcher

How was this patch tested?

unit tests

@SparkQA commented Jun 6, 2017

Test build #77780 has finished for PR 18220 at commit ee10af6.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 6, 2017

Test build #77781 has finished for PR 18220 at commit 9c21758.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 6, 2017

Test build #77782 has finished for PR 18220 at commit e09f55a.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 6, 2017

Test build #77785 has finished for PR 18220 at commit 3d847a1.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 6, 2017

Test build #77786 has finished for PR 18220 at commit b32eb8a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

}
}

while(i < labelsStr.length) {
Contributor Author:

Ideally, this would be more functional. I tried to model it with map/fold, but I'm not smart enough. If someone cares, I can try to rewrite it to be recursive, at least.
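The map/fold formulation mentioned here is doable. A hedged sketch (`splitUnescapedFold` is a hypothetical name, not the PR's code) that splits a string on unescaped occurrences of a separator using `foldLeft` over a three-part accumulator:

```scala
// Hypothetical fold-based splitter: splits `str` on occurrences of `sep`
// that are NOT preceded by a backslash. The accumulator carries the
// completed parts, the part being built, and whether we are in an escape.
def splitUnescapedFold(str: String, sep: Char): Seq[String] = {
  val (parts, last, _) = str.foldLeft((Vector.empty[String], "", false)) {
    case ((acc, curr, escaped), ch) =>
      if (escaped) (acc, curr + ch, false)          // char after '\': keep it literally
      else if (ch == '\\') (acc, curr + ch, true)   // enter escape state, keep the backslash
      else if (ch == sep) (acc :+ curr, "", false)  // unescaped separator: close current part
      else (acc, curr + ch, false)
  }
  parts :+ last
}
```

Like the regex approach that ultimately replaced this loop, it leaves the backslashes in place; unescaping is a separate pass.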

@mgummelt (Contributor Author) commented Jun 7, 2017

@srowen Can we get a merge? @ArtRand is an engineer working on Spark here at Mesosphere and has approved these changes.

Thanks.

@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for information on Spark config
</td>
</tr>
<tr>
<td><code>spark.mesos.driver.labels</code></td>
Member:

This naming differs a bit from the YARN label property, but I suppose it's supporting an expression. And we have .task.labels already. OK.
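For reference, the property slots in like any other Spark conf when submitting through the dispatcher. An illustrative config fragment only; the master URL, class, jar, and label values below are all made up:

```shell
# Illustrative only: hostnames, class, jar, and label values are hypothetical.
# Labels use the key:value,key2:value2 format, as with spark.mesos.task.labels.
spark-submit \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.driver.labels=environment:prod,team:data \
  --class org.example.MyApp \
  http://repo.example.com/my-app.jar
```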

<td><code>spark.mesos.driver.labels</code></td>
<td><code>(none)</code></td>
<td>
Mesos labels to add to the driver. See spark.mesos.task.labels
Member:

Nit: format props and code with <code>

Contributor Author:

fixed

value = Some(currStr)
if (key.isEmpty) {
throw new SparkException(s"Error while parsing label string: ${labelsStr}. " +
s"Empty label key.")
Member:

Nit, don't need interpolation but whatever

Contributor Author:

Fixed

}
}

while(i < labelsStr.length) {
Member:

Nit: space after while

Contributor Author:

This code was removed, so this is fixed.


// 0 -> parsing key
// 1 -> parsing value
var state = 0
Member:

This all looks excessively complex. Can't you do this with a regex in a few lines?

Contributor Author:

I simplified it, but I can't do a simple regex split, because I have to condition the match on characters (the escape sequence) that shouldn't actually be considered part of the matched string. So I wrote a custom splitUnescaped method to implement what I need.

@SparkQA commented Jun 8, 2017

Test build #77816 has finished for PR 18220 at commit 40707fe.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 8, 2017

Test build #77817 has finished for PR 18220 at commit 79c119c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 8, 2017

Test build #77818 has finished for PR 18220 at commit a381419.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@mgummelt (Contributor Author) commented Jun 8, 2017

@srowen All comments addressed. Tests are passing.

def mesosLabels(labelsStr: String): Protos.Labels.Builder = {

// Return str split around unescaped occurrences of c.
def splitUnescaped(str: String, c: Char): Seq[String] = {
Member:

I still think you can do this much more directly with regexes, even with escapes. It looks a little tricky but a negative lookbehind does the trick. For example:

scala> """key:value,key2:a\:b,key3:a\,b""".split("""(?<!\\),""")
res3: Array[String] = Array(key:value, key2:a\:b, key3:a\,b)

Contributor Author:

That's exactly what I was looking for! Thanks! Fixed.
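The lookbehind approach the discussion converged on can be sketched end to end. `parseLabels` here is a hypothetical helper: it returns plain key/value pairs rather than the `Protos.Labels.Builder` the PR constructs, and the error type is simplified to `IllegalArgumentException`:

```scala
// Split on unescaped commas, then on unescaped colons. The negative
// lookbehind (?<!\\) keeps "\," and "\:" inside a key or value intact.
def parseLabels(labelsStr: String): Seq[(String, String)] = {
  labelsStr.split("""(?<!\\),""").toSeq.map { labelStr =>
    labelStr.split("""(?<!\\):""") match {
      case Array(key, value) => (key, value)
      case _ => throw new IllegalArgumentException(s"Malformed label: $labelStr")
    }
  }
}
```

Note that the returned keys and values still contain the escape backslashes; stripping them is a separate step.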

@SparkQA commented Jun 9, 2017

Test build #77851 has finished for PR 18220 at commit 616c1b1.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

throw new SparkException(s"Malformed label: ${labelStr}")
}

val cleanedParts = parts
Member:

You could do this too, though I don't think it matters enough to bother:

val Array(key, value) = parts.map(_.replaceAll("""\\(.)""", "$1"))
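The replaceAll in this suggestion strips one level of escaping via a backreference: the pattern matches a backslash plus the character after it, and `$1` re-emits only that character. A minimal sketch of just that step (`unescape` is a hypothetical name):

```scala
// Strips one level of backslash-escaping: "\:" -> ":", "\," -> ",", "\\" -> "\".
def unescape(s: String): String = s.replaceAll("""\\(.)""", "$1")
```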

@SparkQA commented Jun 9, 2017

Test build #77852 has finished for PR 18220 at commit 218ab0d.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA commented Jun 9, 2017

Test build #77855 has finished for PR 18220 at commit fe916b2.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen (Member) commented Jun 11, 2017

Merged to master

@asfgit closed this in 8da3f70 Jun 11, 2017
dataknocker pushed a commit to dataknocker/spark that referenced this pull request Jun 16, 2017
## What changes were proposed in this pull request?

Add Mesos labels support to the Spark Dispatcher

## How was this patch tested?

unit tests

Author: Michael Gummelt <[email protected]>

Closes apache#18220 from mgummelt/SPARK-21000-dispatcher-labels.
ArtRand pushed a commit to d2iq-archive/spark that referenced this pull request Aug 22, 2017
susanxhuynh pushed a commit to d2iq-archive/spark that referenced this pull request Jan 8, 2018