This repository has been archived by the owner on Jan 9, 2020. It is now read-only.

V0.2 dev #341

Closed

Conversation

duyanghao

What changes were proposed in this pull request?

Set the driver and executor label spark-app-id to the value of --conf spark.kubernetes.driver.pod.name.
The main reason for this is that all pods belonging to a Spark application can then be found by searching for the spark-app-id label with the value of spark.kubernetes.driver.pod.name. This makes integration with a web server (if there is one) easier: when a user submits with --conf spark.kubernetes.driver.pod.name=xxx, the web server can retrieve all of the application's pods from Kubernetes using the label selector spark-app-id=xxx.
Refs here.
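
As an illustration of the web-server use case, here is a minimal sketch of the label-selector query this change enables, using the fabric8 Kubernetes client; the namespace and label value come from the test example below, everything else is an assumption:

import scala.collection.JavaConverters._
import io.fabric8.kubernetes.client.DefaultKubernetesClient

// Assumes default client configuration (in-cluster config or local kubeconfig).
val client = new DefaultKubernetesClient()
// A single label-selector query returns the driver and all executor pods.
val pods = client.pods()
  .inNamespace("default")
  .withLabel("spark-app-id", "spark-pi-today-a")
  .list()
pods.getItems.asScala.foreach(pod => println(pod.getMetadata.getName))
client.close()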

How was this patch tested?

Manual tests were successful. An example follows:

I submitted with the following parameters (a full example command is sketched after the list):

--conf spark.kubernetes.driver.pod.name=spark-pi-today-a
--conf spark.executor.cores=2
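
(For context, a complete submission command would look roughly like the following; the master URL, main class, and jar path are illustrative placeholders, not part of the original report.)

bin/spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  --master k8s://https://<apiserver-host>:<port> \
  --conf spark.kubernetes.driver.pod.name=spark-pi-today-a \
  --conf spark.executor.cores=2 \
  local:///opt/spark/examples/jars/spark-examples.jar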

The driver pod's name and labels are then:

kubectl describe pods/spark-pi-today-a
Name:		spark-pi-today-a
Namespace:	default
Node:		xxx
Start Time:	Tue, 06 Jun 2017 11:12:01 +0800
Labels:		spark-app-id=spark-pi-today-a
		spark-app-name=spark-pi
		spark-role=driver
Status:		Succeeded

Executor #1's pod name and labels are:

kubectl describe pods/spark-pi-today-a-exec-1
Name:		spark-pi-today-a-exec-1
Namespace:	default
Node:		xxx
Start Time:	Tue, 06 Jun 2017 11:12:33 +0800
Labels:		spark-app-id=spark-pi-today-a
		spark-exec-id=1
		spark-role=executor
Status:		Succeeded

Executor #2's pod name and labels are:

kubectl describe pods/spark-pi-today-a-exec-2
Name:		spark-pi-today-a-exec-2
Namespace:	default
Node:		xxx
Start Time:	Tue, 06 Jun 2017 11:12:33 +0800
Labels:		spark-app-id=spark-pi-today-a
		spark-exec-id=2
		spark-role=executor
Status:		Succeeded
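
Since the driver and both executors carry the same spark-app-id label, a single label-selector query returns all of them (a sketch, not part of the original test log):

kubectl get pods -l spark-app-id=spark-pi-today-a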


@mccheah left a comment


I'm just marking this as unresolved until we have finished discussion on #335.

@@ -261,7 +281,8 @@ private[spark] object Client {
       .getOrElse(Array.empty[String])
     val appName = sparkConf.getOption("spark.app.name")
       .getOrElse("spark")
-    val kubernetesAppId = s"$appName-$launchTime".toLowerCase.replaceAll("\\.", "-")
+    val kubernetesAppId = sparkConf.getOption("spark.kubernetes.driver.pod.name")
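
The added line is cut off in this view; a hypothetical reconstruction of the full change, based on the PR description rather than the exact diff, would be:

// Hypothetical reconstruction: prefer the user-supplied driver pod name,
// otherwise fall back to the previously generated "<appName>-<launchTime>" id.
val kubernetesAppId = sparkConf.getOption("spark.kubernetes.driver.pod.name")
  .getOrElse(s"$appName-$launchTime".toLowerCase.replaceAll("\\.", "-"))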


I think this is out of date with branch-2.1-kubernetes - since #331 has merged, the semantics of these variables have changed.

@@ -97,6 +102,15 @@ private[spark] class Client(
         .withValue(classPath)
         .build()
     }
+    val driverCpuQuantity = new QuantityBuilder(false)
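
This hunk is likewise truncated after its first added line. For context, a QuantityBuilder result is typically attached to the driver container's resource requests roughly like this (an illustrative sketch using fabric8's model builders; the config key and container name are assumptions, not taken from the diff):

import io.fabric8.kubernetes.api.model.{ContainerBuilder, QuantityBuilder}

val driverCpuQuantity = new QuantityBuilder(false)
  .withAmount(sparkConf.get("spark.driver.cores", "1"))  // assumed source of the CPU amount
  .build()
val driverContainer = new ContainerBuilder()
  .withName("spark-kubernetes-driver")  // illustrative container name
  .withNewResources()
    .addToRequests("cpu", driverCpuQuantity)
    .endResources()
  .build()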


This is already included in #340.

@ash211

ash211 commented Aug 21, 2017

@duyanghao is this PR still active? I'm not sure what's left since some of the pieces have been merged in through separate PRs.

If there are still changes that need to be made, please open a new PR against branch-2.2-kubernetes so we can get those merged in to this repo.

I figured that between fixing merge conflicts and changing the destination branch from 2.1 to 2.2 it's easier to start a new PR than continue working on this one.

Thanks again for helping make this project better!
