add support for batchScheduler in Spark Tasks #216
Conversation
Signed-off-by: Miguel <[email protected]>
Codecov Report
@@ Coverage Diff @@
## master #216 +/- ##
=======================================
Coverage 61.20% 61.21%
=======================================
Files 137 137
Lines 8524 8526 +2
=======================================
+ Hits 5217 5219 +2
Misses 2826 2826
Partials 481 481
Signed-off-by: Miguel <[email protected]>
@@ -47,6 +47,7 @@ var (
 	"spark.flyte.feature1.enabled":    "true",
 	"spark.flyteorg.feature2.enabled": "true",
 	"spark.flyteorg.feature3.enabled": "true",
+	"spark.batchScheduler":            "volcano",
Why add it to spark-conf? Can we add it to just the regular config? I would prefer the config to say `scheduler: default` as the default, and users can then supplement it with `scheduler: volcano`.
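The suggestion above could be sketched as a hypothetical plugin-config fragment (the `scheduler` key name and its placement are this reviewer's proposal, not an existing option):

```yaml
# Hypothetical shape of the proposed plugin config
plugins:
  spark:
    scheduler: default   # users could override this with "volcano"
```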
cc @akhurana001
The reason for using spark conf is that, for our use case, we don't want to enable it for everyone; we'd like to try it out with a specific workflow. Adding it to the regular config would set it for all workflows unless we move to a per-project/per-workflow config. I have no preference between `batchScheduler` and `scheduler`; Spark uses "batch scheduler", so I figured I'd keep the same naming convention.
@kumare3 @akhurana001 any thoughts on this?
Can we reuse the spark feature support here as well instead of explicitly handling this ?
Can you check whether the SparkOperator lets you set the scheduler via a conf, or whether it is CRD-spec only?
If it's only the spec, then I don't think we have a perfect solution here, but it probably still makes sense to have the same UX for customers even if we have to handle the implementation separately (unless @kumare3 has any other thoughts?).
I spent some time looking at the various configs and tried setting it as part of the config in our deployment, but it looks like it can only be set in the CRD spec.
I agree that there is no perfect solution. For this reason I proposed the change where a user can specify `spark.batchScheduler`, and that value would be used in the CRD spec.
Makes sense
@kumare3 @akhurana001 in that case are there any further changes needed for this PR?
lgtm
* add support for batchScheduler
* upd test

Signed-off-by: Miguel <[email protected]>
TL;DR
Spark supports batch schedulers such as Volcano; however, even if one is enabled in the Spark operator, it cannot currently be used because we don't set it as part of the SparkApp.
Here we allow a user to specify `spark.batchScheduler` as part of their config to leverage the batch scheduler option. This can also be enabled for all Spark tasks via the default Spark config an admin sets in the plugin's config.
Are all requirements met?
Complete description
Pass the batchScheduler option to the spark app if specified
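The logic described above could be sketched roughly as follows. This is a minimal, self-contained illustration: `sparkAppSpec` is a stand-in struct mirroring only the relevant fields of the SparkApplication CRD (the real type lives in the spark-on-k8s-operator API package), and `applyBatchScheduler` is a hypothetical helper name, not the actual function in this PR.

```go
package main

import "fmt"

// sparkAppSpec is a hypothetical, minimal mirror of the SparkApplication
// CRD fields involved here; the real spec has many more fields.
type sparkAppSpec struct {
	BatchScheduler *string
	SparkConf      map[string]string
}

// batchSchedulerKey is the spark conf key this PR introduces.
const batchSchedulerKey = "spark.batchScheduler"

// applyBatchScheduler copies the batch scheduler name from the task's
// spark conf, if one is set, into the CRD spec field. If the key is
// absent, the spec is left untouched and the operator's default applies.
func applyBatchScheduler(spec *sparkAppSpec) {
	if v, ok := spec.SparkConf[batchSchedulerKey]; ok && v != "" {
		spec.BatchScheduler = &v
	}
}

func main() {
	spec := &sparkAppSpec{
		SparkConf: map[string]string{batchSchedulerKey: "volcano"},
	}
	applyBatchScheduler(spec)
	fmt.Println(*spec.BatchScheduler) // volcano
}
```

Because the value is read from the per-task spark conf rather than a global plugin setting, it can be trialed on a single workflow (as discussed in the review thread) or rolled out to everyone via the admin's default spark config.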
Tracking Issue
fixes flyteorg/flyte#1558