This repository has been archived by the owner on Oct 9, 2023. It is now read-only.

add support for batchScheduler in Spark Tasks #216

Merged
merged 2 commits into master
Oct 26, 2021

Conversation

migueltol22
Contributor

@migueltol22 migueltol22 commented Oct 4, 2021

Signed-off-by: Miguel [email protected]

TL;DR

Spark supports batch schedulers such as Volcano; however, even if one is enabled in the Spark operator, it cannot currently be used because we don't set it as part of the SparkApplication.

Here we allow a user to specify `spark.batchScheduler` as part of their config to leverage the batch scheduler option. This can also be enabled for all Spark tasks via the default Spark config that an admin sets in the plugin config.
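For illustration, an admin-level default could look like the following propeller plugin config fragment. This is a sketch: the `spark-config-default` list shape is assumed from the Flyte Spark plugin's config and should be checked against the deployed version.

```yaml
plugins:
  spark:
    spark-config-default:
      # Assumed key name from this PR; enables Volcano for all Spark tasks.
      - spark.batchScheduler: "volcano"
```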

Type

  • Bug Fix
  • Feature
  • Plugin

Are all requirements met?

  • Code completed
  • Smoke tested
  • Unit tests added
  • Code documentation added
  • Any pending items have an associated Issue

Complete description

Pass the batchScheduler option to the Spark app if specified.
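The change can be sketched roughly as follows. The struct below is a hypothetical stand-in for the Spark operator's `v1beta2.SparkApplicationSpec` (whose real `BatchScheduler` field is a `*string`), and `applyBatchScheduler` is an illustrative helper, not the plugin's actual function name:

```go
package main

import "fmt"

// SparkApplicationSpec is a minimal stand-in for the Spark operator's
// v1beta2 type (illustrative only; the real spec has many more fields).
type SparkApplicationSpec struct {
	BatchScheduler *string
}

// applyBatchScheduler copies the spark.batchScheduler entry from the Spark
// conf into the CRD spec, leaving the spec untouched when the key is absent
// or empty. "spark.batchScheduler" is the key proposed in this PR.
func applyBatchScheduler(conf map[string]string, spec *SparkApplicationSpec) {
	if scheduler, ok := conf["spark.batchScheduler"]; ok && scheduler != "" {
		spec.BatchScheduler = &scheduler
	}
}

func main() {
	spec := &SparkApplicationSpec{}
	applyBatchScheduler(map[string]string{"spark.batchScheduler": "volcano"}, spec)
	if spec.BatchScheduler != nil {
		fmt.Println(*spec.BatchScheduler) // prints "volcano"
	}
}
```

Because the field is only set when the key is present, tasks without the config keep the operator's default scheduler.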

Tracking Issue

fixes flyteorg/flyte#1558

@codecov

codecov bot commented Oct 4, 2021

Codecov Report

Merging #216 (a4b300c) into master (b4f6fa9) will increase coverage by 0.00%.
The diff coverage is 100.00%.

❗ Current head a4b300c differs from pull request most recent head b33a9e5. Consider uploading reports for the commit b33a9e5 to get more accurate results

@@           Coverage Diff           @@
##           master     #216   +/-   ##
=======================================
  Coverage   61.20%   61.21%           
=======================================
  Files         137      137           
  Lines        8524     8526    +2     
=======================================
+ Hits         5217     5219    +2     
  Misses       2826     2826           
  Partials      481      481           
Flag Coverage Δ
unittests ?


Impacted Files Coverage Δ
go/tasks/plugins/k8s/spark/spark.go 78.22% <100.00%> (+0.16%) ⬆️


Signed-off-by: Miguel <[email protected]>
@@ -47,6 +47,7 @@ var (
"spark.flyte.feature1.enabled": "true",
"spark.flyteorg.feature2.enabled": "true",
"spark.flyteorg.feature3.enabled": "true",
"spark.batchScheduler": "volcano",
Contributor
Why add it to spark-conf? Can we add it to just the regular config? I would also prefer the config to say
scheduler: default (as the default)
so users can then supplement it with scheduler: volcano.

Contributor Author

The reason for spark-conf is that, for our use case, we don't want to enable it for everyone; we'd like to try it out with a specific workflow. Adding it to the regular config would set it for all workflows unless we move to a per-project/workflow config. I have no preference between batchScheduler and scheduler; Spark uses "batch scheduler", so I figured I'd keep the same naming convention.

Contributor Author

@kumare3 @akhurana001 any thoughts on this?

Contributor

Can we reuse the Spark feature support here as well, instead of handling this explicitly?

Contributor

Can you check whether the SparkOperator lets you set the scheduler via a conf, or whether it is CRD-spec only?

If it's only the spec, then I don't think we have a perfect solution here, but it probably still makes sense to have the same UX for customers even if we have to handle the implementation separately (unless @kumare3 has any other thoughts?).

Contributor Author

I spent some time looking at the various configs and tried setting it as part of the config in our deployment, but it looks like it can only be set in the CRD spec.

I agree that there is no perfect solution. For this reason I proposed the change where a user can specify spark.batchScheduler and that would be used in the CRD spec.
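Concretely, the Spark operator's v1beta2 SparkApplication CRD exposes this as a spec field, so the rendered object would carry something like the fragment below (a sketch; the metadata name is hypothetical, and the `batchScheduler` field name should be double-checked against the operator version in use):

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: example-spark-task   # hypothetical name
spec:
  batchScheduler: volcano    # set only when spark.batchScheduler is specified
```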

Contributor

Makes sense

Contributor Author

@kumare3 @akhurana001 in that case are there any further changes needed for this PR?

Contributor

lgtm

@migueltol22 migueltol22 merged commit e7fcac0 into master Oct 26, 2021
eapolinario pushed a commit that referenced this pull request Sep 6, 2023
* add support for batchScheduler

Signed-off-by: Miguel <[email protected]>

* upd test

Signed-off-by: Miguel <[email protected]>
Development

Successfully merging this pull request may close these issues:

[Plugins Feature] Enable BatchScheduler for spark tasks.