Skip flaky e2e MultiKueue test for XGBoost #2861
Conversation
Skipping CI for Draft Pull Request.

✅ Deploy Preview for kubernetes-sigs-kueue canceled.

LGTM label has been added. Git tree hash: 4fd59aee4ac5eaee2e826f0b64b4cb38ba2a73a5
```diff
@@ -633,6 +633,7 @@ var _ = ginkgo.Describe("MultiKueue", func() {
 	})

 	ginkgo.It("Should run a kubeflow XGBoostJob on worker if admitted", func() {
+		ginkgo.Skip("Skipped due to state transitioning bug in training-operator")
```
Put a comment with a link to the open issue in the training-operator repo
I agree, and I would have done that, but they don't have an actual issue open, just the reference: kubeflow/training-operator#1711
Let's use that issue
I believe the PyTorchJob e2e test alone would be enough, since we already commonized the KFJob MultiKueue adapters into kubeflowjob.MKAdapter in #2795.
So, after we merge this PR, can you refactor the e2e and integration tests for the KFJob MK adapter?
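To make the suggestion concrete, here is a hypothetical table-driven sketch of what a consolidated e2e test could look like. The runKFJobOnWorker helper and the table entries are placeholders invented for illustration, not actual Kueue test code; only ginkgo.DescribeTable and ginkgo.Entry are real Ginkgo v2 APIs.

```go
package e2e_test

import (
	"github.com/onsi/ginkgo/v2"
)

// runKFJobOnWorker is a hypothetical helper standing in for the shared
// logic that the common kubeflowjob.MKAdapter makes reusable: create a
// KFJob of the given kind, admit it via MultiKueue, and assert that it
// runs on the worker cluster.
func runKFJobOnWorker(kind string) {
	// ... shared create/admit/assert steps would go here ...
	_ = kind
}

// A single table-driven spec could replace the per-kind ginkgo.It blocks
// once all KFJob kinds go through the same adapter code path.
var _ = ginkgo.DescribeTable("MultiKueue KFJob adapter",
	runKFJobOnWorker,
	ginkgo.Entry("PyTorchJob", "PyTorchJob"),
	ginkgo.Entry("XGBoostJob", "XGBoostJob"),
)
```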
Ok, then I will use this.
@tenzen-y Yes, I will publish the test refactor; I pretty much have it ready.
/lgtm

LGTM label has been added. Git tree hash: 5af49054b2e5f745ccd882c565a1700d93ff9186
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: alculquicondor, mszadkow

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing `/approve` in a comment.
* Skip flaky e2e MultiKueue test for XGBoost
* Add a comment with the issue link
What type of PR is this?
/kind bug
What this PR does / why we need it:
Skips the flaky e2e MultiKueue XGBoost test due to a state transitioning bug in training-operator.
Found here: kubeflow/training-operator#1711
Which issue(s) this PR fixes:
Part of #2838
Special notes for your reviewer:
Does this PR introduce a user-facing change?