
fix: Remove dup code #1022

Merged
merged 2 commits into kubeflow:master from gaocegege:fix on Jun 5, 2019
Conversation

@gaocegege (Member) commented Jun 5, 2019

/assign @johnugeorge

I think your suggestion is right: we should not update LastUpdateTime if the status has not changed. That is the convention in the Kubernetes community. This PR removes the duplicated code.

Signed-off-by: Ce Gao [email protected]


This change is Reviewable
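
As context for the convention mentioned above, here is a minimal Go sketch of a condition helper that only bumps LastUpdateTime when the condition actually changes. It illustrates the Kubernetes-style pattern rather than the exact code touched by this PR; the JobCondition/JobStatus types and the setCondition/getCondition names are simplified stand-ins for the operator's real API types.

```go
package conditions

import (
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// Simplified stand-ins for the operator's real API types; they exist only
// to make this sketch compile.
type JobConditionType string

type JobCondition struct {
	Type               JobConditionType
	Status             string // e.g. "True", "False", "Unknown"
	Reason             string
	Message            string
	LastUpdateTime     metav1.Time
	LastTransitionTime metav1.Time
}

type JobStatus struct {
	Conditions []JobCondition
}

// setCondition adds or updates a condition. If an identical condition is
// already present, it returns early so LastUpdateTime is left untouched,
// which is the convention referenced in the PR description.
func setCondition(status *JobStatus, newCond JobCondition) {
	current := getCondition(status, newCond.Type)

	// Nothing but the timestamp would change: skip the update entirely.
	if current != nil &&
		current.Status == newCond.Status &&
		current.Reason == newCond.Reason &&
		current.Message == newCond.Message {
		return
	}

	now := metav1.Now()
	newCond.LastUpdateTime = now
	// Only move LastTransitionTime when the status value itself changes.
	if current != nil && current.Status == newCond.Status {
		newCond.LastTransitionTime = current.LastTransitionTime
	} else {
		newCond.LastTransitionTime = now
	}

	// Drop any existing condition of the same type and append the new one.
	kept := make([]JobCondition, 0, len(status.Conditions)+1)
	for _, c := range status.Conditions {
		if c.Type != newCond.Type {
			kept = append(kept, c)
		}
	}
	status.Conditions = append(kept, newCond)
}

// getCondition returns a pointer to the condition of the given type, or nil.
func getCondition(status *JobStatus, condType JobConditionType) *JobCondition {
	for i := range status.Conditions {
		if status.Conditions[i].Type == condType {
			return &status.Conditions[i]
		}
	}
	return nil
}
```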

Signed-off-by: Ce Gao <[email protected]>
@coveralls commented Jun 5, 2019

Coverage Status

Coverage remained the same at 76.744% when pulling e4fdb60 on gaocegege:fix into d0b973b on kubeflow:master.

@gaocegege (Member, Author)

/retest

@johnugeorge (Member)

@gaocegege how will it solve kubeflow/pytorch-operator#88?

@gaocegege (Member, Author)

@johnugeorge It is solved by not updating the condition when addTFJob is called multiple times; see the sketch below.
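
To make the idempotency point concrete, the snippet below continues the hypothetical sketch from the PR description (same package, same illustrative types, not the operator's real code): repeated calls with identical state are no-ops, so LastUpdateTime stays stable even if the add path runs several times for the same job.

```go
// demoIdempotentUpdate reports whether a second, identical update left the
// timestamp alone. Uses the illustrative setCondition/JobStatus from the
// sketch above.
func demoIdempotentUpdate() bool {
	status := &JobStatus{}
	cond := JobCondition{
		Type:    "Created",
		Status:  "True",
		Reason:  "JobCreated",
		Message: "Job is created.",
	}

	setCondition(status, cond) // first call records the condition
	first := status.Conditions[0].LastUpdateTime

	setCondition(status, cond) // repeated call with identical state is a no-op
	second := status.Conditions[0].LastUpdateTime

	return first.Equal(&second) // true: timestamp was not bumped
}
```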

@johnugeorge (Member)

/lgtm

@johnugeorge (Member)

/approve

@k8s-ci-robot
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: johnugeorge

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment
