UPSTREAM: <carry>: Add the upstream code rebase document #76
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing

Commit Checker results:
A set of new images have been built to help with testing out this PR:

An OCP cluster where you are logged in as cluster admin is required. The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator (DSPO). Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named `dspa.pr-76.yaml`:

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-76
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-76"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-76"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-76"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-76"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-76"
  mlmd:
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-76"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'
```

Then run the following:

```
cd $(mktemp -d)
git clone [email protected]:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/76/head
git checkout -b pullrequest f5a03d13022b1e1ba3ba09129e840633982522ac
oc apply -f dspa.pr-76.yaml
```

More instructions here on how to deploy and test a Data Science Pipelines Application.
Commit Checker results:

Change to PR detected. A new PR build was completed.

/hold

Steps need change.
Force-pushed from 09d4881 to 11e7dd9.
/unhold

Commit Checker results:

Change to PR detected. A new PR build was completed.
REBASE.opendatahub.md (Outdated)

Clone from a personal fork, and add the remote for upstream and opendatahub, fetching its branches:

```
git clone [email protected]:<user id>/data-science-pipelines
```
Suggested change:

```diff
-git clone [email protected]:<user id>/data-science-pipelines
+git clone [email protected]:<user id>/argo-workflows
```
good catch! Will change that.
Argo Workflows git history diverges completely across versions, so it's important to create a backup branch from the current dsp repo in case we need to revert changes:

```
git checkout -b dsp-backup opendatahub/main
```
Suggested change:

```diff
-git checkout -b dsp-backup opendatahub/main
+git fetch opendatahub main
+git checkout -b dsp-backup opendatahub/main
```
If you check the previous section, you will see that I'm already adding the remotes and fetching them. If you think this fetch task must be explicit, then I need to remove the `--fetch` flag from the `git remote add` commands.
I think it's fine to leave the `--fetch` flag, but for repeatability: a user is only going to add the remotes once and may therefore skip that section if they already have them added. IMO it's safer to re-fetch and remove doubt.
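The reviewer's point can be sketched locally: `git fetch` is idempotent, so re-fetching an already-added remote is harmless, and the explicit fetch keeps the later `checkout` step correct for users who skip the remote-setup section. The bare repo below is only a stand-in for the real `opendatahub` remote (paths and identities are illustrative):

```shell
set -eu
cd "$(mktemp -d)"
# Local bare repo standing in for the opendatahub remote (illustrative only).
git init -q --bare upstream.git
git init -q -b main work && cd work
git -c user.email=ci@example -c user.name=ci commit -q --allow-empty -m "init"
git push -q ../upstream.git main:main
# Adding the remote with --fetch pulls its branches once...
git remote add --fetch opendatahub ../upstream.git
# ...but an explicit re-fetch is safe to repeat, so the documented steps
# stay correct whether or not the remote was already configured.
git fetch -q opendatahub main
git checkout -q -b dsp-backup opendatahub/main
git branch --show-current
```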
### Create the Pull-Request in opendatahub-io/argo-workflows repository

Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`
Suggested change:

```diff
-### Create the Pull-Request in opendatahub-io/argo-workflows repository
-Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`
+### Create a Draft/WIP Pull Request in opendatahub-io/argo-workflows repository
+**NOTE**: This is only to show the diff/changeset and accept feedback from the team before proceeding. DO NOT ACTUALLY MERGE THIS
+Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`
+### Force-push Version Upgrade branch to main
+Upon acceptance of the Draft PR (again, do not actually merge this), force the `opendatahub/main` branch to now match the upgrade version 'feature' branch:
+\`\`\`bash
+git push -f origin argo-upgrade:main
+\`\`\`
+Obviously, this will completely overwrite the git history of the `opendatahub/main` remote branch, so please ensure a backup branch (`dsp-backup`) was created as instructed above
+### Disclaimer / Future Work
+This process is obviously very heavy-handed and destructive, and depends on there being no carries or downstream-only commits. We should adjust the procedure to account for this
```
Update the codeblock escapes (\`\`\`) before accepting the suggestion.
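Since the suggested force-push rewrites `opendatahub/main`, the `dsp-backup` branch is the only recovery path. A minimal local simulation of the destructive push and the revert, with a bare repo standing in for the real remote (all names and paths are illustrative, not the actual procedure):

```shell
set -eu
cd "$(mktemp -d)"
# Bare repo standing in for the opendatahub remote (illustrative only).
git init -q --bare remote.git
git clone -q remote.git work 2>/dev/null && cd work
git -c user.email=ci@example -c user.name=ci commit -q --allow-empty -m "old history"
git push -q origin HEAD:main
git branch -q dsp-backup                      # backup before the rewrite
# An orphan branch mimics argo-upgrade's completely divergent history.
git checkout -q --orphan argo-upgrade
git -c user.email=ci@example -c user.name=ci commit -q --allow-empty -m "new history"
git push -qf origin argo-upgrade:main         # the destructive force-push
# Recovery: force main back to the backup branch.
git push -qf origin dsp-backup:main
git fetch -q origin main
test "$(git rev-parse origin/main)" = "$(git rev-parse dsp-backup)" && echo restored
```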
Commit Checker results:

Change to PR detected. A new PR build was completed.
Signed-off-by: Ricardo M. Oliveira <[email protected]>
Commit Checker results:

Change to PR detected. A new PR build was completed.
/lgtm
content lgtm
Didn't we decide we're going to do the rebase very differently next time and going forward? I recall that we agreed not to do the regular merge + merge-conflict resolution (like we did this time). If that's correct, then I disagree with adding this document to this repository at all; this should just be a WIP google doc.
That wasn't my understanding; we should keep doing the same process for the next KFP upgrades. Anyway, I believe we can only confirm that when we upgrade to KFP 2.3.0, so let me know if I should keep this PR open until the next KFP upgrade or close it.

/lgtm
Commit Checker results:
Description of your changes:
Create the upstream code rebase document, with the steps to upgrade Kubeflow Pipelines code with the Data Science Pipelines repository.
Checklist: