
UPSTREAM: <carry>: Add the upstream code rebase document #76

Merged
merged 1 commit into opendatahub-io:master on Nov 8, 2024

Conversation

rimolive

Description of your changes:
Create the upstream code rebase document, with the steps to upgrade Kubeflow Pipelines code with the Data Science Pipelines repository.

Checklist:


openshift-ci bot commented Aug 27, 2024

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please ask for approval from rimolive. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 1 commits between a8fbbd2020d280f4cab31245a9639a350207b3f9...f5a03d13022b1e1ba3ba09129e840633982522ac

@dsp-developers

A set of new images has been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-76
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-76
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-76
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-76
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-76
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-76

@dsp-developers

An OCP cluster where you are logged in as cluster admin is required.

The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named dspa.pr-76.yaml:

apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-76
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-76"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-76"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-76"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-76"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-76"
  mlmd:  
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component 
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-76"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'

Then run the following:

cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/76/head
git checkout -b pullrequest f5a03d13022b1e1ba3ba09129e840633982522ac
oc apply -f dspa.pr-76.yaml

More instructions here on how to deploy and test a Data Science Pipelines Application.
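Once the DSPA manifest is applied, a quick smoke check is to list the pods the operator creates. This is a sketch, not part of the PR instructions: the `app=ds-pipeline-<name>` label selector and a logged-in `oc` session are assumptions.

```shell
# Hypothetical smoke check for the pr-76 DSPA.
# Assumes the operator labels component pods with app=ds-pipeline-<DSPA name>.
DSPA_NAME="pr-76"
SELECTOR="app=ds-pipeline-${DSPA_NAME}"
echo "selector: ${SELECTOR}"
# List the DSPA component pods (skipped if oc is not installed or not logged in).
if command -v oc >/dev/null 2>&1; then
  oc get pods -l "${SELECTOR}" || true
fi
```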

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 1 commits between a8fbbd2020d280f4cab31245a9639a350207b3f9...09d488109b1e5278677af9eeb05d5f4fa63774ce

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images has been built to help with testing out this PR (same image tags as listed above).

@rimolive
Author

/hold

Steps need change.

@rimolive
Author

rimolive commented Oct 1, 2024

/unhold

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 1 commits between 0f96e42b1bd0968e898f6912a7b23f0a1285c6db...6c79f96c699c33f1c12f3124a3cbf0a3141f1661

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images has been built to help with testing out this PR (same image tags as listed above).

Clone from a personal fork, and add the remote for upstream and opendatahub, fetching its branches:

```
git clone git@github.com:<user id>/data-science-pipelines
```
Member


Suggested change
git clone git@github.com:<user id>/data-science-pipelines
git clone git@github.com:<user id>/argo-workflows

Author


good catch! Will change that.

Argo Workflows git history diverges completely across versions, so it's important to create a backup branch from the current dsp repo in case we need to revert changes.

```
git checkout -b dsp-backup opendatahub/main
```
Member


Suggested change
git checkout -b dsp-backup opendatahub/main
git fetch opendatahub main
git checkout -b dsp-backup opendatahub/main

Author


If you check the previous section, you will see that I'm already adding the remotes and fetching them. If you think this fetch task must be explicit, then I need to remove the --fetch flag from the git remote add commands.

Member


I think it's fine to leave the --fetch flag, but for repeatability: a user only adds the remotes once, and may therefore skip that section if the remotes already exist. IMO it's safer to re-fetch and remove doubt.
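The remote layout under discussion can be rehearsed locally. The remote names follow this thread, and the URLs follow the repositories it mentions, but treat them as illustrative; nothing is fetched here, so the sketch runs without network access.

```shell
# Sketch of the remote setup discussed above, run against a throwaway repo
# so the commands are reproducible.
rm -rf remotes-demo
git init -q remotes-demo
cd remotes-demo
# Add the two remotes named in the thread (URLs assumed; --fetch omitted here).
git remote add upstream https://github.com/kubeflow/pipelines.git
git remote add opendatahub https://github.com/opendatahub-io/data-science-pipelines.git
# Re-fetching explicitly, as suggested, would then be:
#   git fetch opendatahub main
git remote -v
```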

Comment on lines +94 to +98
### Create the Pull-Request in opendatahub-io/argo-workflows repository

Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`

Member

@gmfrasca Oct 1, 2024


Suggested change
### Create the Pull-Request in opendatahub-io/argo-workflows repository
Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`
### Create a Draft/WIP Pull Request in opendatahub-io/argo-workflows repository
**NOTE**: This is only to show the diff/changeset and accept feedback from the team before proceeding. DO NOT ACTUALLY MERGE THIS
Create a PR with the result of the previous tasks with the following description: `Upgrade argo-workflows code to x.y.z`
### Force-push Version Upgrade branch to main
Upon acceptance of the Draft PR (again, do not actually merge this), force the `opendatahub/main` branch to now match the upgrade version 'feature' branch:
\`\`\`bash
git push -f origin argo-upgrade:main
\`\`\`
Obviously, this will completely overwrite the git history of the `opendatahub/main` remote branch so please ensure a backup branch (`dsp-backup`) was created as instructed above
### Disclaimer / Future Work
This process is obviously very heavy-handed and destructive, and depends on there being no carries or downstream-only commits. We should adjust the procedure to account for this

Member

@gmfrasca Oct 1, 2024


update the codeblock escapes (\`\`\`) before accepting suggestion
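The backup-then-force-push flow in the suggestion above can be rehearsed safely on a throwaway repository. The branch names (`dsp-backup`, `argo-upgrade`) mirror the suggestion; the bare repo below is a local stand-in for the real opendatahub remote, so nothing destructive leaves your machine.

```shell
# Rehearsal of the backup + force-push flow from the suggested change.
rm -rf origin.git work
BASE="$PWD"
git init -q --bare "$BASE/origin.git"          # stand-in for the opendatahub remote
git clone -q "$BASE/origin.git" work
cd work
git -c user.email=demo@example.com -c user.name=demo \
  commit -q --allow-empty -m "existing main history"
git push -q origin HEAD:main
# Backup branch BEFORE any destructive operation, as the doc instructs:
git checkout -q -b dsp-backup
# Upgrade 'feature' branch with new work on it:
git checkout -q -b argo-upgrade
git -c user.email=demo@example.com -c user.name=demo \
  commit -q --allow-empty -m "upgrade argo-workflows code"
# Force main to match the upgrade branch (destructive on a real remote):
git push -q -f origin argo-upgrade:main
git ls-remote --heads origin
```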

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 1 commits between 0f96e42b1bd0968e898f6912a7b23f0a1285c6db...6c3fcd2ffb4471ac7b898e3b7176681cf60283cd

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images has been built to help with testing out this PR (same image tags as listed above).

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 1 commits between 0f96e42b1bd0968e898f6912a7b23f0a1285c6db...96941d2b96471ca3c1dfaa865024b7701ea24a5e

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images has been built to help with testing out this PR (same image tags as listed above).

Member

@DharmitD left a comment


/lgtm

@openshift-ci openshift-ci bot added the lgtm label Oct 9, 2024
@gregsheremeta

content lgtm

This process is obviously very heavy-handed and destructive, and depends on there being no carries or downstream-only commits. We should adjust the procedure to account for this.

Didn't we decide we're going to do the rebase very differently next time and going forward? I recall that we agreed to not do the regular merge + merge conflict resolution (like we did this time) next time.

If that's correct, then I disagree with adding this document to this repository at all. This should just be a WIP google doc.

@rimolive
Author

content lgtm

This process is obviously very heavy-handed and destructive, and depends on there being no carries or downstream-only commits. We should adjust the procedure to account for this.

Didn't we decide we're going to do the rebase very differently next time and going forward? I recall that we agreed to not do the regular merge + merge conflict resolution (like we did this time) next time.

If that's correct, then I disagree with adding this document to this repository at all. This should just be a WIP google doc.

That wasn't my understanding; we should keep using the same process for future kfp upgrades. Anyway, I believe we can only confirm that when we upgrade to KFP 2.3.0, so let me know if I should keep this PR open until the next KFP upgrade or close it.

@gregsheremeta

/lgtm

@HumairAK HumairAK merged commit e27b687 into opendatahub-io:master Nov 8, 2024
1 of 2 checks passed
@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans. 
If these are not commits from upstream kfp, then please ensure
you adhere to the commit checker formatting
commitchecker version unknown
Validating 0 commits between e27b68731776c35efe5e363295c3cc4a782cec4f...96941d2b96471ca3c1dfaa865024b7701ea24a5e

7 participants