Commit

update: remove dsp with v1(tekton)backend related code (opendatahub-io#1281)

* update: remove dsp with v1(tekton)backend related code

- images
- tekton rbac
- descriptions

Signed-off-by: Wen Zhou <[email protected]>
Co-authored-by: Humair Khan <[email protected]>

---------

Signed-off-by: Wen Zhou <[email protected]>
Co-authored-by: Humair Khan <[email protected]>
zdtsw and HumairAK authored Oct 27, 2024
1 parent bde4b4e commit 133b710
Showing 6 changed files with 17 additions and 52 deletions.
22 changes: 5 additions & 17 deletions bundle/manifests/opendatahub-operator.clusterserviceversion.yaml
@@ -175,11 +175,11 @@ spec:
 built on top of Kubeflow Notebook Controller with support for OAuth\n* Jupyter
 Notebooks - JupyterLab notebook that provide Python support for GPU workloads\n*
 Data Science Pipelines - Pipeline solution for end to end MLOps workflows that
-support the Kubeflow Pipelines SDK and Tekton\n* Model Mesh - ModelMesh Serving
-is the Controller for managing ModelMesh, a general-purpose model serving management/routing
-layer\n* Distributed Workloads(Incubation) - Stack built to make managing distributed
-compute infrastructure in the cloud easy and intuitive for Data Scientists. This
-stack consists of three components \n Codeflare
+support the Kubeflow Pipelines SDK and Argo Workflows\n* Model Mesh - ModelMesh
+Serving is the Controller for managing ModelMesh, a general-purpose model serving
+management/routing layer\n* Distributed Workloads(Incubation) - Stack built to
+make managing distributed compute infrastructure in the cloud easy and intuitive
+for Data Scientists. This stack consists of three components \n Codeflare
 , KubeRay and Kueue.\n* Kserve - Kserve is the Controller for for serving machine
 learning (ML) models on arbitrary frameworks"
 displayName: Open Data Hub Operator
displayName: Open Data Hub Operator
@@ -491,12 +491,6 @@ spec:
 - patch
 - update
 - watch
-- apiGroups:
-- custom.tekton.dev
-resources:
-- pipelineloops
-verbs:
-- '*'
 - apiGroups:
 - dashboard.opendatahub.io
 resources:
@@ -1024,12 +1018,6 @@ spec:
 - delete
 - get
 - patch
-- apiGroups:
-- tekton.dev
-resources:
-- '*'
-verbs:
-- '*'
 - apiGroups:
 - template.openshift.io
 resources:
27 changes: 10 additions & 17 deletions components/datasciencepipelines/datasciencepipelines.go
@@ -1,5 +1,5 @@
 // Package datasciencepipelines provides utility functions to config Data Science Pipelines:
-// Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK, Tekton and Argo Workflows.
+// Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Argo Workflows.
 // +groupName=datasciencecluster.opendatahub.io
 package datasciencepipelines

@@ -45,22 +45,15 @@ func (d *DataSciencePipelines) Init(ctx context.Context, _ cluster.Platform) err
 log := logf.FromContext(ctx).WithName(ComponentName)

 var imageParamMap = map[string]string{
-// v1
-"IMAGES_APISERVER": "RELATED_IMAGE_ODH_ML_PIPELINES_API_SERVER_IMAGE",
-"IMAGES_ARTIFACT": "RELATED_IMAGE_ODH_ML_PIPELINES_ARTIFACT_MANAGER_IMAGE",
-"IMAGES_PERSISTENTAGENT": "RELATED_IMAGE_ODH_ML_PIPELINES_PERSISTENCEAGENT_IMAGE",
-"IMAGES_SCHEDULEDWORKFLOW": "RELATED_IMAGE_ODH_ML_PIPELINES_SCHEDULEDWORKFLOW_IMAGE",
-"IMAGES_CACHE": "RELATED_IMAGE_ODH_ML_PIPELINES_CACHE_IMAGE",
-"IMAGES_DSPO": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_OPERATOR_CONTROLLER_IMAGE",
-// v2
-"IMAGESV2_ARGO_APISERVER": "RELATED_IMAGE_ODH_ML_PIPELINES_API_SERVER_V2_IMAGE",
-"IMAGESV2_ARGO_PERSISTENCEAGENT": "RELATED_IMAGE_ODH_ML_PIPELINES_PERSISTENCEAGENT_V2_IMAGE",
-"IMAGESV2_ARGO_SCHEDULEDWORKFLOW": "RELATED_IMAGE_ODH_ML_PIPELINES_SCHEDULEDWORKFLOW_V2_IMAGE",
-"IMAGESV2_ARGO_ARGOEXEC": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_ARGO_ARGOEXEC_IMAGE",
-"IMAGESV2_ARGO_WORKFLOWCONTROLLER": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_ARGO_WORKFLOWCONTROLLER_IMAGE",
-"V2_DRIVER_IMAGE": "RELATED_IMAGE_ODH_ML_PIPELINES_DRIVER_IMAGE",
-"V2_LAUNCHER_IMAGE": "RELATED_IMAGE_ODH_ML_PIPELINES_LAUNCHER_IMAGE",
-"IMAGESV2_ARGO_MLMDGRPC": "RELATED_IMAGE_ODH_MLMD_GRPC_SERVER_IMAGE",
+"IMAGES_DSPO": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_OPERATOR_CONTROLLER_IMAGE",
+"IMAGES_APISERVER": "RELATED_IMAGE_ODH_ML_PIPELINES_API_SERVER_V2_IMAGE",
+"IMAGES_PERSISTENCEAGENT": "RELATED_IMAGE_ODH_ML_PIPELINES_PERSISTENCEAGENT_V2_IMAGE",
+"IMAGES_SCHEDULEDWORKFLOW": "RELATED_IMAGE_ODH_ML_PIPELINES_SCHEDULEDWORKFLOW_V2_IMAGE",
+"IMAGES_ARGO_EXEC": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_ARGO_ARGOEXEC_IMAGE",
+"IMAGES_ARGO_WORKFLOWCONTROLLER": "RELATED_IMAGE_ODH_DATA_SCIENCE_PIPELINES_ARGO_WORKFLOWCONTROLLER_IMAGE",
+"IMAGES_DRIVER": "RELATED_IMAGE_ODH_ML_PIPELINES_DRIVER_IMAGE",
+"IMAGES_LAUNCHER": "RELATED_IMAGE_ODH_ML_PIPELINES_LAUNCHER_IMAGE",
+"IMAGES_MLMDGRPC": "RELATED_IMAGE_ODH_MLMD_GRPC_SERVER_IMAGE",
 }

 if err := deploy.ApplyParams(Path, imageParamMap); err != nil {
2 changes: 1 addition & 1 deletion config/manifests/description-patch.yml
@@ -13,7 +13,7 @@ spec:
 * Open Data Hub Dashboard - A web dashboard that displays installed Open Data Hub components with easy access to component UIs and documentation
 * ODH Notebook Controller - Secure management of Jupyter Notebook in Kubernetes environments built on top of Kubeflow Notebook Controller with support for OAuth
 * Jupyter Notebooks - JupyterLab notebook that provide Python support for GPU workloads
-* Data Science Pipelines - Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Tekton
+* Data Science Pipelines - Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Argo Workflows
 * Model Mesh - ModelMesh Serving is the Controller for managing ModelMesh, a general-purpose model serving management/routing layer
 * Distributed Workloads(Incubation) - Stack built to make managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists. This stack consists of three components
 Codeflare , KubeRay and Kueue.
12 changes: 0 additions & 12 deletions config/rbac/role.yaml
@@ -305,12 +305,6 @@ rules:
 - patch
 - update
 - watch
-- apiGroups:
-- custom.tekton.dev
-resources:
-- pipelineloops
-verbs:
-- '*'
 - apiGroups:
 - dashboard.opendatahub.io
 resources:
@@ -838,12 +832,6 @@ rules:
 - delete
 - get
 - patch
-- apiGroups:
-- tekton.dev
-resources:
-- '*'
-verbs:
-- '*'
 - apiGroups:
 - template.openshift.io
 resources:
4 changes: 0 additions & 4 deletions controllers/datasciencecluster/kubebuilder_rbac.go
@@ -52,8 +52,6 @@ package datasciencecluster

 // +kubebuilder:rbac:groups="template.openshift.io",resources=templates,verbs=*

-// +kubebuilder:rbac:groups="tekton.dev",resources=*,verbs=*
-
 // +kubebuilder:rbac:groups="snapshot.storage.k8s.io",resources=volumesnapshots,verbs=create;delete;patch;get

 // +kubebuilder:rbac:groups="serving.kserve.io",resources=trainedmodels/status,verbs=update;patch;delete;get
@@ -151,8 +149,6 @@ package datasciencecluster
 // +kubebuilder:rbac:groups="extensions",resources=replicasets,verbs=*
 // +kubebuilder:rbac:groups="extensions",resources=ingresses,verbs=list;watch;patch;delete;get

-// +kubebuilder:rbac:groups="custom.tekton.dev",resources=pipelineloops,verbs=*
-
 // +kubebuilder:rbac:groups="core",resources=services/finalizers,verbs=create;delete;list;update;watch;patch;get
 // +kubebuilder:rbac:groups="core",resources=services,verbs=get;create;watch;update;patch;list;delete
 // +kubebuilder:rbac:groups="core",resources=services,verbs=*
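Each deleted `// +kubebuilder:rbac` marker is what controller-gen turns into a rule in the generated `config/rbac/role.yaml`, which is why the marker deletions here pair with the matching rule deletions in that file. As an illustrative sketch (not controller-gen's actual parser), a marker's comma-separated fields map directly onto the rule's `apiGroups`, `resources`, and `verbs`:

```go
package main

import (
	"fmt"
	"strings"
)

// parseRBACMarker is a toy parser, for illustration only, showing which
// fields of a +kubebuilder:rbac marker feed the generated RBAC rule.
// It does not reproduce controller-gen's real marker grammar.
func parseRBACMarker(marker string) map[string][]string {
	rule := map[string][]string{}
	body := strings.TrimPrefix(marker, "// +kubebuilder:rbac:")
	for _, field := range strings.Split(body, ",") {
		kv := strings.SplitN(field, "=", 2)
		if len(kv) != 2 {
			continue
		}
		// Values are quoted strings or bare tokens, with ';' separating lists.
		rule[kv[0]] = strings.Split(strings.Trim(kv[1], `"`), ";")
	}
	return rule
}

func main() {
	// The marker removed by this commit.
	r := parseRBACMarker(`// +kubebuilder:rbac:groups="tekton.dev",resources=*,verbs=*`)
	fmt.Println(r["groups"], r["resources"], r["verbs"])
}
```

Dropping the marker and re-running code generation is what removes the corresponding `tekton.dev` rule from `config/rbac/role.yaml` and the bundled ClusterServiceVersion, keeping the three locations in sync.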
2 changes: 1 addition & 1 deletion docs/api-overview.md
@@ -126,7 +126,7 @@ _Appears in:_
 ## datasciencecluster.opendatahub.io/datasciencepipelines

 Package datasciencepipelines provides utility functions to config Data Science Pipelines:
-Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK, Tekton and Argo Workflows.
+Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Argo Workflows.