This is a general overview of the new Operator Bundle certification. Detailed instructions can be found here.
The Project ID can be obtained from connect.redhat.com
- Navigate to connect.redhat.com
- Click the login icon
- Click the Log in for technology partners button
- Log in with your Red Hat credentials
- In the top navigation click the Product certification dropdown
- In the dropdown click the Manage certification projects link
- Click the project link with Type: Operator Bundle Image
- From the Operator Bundle Image project page, copy the project ID from the URL.
Example project ID: 617954f8c41de7ab3fe03dc3
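The project ID is the 24-character hex segment of the project URL. As a sketch (the exact URL shape below is an assumption based on the example ID above), it can be extracted in a shell:

```shell
# Hypothetical project URL as copied from the browser address bar;
# the segment after /projects/ is the 24-character hex project ID.
url="https://connect.redhat.com/projects/617954f8c41de7ab3fe03dc3/overview"
project_id=$(printf '%s' "$url" | sed -n 's|.*/projects/\([0-9a-f]\{24\}\).*|\1|p')
echo "$project_id"
```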
The container API key can be obtained from connect.redhat.com
- Navigate to connect.redhat.com
- Click the login icon
- Click the Log in for technology partners button
- Log in with your Red Hat credentials
- In the top navigation click the Product certification dropdown
- In the dropdown click the Container API keys link
- From the My API Keys screen click the Generate New Key button
- In the popup screen enter a descriptive Key Name and click the Save button
- Your API Key will be generated and displayed on screen.
Important: Copy the API Key from the screen, as you won't be able to view it again. If you have issues obtaining an API Key, please reach out to the Success Desk by opening a Red Hat Help Request.
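The pipeline reads this API key from a Kubernetes secret in your cluster. A minimal sketch of such a secret is shown below; the secret name `pyxis-api-secret` and key `pyxis_api_key` are assumptions here, so use the exact names given in the Pipeline Instructions:

```yaml
# Hypothetical secret holding the container API key; the names used here
# are assumptions -- follow the Pipeline Instructions for the exact ones.
apiVersion: v1
kind: Secret
metadata:
  name: pyxis-api-secret
type: Opaque
stringData:
  pyxis_api_key: "<paste your API key here>"
```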
The OpenShift Operator Certification Pipeline is based upon the open source cloud-native CI/CD project Tekton.
To install the Pipeline, follow the Pipeline Instructions; in particular you will need to execute Steps 1 through 7.
The OpenShift Operator Certification workflow is based upon Tekton Pipelines and GitHub Pull Requests. In this step you create a forked copy of the Red Hat Certified Operators repository.
Instructions for forking a GitHub repo can be found here.
Once you have forked the upstream repo you will need to add your Operator bundle to the forked repo. The forked repo will have a directory structure similar to the structure outlined below.
├── config.yaml
├── operators
└── my-operator
├── 1.4.6
│ ├── manifests
│ │ ├── cache.example.com_my-operators.yaml
│ │ ├── my-operator-controller-manager-metrics-service_v1_service.yaml
│ │ ├── my-operator-manager-config_v1_configmap.yaml
│ │ ├── my-operator-metrics-reader_rbac.authorization.k8s.io_v1_clusterrole.yaml
│ │ └── my-operator.clusterserviceversion.yaml
│ └── metadata
│ └── annotations.yaml
└── ci.yaml
Take note of the operators directory in the forked repo. Add your Operator Bundle under this operators directory following the example format.
- Under the operators directory, create a new directory with the name of your operator.
- Inside of this newly created directory add your ci.yaml.
- Also, under the new directory create a subdirectory for each version of your Operator.
- In each version directory there should be a manifests/ directory containing your OpenShift yaml files and a metadata/ directory containing your annotations.yaml file.
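The steps above can be sketched in a shell, using the hypothetical operator name and version from the example tree:

```shell
# Sketch: create the expected bundle layout for a hypothetical
# operator "my-operator" at version 1.4.6.
repo="$(mktemp -d)"   # stand-in for a clone of your forked repo
cd "$repo"
mkdir -p operators/my-operator/1.4.6/manifests
mkdir -p operators/my-operator/1.4.6/metadata
touch operators/my-operator/ci.yaml                          # per-operator CI config
touch operators/my-operator/1.4.6/metadata/annotations.yaml
# Copy your CSV and other OpenShift yaml files into 1.4.6/manifests/.
```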
For details on creating an Operator Bundle, see the instructions here.
Once your Operator Bundle has been added to your forked version of the Preprod repo, you are ready to run the pipeline.
This is the most basic pipeline run. It does not include digest pinning, which will be required for submission, but the minimal pipeline run is useful for testing and iteration. The minimal pipeline run leverages the internal OpenShift container registry.
Instructions for a minimal pipeline run.
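As an illustration only, a minimal run is started with the tkn CLI roughly as follows; the pipeline and parameter names below are assumptions, so use the exact invocation from the Pipeline Instructions:

```shell
# Illustrative sketch -- pipeline and parameter names are assumptions;
# point the run at your fork and the bundle path you added above.
tkn pipeline start operator-ci-pipeline \
  --param git_repo_url="git@github.com:<your-org>/certified-operators.git" \
  --param git_branch="my-operator-1.4.6" \
  --param bundle_path="operators/my-operator/1.4.6" \
  --showlog
```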
Digest pinning is required when submitting the results to Red Hat. Container images can be referenced by tags or by SHA digest. For Red Hat Certification we require the SHA digest for all containers, because tags are mutable and can be changed underneath us, whereas SHA digests are immutable. The digest pinning function in the pipeline will scan through all the manifest files looking for tags and, if found, will replace the tags with SHA digests.
Instructions for a pipeline run with image digest pinning.
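For illustration, the difference looks like this in a manifest (the image name and digest below are made-up examples):

```yaml
# Tag reference (mutable) -- not accepted for certification submission:
image: registry.example.com/my-operator:v1.4.6

# Digest-pinned reference (immutable) -- what the pinning step produces:
image: registry.example.com/my-operator@sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
```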
By default the Certification Pipeline will run with the OpenShift internal registry. If you want to use a different public or private registry, that is supported as well.
Instructions for running the pipeline with an external public or private registry.
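When using an external registry, the pipeline needs credentials, typically supplied as a docker-config secret. A hedged sketch follows; the secret name is an assumption, and the linked instructions give the exact name and flags:

```shell
# Sketch -- log in to the external registry, then store the resulting
# auth file as a docker-config secret for the pipeline (names assumed):
podman login registry.example.com
oc create secret generic registry-dockerconfig-secret \
  --type kubernetes.io/dockerconfigjson \
  --from-file .dockerconfigjson="${XDG_RUNTIME_DIR}/containers/auth.json"
```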
With the CI Pipeline you will be able to iterate on your Operator Bundle to ensure it meets all the requirements for Red Hat Certification. When you execute the pipeline you will be able to view logs that contain details about any errors or failures that need to be addressed before obtaining certification.
When executing the pipeline using the tkn CLI tool, logs will be printed in the terminal.
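For example, to re-attach to the logs of the most recent run, the standard tkn subcommand can be used (this assumes a PipelineRun already exists in the current namespace):

```shell
# Follow the logs of the most recent PipelineRun in the current namespace.
tkn pipelinerun logs --last -f
```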
If you have access to the OpenShift Console you can review the results of the pipeline and the logs in the Console as well.
Once you have successful results for all the certification checks you can then submit the results to Red Hat for verification and publication in the Operator catalog.