Add more detailed instructions on how to check pre-release candidates #32948

Merged
111 changes: 98 additions & 13 deletions TESTING.rst
@@ -1442,6 +1442,104 @@ You can also run complete k8s tests with
This will create the cluster, build the images, deploy Airflow, run the tests and finally delete the cluster - all as a single
command. This is the way it is run in our CI; you can also run such complete tests in parallel.

Manually testing release candidate packages
===========================================

Breeze can be used to test new release candidates of packages - both Airflow and providers. You can easily
turn the Breeze CI image into an environment that installs and starts Airflow with both Airflow and provider
packages - whether the packages are built from sources or downloaded from PyPI once they are released there
as release candidates.

The way to test it is rather straightforward:

1) Make sure that the packages - both ``airflow`` and ``providers`` - are placed in the ``dist`` folder
   of your Airflow source tree. You can either build them there or download them from PyPI (see the next chapter).

2) Run the ``breeze shell`` or ``breeze start-airflow`` command with the following flags added:
   ``--mount-sources remove`` and ``--use-packages-from-dist``. The first one removes the ``airflow``
   source tree from the container when starting it, the second one installs the ``airflow`` and ``providers``
   packages from the ``dist`` folder when entering Breeze, as shown in the example below.
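
For example, a minimal invocation that installs whatever is already in ``dist`` and starts Airflow could look like this:

.. code-block:: bash

    breeze start-airflow --mount-sources remove --use-packages-from-dist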

Testing pre-release packages
----------------------------

There are two ways to get Airflow packages into the ``dist`` folder - by building them from sources or by
downloading them from PyPI.

.. note::

   Make sure you run ``rm dist/*`` before you start building packages or downloading them from PyPI, because
   packages that are already present in the ``dist`` folder are not removed automatically.

In order to build apache-airflow from sources, you need to run the following command:

.. code-block:: bash

    breeze release-management prepare-airflow-package

In order to build providers from sources, you need to run the following command:

.. code-block:: bash

    breeze release-management prepare-provider-packages <PROVIDER_1> <PROVIDER_2> ... <PROVIDER_N>

The packages are built in the ``dist`` folder and the command will summarise which packages are available in the
``dist`` folder after it finishes.
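
For example, building only the ``celery`` and ``kubernetes`` providers looks like this:

.. code-block:: bash

    breeze release-management prepare-provider-packages celery cncf.kubernetes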

If you want to download the packages from PyPI, you need to run the following command:

.. code-block:: bash

    pip download apache-airflow-providers-<PROVIDER_NAME>==X.Y.Zrc1 --dest dist --no-deps

You can use it for both release and pre-release packages.

Examples of testing pre-release packages
----------------------------------------

A few examples below explain how you can test pre-release packages, and combine them with locally built
and released packages.

The following example downloads the ``apache-airflow`` package and the ``celery`` and ``kubernetes`` provider packages
from PyPI and eventually starts Airflow with the Celery Executor. It also loads example dags and default connections:

.. code:: bash

    rm dist/*
    pip download apache-airflow==2.7.0rc1 --dest dist --no-deps
    pip download apache-airflow-providers-cncf-kubernetes==7.4.0rc1 --dest dist --no-deps
    pip download apache-airflow-providers-celery==3.3.0rc1 --dest dist --no-deps
    breeze start-airflow --mount-sources remove --use-packages-from-dist --executor CeleryExecutor --load-default-connections --load-example-dags

The following example downloads the ``celery`` and ``kubernetes`` provider packages from PyPI, builds the
``apache-airflow`` package from the main sources and eventually starts Airflow with the Celery Executor.
It also loads example dags and default connections:

.. code:: bash

    rm dist/*
    breeze release-management prepare-airflow-package
    pip download apache-airflow-providers-cncf-kubernetes==7.4.0rc1 --dest dist --no-deps
    pip download apache-airflow-providers-celery==3.3.0rc1 --dest dist --no-deps
    breeze start-airflow --mount-sources remove --use-packages-from-dist --executor CeleryExecutor --load-default-connections --load-example-dags

The following example builds the ``celery`` and ``kubernetes`` provider packages from sources, downloads the 2.6.3 version
of the ``apache-airflow`` package from PyPI and eventually starts Airflow using the default executor
for the backend chosen (no example dags, no default connections):

.. code:: bash

    rm dist/*
    pip download apache-airflow==2.6.3 --dest dist --no-deps
    breeze release-management prepare-provider-packages celery cncf.kubernetes
    breeze start-airflow --mount-sources remove --use-packages-from-dist

You can mix and match packages from PyPI (final releases or release candidates) with locally built packages. You
can also choose which providers to install this way, since the ``--mount-sources remove`` flag makes sure that the
installed Airflow does not contain all the providers - only those that you explicitly downloaded or built in the
``dist`` folder. This way you can test all the combinations of Airflow + Providers you might need.
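
For example, here is a sketch of mixing a pre-release ``apache-airflow`` package downloaded from PyPI with a
``celery`` provider built from your local sources (the versions used are purely illustrative):

.. code:: bash

    rm dist/*
    pip download apache-airflow==2.7.0rc1 --dest dist --no-deps
    breeze release-management prepare-provider-packages celery
    breeze start-airflow --mount-sources remove --use-packages-from-dist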


Airflow System Tests
====================
@@ -1562,19 +1660,6 @@ A simple example of a system test is available in:

It runs two DAGs defined in ``airflow.providers.google.cloud.example_dags.example_compute.py``.

Preparing provider packages for System Tests for Airflow 1.10.* series
----------------------------------------------------------------------

To run system tests with the older Airflow version, you need to prepare provider packages. This
can be done by running ``./breeze-legacy prepare-provider-packages <PACKAGES TO BUILD>``. For
example, the below command will build google, postgres and mysql wheel packages:

.. code-block:: bash

    breeze release-management prepare-provider-packages google postgres mysql

Those packages will be prepared in ./dist folder. This folder is mapped to /dist folder
when you enter Breeze, so it is easy to automate installing those packages for testing.

The typical system test session
-------------------------------
30 changes: 21 additions & 9 deletions dev/README_RELEASE_AIRFLOW.md
@@ -29,13 +29,14 @@
- [Build RC artifacts](#build-rc-artifacts)
- [Prepare production Docker Image RC](#prepare-production-docker-image-rc)
- [Prepare Vote email on the Apache Airflow release candidate](#prepare-vote-email-on-the-apache-airflow-release-candidate)
- [Verify the release candidate by PMCs](#verify-the-release-candidate-by-pmcs)
- [Verify the release candidate by PMC members](#verify-the-release-candidate-by-pmc-members)
- [SVN check](#svn-check)
- [Licence check](#licence-check)
- [Signature check](#signature-check)
- [SHA512 sum check](#sha512-sum-check)
- [Source code check](#source-code-check)
- [Verify release candidates by Contributors](#verify-release-candidates-by-contributors)
- [Verify the release candidate by Contributors](#verify-the-release-candidate-by-contributors)
- [Installing release candidate in your local virtual environment](#installing-release-candidate-in-your-local-virtual-environment)
- [Publish the final Apache Airflow release](#publish-the-final-apache-airflow-release)
- [Summarize the voting for the Apache Airflow release](#summarize-the-voting-for-the-apache-airflow-release)
- [Publish release to SVN](#publish-release-to-svn)
@@ -376,8 +377,12 @@ Please vote accordingly:
Only votes from PMC members are binding, but all members of the community
are encouraged to test the release and vote with "(non-binding)".
The test procedure for PMCs and Contributors who would like to test this RC are described in
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmcs
The test procedure for PMC members is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-pmc-members
The test procedure for Contributors who would like to test this RC is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_AIRFLOW.md#verify-the-release-candidate-by-contributors
Please note that the version number excludes the \`rcX\` string, so it's now
simply ${VERSION_WITHOUT_RC}. This will allow us to rename the artifact without modifying
@@ -416,9 +421,9 @@ EOF
Note: for RC2/RC3 you may refer to a shortened vote period, as agreed in the mailing list [thread](https://lists.apache.org/thread/cv194w1fqqykrhswhmm54zy9gnnv6kgm).
# Verify the release candidate by PMCs
# Verify the release candidate by PMC members
The PMCs should verify the releases in order to make sure the release is following the
PMC members should verify the releases in order to make sure the release is following the
[Apache Legal Release Policy](http://www.apache.org/legal/release-policy.html).
At least 3 (+1) votes should be recorded in accordance to
@@ -646,13 +651,17 @@ Only in /Users/jarek/code/airflow: .bash_history
...
```

# Verify release candidates by Contributors
# Verify the release candidate by Contributors

This can be done (and we encourage it) by any of the Contributors. In fact, it's best if the
actual users of Apache Airflow test it in their own staging/test installations. Each release candidate
is available on PyPI apart from SVN packages, so everyone should be able to install
the release candidate version of Airflow via simply (<VERSION> is 2.0.2 for example, and <X> is
release candidate number 1,2,3,....).
the release candidate version.
But you can use any of the installation methods you prefer (you can even install it via the binary wheels
downloaded from the SVN).
## Installing release candidate in your local virtual environment
```shell script
pip install apache-airflow==<VERSION>rc<X>
@@ -680,6 +689,9 @@ breeze start-airflow --use-airflow-version <VERSION>rc<X> --python 3.8 --backend postgres
Once you install and run Airflow, you should perform any verification you see as necessary to check
that Airflow works as you expect.
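
For example, a few basic smoke checks (by no means exhaustive - adapt them to your own setup) could look like this:

```shell script
airflow version
airflow info
airflow dags list
```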
Breeze also allows you to easily build and install pre-release candidates, including providers, by following the
simple instructions described in [TESTING.rst](https://github.com/apache/airflow/blob/main/TESTING.rst#testing-pre-release-packages).
# Publish the final Apache Airflow release
## Summarize the voting for the Apache Airflow release
22 changes: 12 additions & 10 deletions dev/README_RELEASE_PROVIDER_PACKAGES.md
@@ -36,8 +36,8 @@
- [Prepare documentation](#prepare-documentation)
- [Prepare issue in GitHub to keep status of testing](#prepare-issue-in-github-to-keep-status-of-testing)
- [Prepare voting email for Providers release candidate](#prepare-voting-email-for-providers-release-candidate)
- [Verify the release by PMC members](#verify-the-release-by-pmc-members)
- [Verify by Contributors](#verify-by-contributors)
- [Verify the release candidate by PMC members](#verify-the-release-candidate-by-pmc-members)
- [Verify the release candidate by Contributors](#verify-the-release-candidate-by-contributors)
- [Publish release](#publish-release)
- [Summarize the voting for the Apache Airflow release](#summarize-the-voting-for-the-apache-airflow-release)
- [Publish release to SVN](#publish-release-to-svn)
@@ -483,12 +483,11 @@ https://dist.apache.org/repos/dist/dev/airflow/providers/
*apache_airflow_providers_<PROVIDER>-*.whl are the binary
Python "wheel" release.

The test procedure for PMC members who would like to test the RC candidates are described in
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-by-pmc-members
The test procedure for PMC members is described in
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-pmc-members

and for Contributors:

https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors
The test procedure for Contributors who would like to test this RC is described in:
https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-contributors


Public keys are available at:
Expand Down Expand Up @@ -530,7 +529,7 @@ Please modify the message above accordingly to clearly exclude those packages.

Note: for RC2/RC3 you may refer to a shortened vote period, as agreed in the mailing list [thread](https://lists.apache.org/thread/cv194w1fqqykrhswhmm54zy9gnnv6kgm).

## Verify the release by PMC members
## Verify the release candidate by PMC members

### SVN check

@@ -680,14 +679,17 @@ Checking apache-airflow-providers-google-1.0.0rc1.tar.gz.sha512
Checking apache_airflow-providers-google-1.0.0rc1-py3-none-any.whl.sha512
```

## Verify by Contributors
## Verify the release candidate by Contributors

This can be done (and we encourage it) by any of the Contributors. In fact, it's best if the
actual users of Apache Airflow test it in their own staging/test installations. Each release candidate
is available on PyPI apart from SVN packages, so everyone should be able to install
the release candidate version.
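
For example, installing a provider release candidate from PyPI (the exact package name and version come from
the vote email) can be as simple as:

```shell script
pip install apache-airflow-providers-<PROVIDER>==<VERSION>rc<X>
```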

You can use any of the installation methods you prefer (you can even install it via the binary wheels
Breeze allows you to easily install and run pre-release candidates by following simple instructions
described in [TESTING.rst](https://github.com/apache/airflow/blob/main/TESTING.rst#testing-pre-release-packages)

But you can use any of the installation methods you prefer (you can even install it via the binary wheels
downloaded from the SVN).

### Installing in your local virtualenv