Capitalize names in docs (apache#19893)

Co-authored-by: Bas Harenslak <[email protected]>
BasPH authored Nov 30, 2021
1 parent de9fa7b commit 9a469d8
Showing 35 changed files with 55 additions and 55 deletions.

2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-apache-drill/index.rst
@@ -69,7 +69,7 @@ are in ``airflow.providers.apache.drill`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-apache-drill``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-apache-druid/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.apache.druid`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-apache-druid``

PIP requirements
@@ -56,7 +56,7 @@ Kube config (JSON format)
that used to connect to Kubernetes client.

Namespace
Default kubernetes namespace for the connection.
Default Kubernetes namespace for the connection.

When specifying the connection in environment variable you should specify
it using URI syntax.
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-dingding/index.rst
@@ -69,7 +69,7 @@ are in ``airflow.providers.dingding`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-dingding``

Cross provider package dependencies
10 changes: 5 additions & 5 deletions docs/apache-airflow-providers-docker/connections/docker.rst
@@ -27,7 +27,7 @@ The Docker connection type enables connection to the Docker registry.
Authenticating to Docker
------------------------

Authenticate to docker by using the login information for docker registry.
Authenticate to Docker by using the login information for Docker registry.
More information on `Docker authentication here
<https://docker-py.readthedocs.io/en/1.2.3/api/>`_.

@@ -40,13 +40,13 @@ Configuring the Connection
--------------------------

Login
Specify the docker registry username.
Specify the Docker registry username.

Password
Specify the docker registry plaintext password.
Specify the Docker registry plaintext password.

Host
Specify the URL to the docker registry. Ex: ``https://index.docker.io/v1``
Specify the URL to the Docker registry. Ex: ``https://index.docker.io/v1``

Port (optional)
Specify the port if not specified in host.
@@ -56,7 +56,7 @@ Extra
The following parameters are all optional:

* ``email``: Specify the email used for the registry account.
* ``reauth``: Specify whether refresh existing authentication on the docker server. (bool)
* ``reauth``: Specify whether refresh existing authentication on the Docker server. (bool)

When specifying the connection in environment variable you should specify
it using URI syntax.
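
A minimal sketch of that environment-variable form for this connection type, assuming a connection id of ``docker_default``; the exact URI scheme and URL-encoding are assumptions to verify against the provider docs:

.. code-block:: python

    import os

    # Airflow reads connections from variables named AIRFLOW_CONN_<CONN_ID> (upper-cased).
    # The hypothetical URI below packs the fields described above: login, password,
    # URL-encoded registry host, port, and the optional ``email``/``reauth`` extras.
    os.environ["AIRFLOW_CONN_DOCKER_DEFAULT"] = (
        "docker://myuser:mypassword@https%3A%2F%2Findex.docker.io%2Fv1:80"
        "?email=myuser%40example.com&reauth=false"
    )
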
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-grpc/index.rst
@@ -68,7 +68,7 @@ are in ``airflow.providers.grpc`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-grpc``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-jenkins/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.jenkins`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-jenkins``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-mongo/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.mongo`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-mongo``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-mysql/index.rst
@@ -70,7 +70,7 @@ are in ``airflow.providers.mysql`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-mysql``

PIP requirements
@@ -38,7 +38,7 @@ Password (required)
Specify the password to connect.

Extra (optional)
Specify the extra parameters (as json dictionary) that can be used in postgres
Specify the extra parameters (as json dictionary) that can be used in Postgres
connection. The following parameters out of the standard python parameters
are supported:

@@ -34,8 +34,8 @@ Under the hood, the :class:`~airflow.providers.postgres.operators.postgres.Postg
Common Database Operations with PostgresOperator
------------------------------------------------

To use the postgres operator to carry out SQL request, two parameters are required: ``sql`` and ``postgres_conn_id``.
These two parameters are eventually fed to the postgres hook object that interacts directly with the postgres database.
To use the PostgresOperator to carry out SQL request, two parameters are required: ``sql`` and ``postgres_conn_id``.
These two parameters are eventually fed to the PostgresHook object that interacts directly with the Postgres database.
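
A minimal sketch of those two parameters in use, assuming a connection named ``postgres_default`` already exists; the DAG id and SQL are placeholders:

.. code-block:: python

    import pendulum
    from airflow import DAG
    from airflow.providers.postgres.operators.postgres import PostgresOperator

    with DAG(
        dag_id="postgres_operator_sketch",
        start_date=pendulum.datetime(2021, 1, 1, tz="UTC"),
        schedule_interval=None,
        catchup=False,
    ):
        # ``sql`` holds the statement to run and ``postgres_conn_id`` names the
        # Airflow connection the underlying PostgresHook will use.
        check_connection = PostgresOperator(
            task_id="check_connection",
            postgres_conn_id="postgres_default",
            sql="SELECT 1;",
        )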

Creating a Postgres database table
----------------------------------
@@ -100,10 +100,10 @@ We can then create a PostgresOperator task that populate the ``pet`` table.
)
Fetching records from your postgres database table
Fetching records from your Postgres database table
--------------------------------------------------

Fetching records from your postgres database table can be as simple as:
Fetching records from your Postgres database table can be as simple as:

.. code-block:: python
@@ -171,5 +171,5 @@ Conclusion
In this how-to guide we explored the Apache Airflow PostgreOperator. Let's quickly highlight the key takeaways.
In Airflow-2.0, PostgresOperator class now resides in the ``providers`` package. It is best practice to create subdirectory
called ``sql`` in your ``dags`` directory where you can store your sql files. This will make your code more elegant and more
maintainable. And finally, we looked at the different ways you can dynamically pass parameters into our postgres operator
tasks using ``parameters`` or ``params`` attribute.
maintainable. And finally, we looked at the different ways you can dynamically pass parameters into our PostgresOperator
tasks using ``parameters`` or ``params`` attribute.
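
A short sketch of the ``parameters`` route mentioned here, using the guide's ``pet`` table and placeholder dates (shown outside a DAG for brevity):

.. code-block:: python

    from airflow.providers.postgres.operators.postgres import PostgresOperator

    # Values in ``parameters`` are bound by the database driver rather than
    # interpolated into the SQL text, so they are passed separately.
    get_birth_date = PostgresOperator(
        task_id="get_birth_date",
        postgres_conn_id="postgres_default",
        sql="SELECT * FROM pet WHERE birth_date BETWEEN %(begin_date)s AND %(end_date)s",
        parameters={"begin_date": "2020-01-01", "end_date": "2020-12-31"},
    )
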
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-segment/index.rst
@@ -62,7 +62,7 @@ are in ``airflow.providers.segment`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-segment``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-singularity/index.rst
@@ -63,7 +63,7 @@ are in ``airflow.providers.singularity`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-singularity``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-sqlite/index.rst
@@ -75,7 +75,7 @@ are in ``airflow.providers.sqlite`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-sqlite``

.. include:: ../../airflow/providers/sqlite/CHANGELOG.rst
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-ssh/index.rst
@@ -68,7 +68,7 @@ are in ``airflow.providers.ssh`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-ssh``

PIP requirements
2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-yandex/index.rst
@@ -70,7 +70,7 @@ are in ``airflow.providers.yandex`` python package.
Installation
------------

You can install this package on top of an existing airflow 2.1+ installation via
You can install this package on top of an existing Airflow 2.1+ installation via
``pip install apache-airflow-providers-yandex``

PIP requirements
4 changes: 2 additions & 2 deletions docs/apache-airflow/concepts/smart-sensors.rst
@@ -55,7 +55,7 @@ store poke context at sensor_instance table and then exits with a ‘sensing’

When the smart sensor mode is enabled, a special set of builtin smart sensor DAGs
(named smart_sensor_group_shard_xxx) is created by the system; These DAGs contain ``SmartSensorOperator``
task and manage the smart sensor jobs for the airflow cluster. The SmartSensorOperator task can fetch
task and manage the smart sensor jobs for the Airflow cluster. The SmartSensorOperator task can fetch
hundreds of ‘sensing’ instances from sensor_instance table and poke on behalf of them in batches.
Users don’t need to change their existing DAGs.

@@ -79,7 +79,7 @@ Add the following settings in the ``airflow.cfg``:
* ``use_smart_sensor``: This config indicates if the smart sensor is enabled.
* ``shards``: This config indicates the number of concurrently running smart sensor jobs for
the airflow cluster.
the Airflow cluster.
* ``sensors_enabled``: This config is a list of sensor class names that will use the smart sensor.
The users use the same class names (e.g. HivePartitionSensor) in their DAGs and they don’t have
the control to use smart sensors or not, unless they exclude their tasks explicitly.
2 changes: 1 addition & 1 deletion docs/apache-airflow/deprecated-rest-api-ref.rst
@@ -38,7 +38,7 @@ Endpoints
.. http:post:: /api/experimental/dags/<DAG_ID>/dag_runs
Creates a dag_run for a given dag id.
Note: If execution_date is not specified in the body, airflow by default creates only one DAG per second for a given DAG_ID.
Note: If execution_date is not specified in the body, Airflow by default creates only one DAG per second for a given DAG_ID.
In order to create multiple DagRun within one second, you should set parameter ``"replace_microseconds"`` to ``"false"`` (boolean as string).

The execution_date must be specified with the format ``YYYY-mm-DDTHH:MM:SS.ssssss``.
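
A rough sketch of calling this endpoint with ``requests``, assuming a webserver on ``localhost:8080``, a DAG id of ``my_dag``, and whatever authentication your deployment adds on top:

.. code-block:: python

    import requests

    # Body fields follow the description above: an explicit execution_date in the
    # required format, plus replace_microseconds passed as a string boolean.
    response = requests.post(
        "http://localhost:8080/api/experimental/dags/my_dag/dag_runs",
        json={
            "execution_date": "2021-11-30T00:00:00.000000",
            "replace_microseconds": "false",
        },
    )
    response.raise_for_status()
    print(response.json())
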
2 changes: 1 addition & 1 deletion docs/apache-airflow/executor/kubernetes.rst
@@ -93,7 +93,7 @@ With these requirements in mind, here are some examples of basic ``pod_template_

.. note::

The examples below should work when using default airflow configuration values. However, many custom
The examples below should work when using default Airflow configuration values. However, many custom
configuration values need to be explicitly passed to the pod via this template too. This includes,
but is not limited to, sql configuration, required Airflow connections, dag folder path and
logging settings. See :doc:`../configurations-ref` for details.
2 changes: 1 addition & 1 deletion docs/apache-airflow/extra-packages-ref.rst
@@ -46,7 +46,7 @@ python dependencies for the provided package.
+---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
| cgroups | ``pip install 'apache-airflow[cgroups]'`` | Needed To use CgroupTaskRunner |
+---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
| cncf.kubernetes | ``pip install 'apache-airflow[cncf.kubernetes]'`` | Kubernetes Executor (also installs the kubernetes provider package) |
| cncf.kubernetes | ``pip install 'apache-airflow[cncf.kubernetes]'`` | Kubernetes Executor (also installs the Kubernetes provider package) |
+---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
| dask | ``pip install 'apache-airflow[dask]'`` | DaskExecutor |
+---------------------+-----------------------------------------------------+----------------------------------------------------------------------------+
4 changes: 2 additions & 2 deletions docs/apache-airflow/howto/connection.rst
@@ -204,10 +204,10 @@ If storing the environment variable in something like ``~/.bashrc``, add as foll
export AIRFLOW_CONN_MY_PROD_DATABASE='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
Using docker .env
Using Docker .env
^^^^^^^^^^^^^^^^^

If using with a docker ``.env`` file, you may need to remove the single quotes.
If using with a Docker ``.env`` file, you may need to remove the single quotes.

.. code-block::
2 changes: 1 addition & 1 deletion docs/apache-airflow/howto/custom-operator.rst
@@ -32,7 +32,7 @@ There are two methods that you need to override in a derived class:
You can specify the ``default_args`` in the dag file. See :ref:`Default args <concepts:default-arguments>` for more details.

* Execute - The code to execute when the runner calls the operator. The method contains the
airflow context as a parameter that can be used to read config values.
Airflow context as a parameter that can be used to read config values.
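
A minimal sketch of such a derived class covering both overrides; the operator name and logged message are placeholders:

.. code-block:: python

    from airflow.models.baseoperator import BaseOperator


    class HelloOperator(BaseOperator):
        def __init__(self, name: str, **kwargs) -> None:
            super().__init__(**kwargs)
            self.name = name

        def execute(self, context):
            # ``context`` is the Airflow context mentioned above; it exposes
            # runtime values such as the execution date string ``ds``.
            message = f"Hello {self.name}, run date {context['ds']}"
            self.log.info(message)
            return message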

.. note::

2 changes: 1 addition & 1 deletion docs/apache-airflow/howto/define_extra_link.rst
@@ -61,7 +61,7 @@ The following code shows how to add extra links to an operator via Plugins:
.. note:: Operator Extra Links should be registered via Airflow Plugins or custom Airflow Provider to work.

You can also add a global operator extra link that will be available to
all the operators through an airflow plugin or through airflow providers. You can learn more about it in the
all the operators through an Airflow plugin or through Airflow providers. You can learn more about it in the
:ref:`plugin example <plugin-example>` and in :doc:`apache-airflow-providers:index`.
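
A rough sketch of the plugin route for a global extra link, assuming Airflow 2.x plugin classes (the exact ``get_link`` signature may differ between versions):

.. code-block:: python

    from airflow.models.baseoperator import BaseOperatorLink
    from airflow.plugins_manager import AirflowPlugin


    class GoogleLink(BaseOperatorLink):
        name = "Google"

        def get_link(self, operator, dttm):
            # The returned URL appears as an extra button on the task view.
            return "https://www.google.com"


    class GlobalLinkPlugin(AirflowPlugin):
        name = "global_link_plugin"
        # Registering the link here makes it global, i.e. shown for all operators.
        global_operator_extra_links = [GoogleLink()]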

You can see all the extra links available via community-managed providers in
2 changes: 1 addition & 1 deletion docs/apache-airflow/installation/index.rst
@@ -162,7 +162,7 @@ and official constraint files- same that are used for installing Airflow from Py

* Users who are familiar with Containers and Docker stack and understand how to build their own container images.
* Users who understand how to install providers and dependencies from PyPI with constraints if they want to extend or customize the image.
* Users who know how to create deployments using Docker by linking together multiple docker containers and maintaining such deployments.
* Users who know how to create deployments using Docker by linking together multiple Docker containers and maintaining such deployments.

**What are you expected to handle**

4 changes: 2 additions & 2 deletions docs/apache-airflow/installation/installing-from-pypi.rst
@@ -89,8 +89,8 @@ In order to simplify the installation, we have prepared examples of how to upgra
Installing Airflow with extras and providers
============================================

If you need to install extra dependencies of airflow, you can use the script below to make an installation
a one-liner (the example below installs postgres and google provider, as well as ``async`` extra.
If you need to install extra dependencies of Airflow, you can use the script below to make an installation
a one-liner (the example below installs Postgres and Google providers, as well as ``async`` extra).

.. code-block:: bash
:substitutions:
2 changes: 1 addition & 1 deletion docs/apache-airflow/logging-monitoring/logging-tasks.rst
@@ -120,7 +120,7 @@ Some external systems require specific configuration in Airflow for redirection
Serving logs from workers
-------------------------

Most task handlers send logs upon completion of a task. In order to view logs in real time, airflow automatically starts an http server to serve the logs in the following cases:
Most task handlers send logs upon completion of a task. In order to view logs in real time, Airflow automatically starts an http server to serve the logs in the following cases:

- If ``SchedulerExecutor`` or ``LocalExecutor`` is used, then when ``airflow scheduler`` is running.
- If ``CeleryExecutor`` is used, then when ``airflow worker`` is running.
2 changes: 1 addition & 1 deletion docs/apache-airflow/plugins.rst
@@ -300,7 +300,7 @@ Plugins as Python packages
--------------------------

It is possible to load plugins via `setuptools entrypoint <https://packaging.python.org/guides/creating-and-discovering-plugins/#using-package-metadata>`_ mechanism. To do this link
your plugin using an entrypoint in your package. If the package is installed, airflow
your plugin using an entrypoint in your package. If the package is installed, Airflow
will automatically load the registered plugins from the entrypoint list.
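
A small sketch of that packaging step, assuming a hypothetical package ``my_package`` that exposes an ``AirflowPlugin`` subclass named ``MyAirflowPlugin``:

.. code-block:: python

    # setup.py of the package that ships the plugin
    from setuptools import setup

    setup(
        name="my-airflow-plugin",
        version="0.1.0",
        packages=["my_package"],
        entry_points={
            # Airflow scans the ``airflow.plugins`` entrypoint group on start-up.
            "airflow.plugins": [
                "my_plugin = my_package.my_plugin:MyAirflowPlugin",
            ]
        },
    )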

.. note::
2 changes: 1 addition & 1 deletion docs/apache-airflow/production-deployment.rst
@@ -141,7 +141,7 @@ is capable of retrieving the authentication token.

The best practice to implement proper security mechanism in this case is to make sure that worker
workloads have no access to the Keytab but only have access to the periodically refreshed, temporary
authentication tokens. This can be achieved in docker environment by running the ``airflow kerberos``
authentication tokens. This can be achieved in Docker environment by running the ``airflow kerberos``
command and the worker command in separate containers - where only the ``airflow kerberos`` token has
access to the Keytab file (preferably configured as secret resource). Those two containers should share
a volume where the temporary token should be written by the ``airflow kerberos`` and read by the workers.
2 changes: 1 addition & 1 deletion docs/apache-airflow/start/docker.rst
@@ -324,7 +324,7 @@ runtime user id which is unknown at the time of building the image.
functionality - only added confusion - so it has been removed.


Those additional variables are useful in case you are trying out/testing Airflow installation via docker compose.
Those additional variables are useful in case you are trying out/testing Airflow installation via Docker Compose.
They are not intended to be used in production, but they make the environment faster to bootstrap for first time
users with the most common customizations.

6 changes: 3 additions & 3 deletions docs/apache-airflow/tutorial.rst
@@ -374,11 +374,11 @@ Lets look at another example; we need to get some data from a file which is host

Initial setup
''''''''''''''''''''
We need to have docker and postgres installed.
We need to have Docker and Postgres installed.
We will be using this `docker file <https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#docker-compose-yaml>`_
Follow the instructions properly to set up Airflow.

Create a Employee table in postgres using this:
Create a Employee table in Postgres using this:

.. code-block:: sql
@@ -400,7 +400,7 @@ Create a Employee table in Postgres using this:
"Leave" INTEGER
);
We also need to add a connection to postgres. Go to the UI and click "Admin" >> "Connections". Specify the following for each field:
We also need to add a connection to Postgres. Go to the UI and click "Admin" >> "Connections". Specify the following for each field:

- Conn id: LOCAL
- Conn Type: postgres
4 changes: 2 additions & 2 deletions docs/docker-stack/build-arg-ref.rst
@@ -211,7 +211,7 @@ You can see some examples of those in:
| ``AIRFLOW_CONSTRAINTS_LOCATION`` | | If not empty, it will override the |
| | | source of the constraints with the |
| | | specified URL or file. Note that the |
| | | file has to be in docker context so |
| | | file has to be in Docker context so |
| | | it's best to place such file in |
| | | one of the folders included in |
| | | ``.dockerignore`` file. |
@@ -246,7 +246,7 @@ When image is build from PIP, by default pre-caching of PIP dependencies is used
builds during development. When pre-cached PIP dependencies are used and ``setup.py`` or ``setup.cfg`` changes, the
PIP dependencies are already pre-installed, thus resulting in much faster image rebuild. This is purely an optimization
of time needed to build the images and should be disabled if you want to install Airflow from
docker context files.
Docker context files.

+------------------------------------------+------------------------------------------+------------------------------------------+
| Build argument | Default value | Description |