DOCS: fixed styling & spacing #5572

Merged: 1 commit, Nov 9, 2023
10 changes: 5 additions & 5 deletions docs/src/further_topics/dask_best_practices/index.rst
@@ -144,8 +144,8 @@ Iris provides a basic chunking shape to Dask, attempting to set the shape for
best performance. The chunking that is used can depend on the file format that
is being loaded. See below for how chunking is performed for:

* :ref:`chunking_netcdf`
* :ref:`chunking_pp_ff`

It can in some cases be beneficial to re-chunk the arrays in Iris cubes.
For information on how to do this, see :ref:`dask_rechunking`.
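As a rough illustration of the kind of decision chunking involves, the sketch below picks a chunk length along one dimension so that each chunk stays under a target element count. This is a hypothetical helper for illustration only, not the algorithm Iris itself uses:

```python
import math

def pick_chunk_len(dim_len, n_other_elems, target_elems=1_000_000):
    """Choose a chunk length along one dimension so that each chunk holds
    at most ``target_elems`` elements (hypothetical helper, not Iris's
    actual chunking logic)."""
    max_len = max(1, target_elems // max(1, n_other_elems))
    if dim_len <= max_len:
        # The whole dimension fits in one chunk.
        return dim_len
    # Split the dimension into roughly equal chunks no longer than max_len.
    n_chunks = math.ceil(dim_len / max_len)
    return math.ceil(dim_len / n_chunks)

# e.g. a (500, 1000, 1000) array chunked along the first dimension:
print(pick_chunk_len(500, 1000 * 1000))  # 1 -- each chunk is one 1000x1000 slice
```

Real chunking must also respect the on-disk storage layout (e.g. NetCDF chunking), which is why Iris's choice depends on the file format being loaded.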
@@ -208,9 +208,9 @@ If you feel you have an example of a Dask best practice that you think may be helpful to others,
please share it with us by raising a new `discussion on the Iris repository
<https://github.com/SciTools/iris/discussions/>`_.

* :doc:`dask_pp_to_netcdf`
* :doc:`dask_parallel_loop`
* :doc:`dask_bags_and_greed`

.. toctree::
:hidden:
28 changes: 14 additions & 14 deletions docs/src/further_topics/ugrid/data_model.rst
@@ -484,20 +484,20 @@ How UGRID information is stored
| Described in detail in `MeshCoords`_.
| Stores the following information:

* | :attr:`~iris.experimental.ugrid.MeshCoord.mesh`
  | The :class:`~iris.experimental.ugrid.Mesh` associated with this
    :class:`~iris.experimental.ugrid.MeshCoord`. This determines the
    :attr:`~iris.cube.Cube.mesh` attribute of any :class:`~iris.cube.Cube`
    this :class:`~iris.experimental.ugrid.MeshCoord` is attached to (see
    `The Basics`_)

* | :attr:`~iris.experimental.ugrid.MeshCoord.location`
  | ``node``/``edge``/``face`` - the element detailed by this
    :class:`~iris.experimental.ugrid.MeshCoord`. This determines the
    :attr:`~iris.cube.Cube.location` attribute of any
    :class:`~iris.cube.Cube` this
    :class:`~iris.experimental.ugrid.MeshCoord` is attached to (see
    `The Basics`_).

.. _ugrid MeshCoords:

42 changes: 20 additions & 22 deletions docs/src/techpapers/um_files_loading.rst
@@ -125,21 +125,21 @@ with latitude and longitude axes are also supported).
For an ordinary latitude-longitude grid, the cubes have coordinates called
'longitude' and 'latitude':

* These are mapped to the appropriate data dimensions.
* They have units of 'degrees'.
* They have a coordinate system of type :class:`iris.coord_systems.GeogCS`.
* The coordinate points are normally set to the regular sequence
  ``ZDX/Y + BDX/Y * (1 .. LBNPT/LBROW)`` (*except*, if BDX/BDY is zero, the
  values are taken from the extra data vector X/Y, if present).
* If X/Y_LOWER_BOUNDS extra data is available, this appears as bounds values
  of the horizontal coordinates.
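The regular point sequence described above can be evaluated directly; the header values below (ZDX, BDX, LBNPT are PP/FF header fields) are made up for illustration:

```python
import numpy as np

# Hypothetical header values: start offset, grid spacing, point count.
ZDX, BDX, LBNPT = 0.0, 0.5, 5

# points = ZDX + BDX * (1 .. LBNPT), as in the rule above.
points = ZDX + BDX * np.arange(1, LBNPT + 1)
print(points)  # five points: 0.5, 1.0, 1.5, 2.0, 2.5
```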

For **rotated** latitude-longitude coordinates (as for LBCODE=101), the
horizontal coordinates differ only slightly --

* The names are 'grid_latitude' and 'grid_longitude'.
* The coord_system is a :class:`iris.coord_systems.RotatedGeogCS`, created
  with a pole defined by BPLAT, BPLON.

For example:
>>> # Load a PP field.
@@ -304,10 +304,9 @@ For hybrid height levels (LBVC=65):
multidimensional or non-monotonic.

See an example printout of a hybrid height cube,
:ref:`here <hybrid_cube_printout>`. Notice that this contains all of the
above coordinates -- ``model_level_number``, ``sigma``, ``level_height`` and
the derived ``altitude``.

.. note::

@@ -364,7 +363,7 @@ Data at a single measurement timepoint (LBTIM.IB=0):
defined according to LBTIM.IC.

Values forecast from T2, valid at T1 (LBTIM.IB=1):
Coordinates ``time`` and ``forecast_reference_time`` are created from the T1
and T2 values, respectively. These have no bounds, and units of
'hours since 1970-01-01 00:00:00', with the appropriate calendar.
A ``forecast_period`` coordinate is also created, with values T1-T2, no
@@ -383,12 +382,11 @@ these may become dimensions of the resulting data cube. This will depend on
the values actually present in the source fields for each of the elements.

See an example printout of a forecast data cube,
:ref:`here <cube-statistics_forecast_printout>`. Notice that this example
contains all of the above coordinates -- ``time``, ``forecast_period`` and
``forecast_reference_time``. In this case the data are forecasts, so ``time``
is a dimension, ``forecast_period`` varies with time and
``forecast_reference_time`` is a constant.


Statistical Measures
8 changes: 4 additions & 4 deletions docs/src/userguide/navigating_a_cube.rst
@@ -191,10 +191,10 @@ Adding and Removing Metadata to the Cube at Load Time
Sometimes when loading a cube problems occur when the amount of metadata is more or less than expected.
This is often caused by one of the following:

* The file does not contain enough metadata, and therefore the cube cannot know everything about the file.
* Some of the metadata of the file is contained in the filename, but is not part of the actual file.
* There is not enough metadata loaded from the original file as Iris has not handled the format fully. *(in which case,
  please let us know about it)*

To solve this, all of :func:`iris.load`, :func:`iris.load_cube`, and :func:`iris.load_cubes` support a callback keyword.
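A minimal sketch of such a callback is shown below. Iris calls it as ``callback(cube, field, filename)`` for every cube loaded, and modifying the cube in place is the usual pattern; the "experiment id before the first underscore" naming rule here is a hypothetical example:

```python
import os

def filename_callback(cube, field, filename):
    """Attach an id parsed from the filename to each cube as it loads.

    (The experiment-id naming convention here is hypothetical.)
    """
    experiment_id = os.path.basename(filename).split("_")[0]
    cube.attributes["experiment_id"] = experiment_id

# Usage sketch (requires Iris and matching files):
#   cubes = iris.load("exp42_*.pp", callback=filename_callback)

# Demonstration with a minimal stand-in object:
class _FakeCube:
    def __init__(self):
        self.attributes = {}

cube = _FakeCube()
filename_callback(cube, None, "/data/exp42_air_temp.pp")
print(cube.attributes["experiment_id"])  # exp42
```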

23 changes: 12 additions & 11 deletions docs/src/userguide/real_and_lazy_data.rst
@@ -247,20 +247,21 @@ output file, to be performed by `Dask <https://docs.dask.org/en/stable/>`_ later,
thus enabling parallel save operations.

This works in the following way:
1. an :func:`iris.save` call is made, with a NetCDF file output and the additional
   keyword ``compute=False``.
   This is currently *only* available when saving to NetCDF, so it is documented in
   the Iris NetCDF file format API. See: :func:`iris.fileformats.netcdf.save`.

2. the call creates the output file, but does not fill in variables' data, where
   the data is a lazy array in the Iris object. Instead, these variables are
   initially created "empty".

3. the :func:`iris.save` call returns a ``result`` which is a
   :class:`~dask.delayed.Delayed` object.

4. the save can be completed later by calling ``result.compute()``, or by passing it
   to the :func:`dask.compute` call.
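The four steps above can be sketched with plain Dask. The delayed function and names below are stand-ins invented for illustration, but an :func:`iris.save` call with ``compute=False`` returns a :class:`~dask.delayed.Delayed` that is completed in exactly the same two ways:

```python
import dask

@dask.delayed
def fill_variable(name):
    # Stand-in for the deferred data-writing step of the NetCDF save.
    return f"wrote {name}"

# Analogous to: result = iris.save(cube, "out.nc", compute=False)
result = fill_variable("air_temperature")   # nothing has been written yet

# Step 4: complete the "save" later -- either form triggers the work.
outcome = result.compute()
# or: (outcome,) = dask.compute(result)
print(outcome)  # wrote air_temperature
```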

The benefit of this is that costly data transfer operations can be performed in
parallel with writes to other data files. Also, where array contents are calculated