diff --git a/.gitignore b/.gitignore index 4a65981ad5..f0420cbc22 100644 --- a/.gitignore +++ b/.gitignore @@ -1,5 +1,8 @@ *.py[co] +# Environment file which should be autogenerated +*conda_requirements.txt* + # Packages *.egg? *.egg-info diff --git a/.travis.yml b/.travis.yml index b92ecdbfb2..ac5d341307 100644 --- a/.travis.yml +++ b/.travis.yml @@ -31,6 +31,15 @@ install: export IRIS_TEST_DATA_REF="2f3a6bcf25f81bd152b3d66223394074c9069a96"; export IRIS_TEST_DATA_SUFFIX=$(echo "${IRIS_TEST_DATA_REF}" | sed "s/^v//"); + # Cut short doctest phase under Python 2 : now only supports Python 3 + # SEE : https://github.com/SciTools/iris/pull/3134 + # ------------ + - > + if [[ $TEST_TARGET == 'doctest' && ${TRAVIS_PYTHON_VERSION} != 3* ]]; then + echo "DOCTEST phase only valid in Python 3 : ABORTING during 'install'." + exit 0 + fi + # Install miniconda # ----------------- - > diff --git a/INSTALL b/INSTALL index 3ae4855e25..c8a61769fe 100644 --- a/INSTALL +++ b/INSTALL @@ -32,8 +32,8 @@ Installing from source The latest Iris source release is available from https://github.com/SciTools/iris. -Iris makes use of a range of other libraries and python modules. These -dependencies must be in place before you can successfully install +Iris makes use of a range of other libraries and python modules. These +dependencies must be in place before you can successfully install Iris. Once you have satisfied the requirements detailed in the ``requirements`` directory, go to the root of Iris' and run:: @@ -42,8 +42,8 @@ Iris. Once you have satisfied the requirements detailed in the In-place build - an alternative for developers ============================================== -We are very keen to encourage contributions to Iris. For this type of -development activity an in-place build can be useful. Once you've cloned +We are very keen to encourage contributions to Iris. For this type of +development activity an in-place build can be useful. 
Once you've cloned the Iris git repository you can perform an in-place build with:: pip install -e . @@ -66,6 +66,22 @@ Alternatively, a full requirements file that includes all optional dependencies python requirements/gen_conda_requirements.py --groups all > conda_requirements.txt +Running the tests +''''''''''''''''' + +In order to run the tests, you will need to use the ``test`` and ``docs`` groups (we include the ``docs`` group so that you can run the pull request tests locally). +Hence the commands change to:: + + python requirements/gen_conda_requirements.py --groups test docs > conda_requirements.txt + conda create -n my_iris_env --file conda_requirements.txt + conda activate my_iris_env # or whatever other name you gave it + pip install -e . + +The tests can then be run with:: + + python setup.py test + + Custom site configuration ========================= The default site configuration values can be overridden by creating the file diff --git a/README.md b/README.md index d7b4900c2c..a6e9dd525b 100644 --- a/README.md +++ b/README.md @@ -44,6 +44,7 @@ + [Documentation](#documentation) + [Installation](#installation) + [Copyright and licence](#copyright-and-licence) ++ [Contributing](#contributing) [](TOC) @@ -98,5 +99,7 @@ are available in [INSTALL](INSTALL). Iris may be freely distributed, modified and used commercially under the terms of its [GNU LGPLv3 license](COPYING.LESSER). +# Contributing +Information on how to contribute can be found in the [Iris developer guide](https://scitools.org.uk/iris/docs/latest/developers_guide/index.html). (C) British Crown Copyright 2010 - 2018, Met Office diff --git a/docs/iris/src/_templates/index.html b/docs/iris/src/_templates/index.html index 0c4b5d958f..31acded447 100644 --- a/docs/iris/src/_templates/index.html +++ b/docs/iris/src/_templates/index.html @@ -134,7 +134,7 @@ extra information on specific technical issues

diff --git a/docs/iris/src/_templates/layout.html b/docs/iris/src/_templates/layout.html index 53ae5105cb..8ecc35bade 100644 --- a/docs/iris/src/_templates/layout.html +++ b/docs/iris/src/_templates/layout.html @@ -37,7 +37,7 @@

    - Iris v2.1 + Iris v2.2

A powerful, format-agnostic, community-driven Python library for analysing and diff --git a/docs/iris/src/conf.py b/docs/iris/src/conf.py index 2833a01674..6cdfe634c4 100644 --- a/docs/iris/src/conf.py +++ b/docs/iris/src/conf.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2010 - 2017, Met Office +# (C) British Crown Copyright 2010 - 2018, Met Office # # This file is part of Iris. # diff --git a/docs/iris/src/developers_guide/tests.rst b/docs/iris/src/developers_guide/tests.rst index d2ac826f22..929073b569 100644 --- a/docs/iris/src/developers_guide/tests.rst +++ b/docs/iris/src/developers_guide/tests.rst @@ -3,6 +3,9 @@ Testing ******* +The Iris tests may be run with ``python setup.py test``, which provides its +own command line options. + There are three categories of tests within Iris: - Unit tests - Integration tests diff --git a/docs/iris/src/userguide/interpolation_and_regridding.rst b/docs/iris/src/userguide/interpolation_and_regridding.rst index 4f2b06de44..e3cd622541 100644 --- a/docs/iris/src/userguide/interpolation_and_regridding.rst +++ b/docs/iris/src/userguide/interpolation_and_regridding.rst @@ -5,6 +5,8 @@ import numpy as np import iris + import warnings + warnings.simplefilter('ignore') ================================= Cube interpolation and regridding ================================= @@ -137,10 +139,9 @@ This cube has a "hybrid-height" vertical coordinate system, meaning that the ver coordinate is unevenly spaced in altitude: >>> print(column.coord('altitude').points) - [418.6983642578125 434.57049560546875 456.79278564453125 485.3664855957031 - 520.2932739257812 561.5751953125 609.2144775390625 663.214111328125 - 723.5769653320312 790.306640625 863.4072265625 942.88232421875 - 1028.737060546875 1120.9764404296875 1219.6051025390625] + [ 418.69836 434.5705 456.7928 485.3665 520.2933 561.5752 + 609.2145 663.2141 723.57697 790.30664 863.4072 942.8823 + 1028.737 1120.9764 1219.6051 ] We could regularise the vertical coordinate by defining 10 equally spaced altitude sample
points between 400 and 1250 and interpolating our vertical coordinate onto @@ -184,9 +185,8 @@ For example, to mask values that lie beyond the range of the original data: >>> scheme = iris.analysis.Linear(extrapolation_mode='mask') >>> new_column = column.interpolate(sample_points, scheme) >>> print(new_column.coord('altitude').points) - [nan 494.44451904296875 588.888916015625 683.333251953125 777.77783203125 - 872.2222290039062 966.666748046875 1061.111083984375 1155.555419921875 - nan] + [ nan 494.44452 588.8889 683.33325 777.77783 872.2222 + 966.66675 1061.1111 1155.5554 nan] .. _caching_an_interpolator: diff --git a/docs/iris/src/whatsnew/2.1.rst b/docs/iris/src/whatsnew/2.1.rst index f4e2f587e2..00f7115431 100644 --- a/docs/iris/src/whatsnew/2.1.rst +++ b/docs/iris/src/whatsnew/2.1.rst @@ -78,30 +78,6 @@ Incompatible Changes * The deprecated :mod:`iris.experimental.um` was removed. Please consider using `mule `_ as an alternative. -* Stash-to-name mapping changes in :mod:`ib/iris/fileformats/um_cf_map.py`: - * m01s00i121: tendency_of_atmosphere_mass_content_of_sulfur_dioxide_expressed_as_sulfur_due_to_volcanic_emission -> "3D NATURAL SO2 EMISSIONS" - * m01s02i207: surface_downwelling_longwave_flux -> surface_downwelling_longwave_flux_in_air - * m01s02i208: surface_downwelling_longwave_flux_assuming_clear_sky -> surface_downwelling_longwave_flux_in_air_assuming_clear_sky - * m01s04i203: stratiform_rainfall_rate -> stratiform_rainfall_flux - * m01s04i204: stratiform_snowfall_rate -> stratiform_snowfall_flux - * m01s05i181: tendency_of_air_temperature_due_to_convection -> change_over_time_in_air_temperature_due_to_convection - * m01s05i182: tendency_of_specific_humidity_due_to_convection -> change_over_time_in_specific_humidity_due_to_convection - * m01s05i185: tendency_of_eastward_wind_due_to_convection -> change_over_time_in_x_wind_due_to_convection - * m01s05i186: tendency_of_northward_wind_due_to_convection -> change_over_time_in_y_wind_due_to_convection
- * m01s05i205: convective_rainfall_rate -> convective_rainfall_flux - * m01s05i233: mass_fraction_of_convective_cloud_liquid_water_in_air -> undilute_cape - * m01s12i181': tendency_of_air_temperature_due_to_advection -> change_over_time_in_air_temperature_due_to_advection - * m01s12i182': tendency_of_specific_humidity_due_to_advection -> change_over_time_in_specific_humidity_due_to_advection - * m01s12i183': tendency_of_mass_fraction_of_cloud_liquid_water_in_air_due_to_advection -> change_over_time_in_mass_fraction_of_cloud_liquid_water_in_air_due_to_advection - * m01s12i184': tendency_of_mass_fraction_of_cloud_ice_in_air_due_to_advection -> change_over_time_in_mass_fraction_of_cloud_ice_in_air_due_to_advection - * m01s12i185': tendency_of_eastward_wind_due_to_advection -> change_over_time_in_x_wind_due_to_advection - * m01s12i186': tendency_of_northward_wind_due_to_advection -> change_over_time_in_y_wind_due_to_advection - * m01s30i218: product_of_eastward_wind_and_omega -> product_of_x_wind_and_omega - * m01s30i228: product_of_northward_wind_and_omega -> product_of_y_wind_and_omega - * m01s34i101: number_of_soluble_nucleation_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_soluble_nucleation_mode_aerosol_in_air - * m01s34i103: number_of_soluble_Aitken_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_soluble_aitken_mode_aerosol_in_air - * m01s34i107: number_of_soluble_accumulation_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_soluble_accumulation_mode_aerosol_in_air - * m01s34i113: number_of_soluble_coarse_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_soluble_coarse_mode_aerosol_in_air - * m01s34i119: number_of_insoluble_Aitken_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_insoluble_aitken_mode_aerosol_in_air - * m01s34i122: number_of_insoluble_accumulation_mode_particles_per_molecule_of_air -> 
number_of_particles_per_air_molecule_of_insoluble_accumulation_mode_aerosol_in_air - * m01s34i124: number_of_insoluble_coarse_mode_particles_per_molecule_of_air -> number_of_particles_per_air_molecule_of_insoluble_coarse_mode_aerosol_in_air +* This release of Iris contains a number of updated metadata translations. + See `this changelist <https://github.com/SciTools/iris/commit/69597eb3d8501ff16ee3d56aef1f7b8f1c2bb316#diff-1680206bdc5cfaa83e14428f5ba0f848>`_ + for further information. diff --git a/docs/iris/src/whatsnew/2.2.rst b/docs/iris/src/whatsnew/2.2.rst new file mode 100644 index 0000000000..7b069f455e --- /dev/null +++ b/docs/iris/src/whatsnew/2.2.rst @@ -0,0 +1,35 @@ +What's New in Iris 2.2.0 +************************ + +:Release: 2.2.0a0 +:Date: + + +This document explains the new/changed features of Iris in the alpha release +of version 2.2.0 +(:doc:`View all changes `). + + +Iris 2.2.0 Features +=================== +.. _showcase: + +.. admonition:: 2-Dimensional Coordinate Plotting + + The iris plot functions :func:`~iris.plot.pcolor` and + :func:`~iris.plot.pcolormesh` now accommodate the plotting of 2-dimensional + coordinates as well as 1-dimensional coordinates. + + To enable this feature, each coordinate passed in for plotting will be + automatically checked for contiguity. Coordinate bounds must either be + contiguous, or the cube's data must be masked at the discontiguities in + order to avoid plotting errors. + + The iris plot function :func:`iris.plot.quiver` has been added, and this + also works with 2-dimensional plot coordinates. + +.. admonition:: 2-Dimensional Grid Vectors + + The iris functions :func:`iris.analysis.cartography.gridcell_angles` and + :func:`iris.analysis.cartography.rotate_grid_vectors` have been added, + allowing you to convert gridcell-oriented vectors to true-North/East ones.
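As an aside to the release notes above, the gridcell-angle idea behind :func:`iris.analysis.cartography.gridcell_angles` can be illustrated with a small numpy-only sketch. ``approx_gridcell_angles`` here is a hypothetical, flat-earth simplification for illustration only — the Iris implementation works with 3-D unit vectors on the sphere and also supports cell bounds.

```python
import numpy as np

def approx_gridcell_angles(lons, lats):
    """Estimate the angle (degrees) of the grid-x direction from true East,
    using centred differences along the second (grid-x) dimension.
    A planar approximation - NOT the Iris spherical implementation."""
    dlon = np.roll(lons, -1, axis=1) - np.roll(lons, 1, axis=1)
    dlat = np.roll(lats, -1, axis=1) - np.roll(lats, 1, axis=1)
    # Scale longitude steps by cos(lat) to approximate true-East distances.
    return np.rad2deg(np.arctan2(dlat, dlon * np.cos(np.deg2rad(lats))))

# A small grid rotated 30 degrees from the true East/North axes,
# centred near the equator so the planar approximation is accurate.
theta = np.deg2rad(30.0)
i, j = np.meshgrid(np.arange(5), np.arange(5), indexing='ij')
lons = 0.1 * (j * np.cos(theta) - i * np.sin(theta))
lats = 0.1 * (j * np.sin(theta) + i * np.cos(theta))

angles = approx_gridcell_angles(lons, lats)
# Interior cells recover the 30 degree grid rotation (edge cells are
# polluted by the np.roll wraparound, as in the points-only Iris case).
print(round(float(angles[2, 2]), 2))  # ≈ 30.0
```

Centred differences are used so each cell's angle is based on its two grid-x neighbours, which is the same "previous + subsequent points along the grid-x axis" reference choice described in `_grid_angles.py` below.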
diff --git a/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Aug-20_print-long-time-interval-bounds.txt b/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Aug-20_print-long-time-interval-bounds.txt new file mode 100644 index 0000000000..3651598b72 --- /dev/null +++ b/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Aug-20_print-long-time-interval-bounds.txt @@ -0,0 +1,2 @@ +* Fixed a bug that prevented printing time coordinates with bounds when the time + coordinate was measured on a long interval (that is, ``months`` or ``years``). \ No newline at end of file diff --git a/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Sep-19_gracefully-filling-warning-only-when-masked.txt b/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Sep-19_gracefully-filling-warning-only-when-masked.txt new file mode 100644 index 0000000000..c720e6f07a --- /dev/null +++ b/docs/iris/src/whatsnew/contributions_2.2.0/bugfix_2018-Sep-19_gracefully-filling-warning-only-when-masked.txt @@ -0,0 +1 @@ +* "Gracefully filling..." warnings are only issued when the co-ordinate or bound data is actually masked. diff --git a/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Aug-14_load-nc-chunks.txt b/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Aug-14_load-nc-chunks.txt new file mode 100644 index 0000000000..2a3780115d --- /dev/null +++ b/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Aug-14_load-nc-chunks.txt @@ -0,0 +1 @@ +* NetCDF data variable chunk sizes are utilised at load time for significant performance improvements. 
\ No newline at end of file diff --git a/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Oct-03_reverse_cube.txt b/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Oct-03_reverse_cube.txt new file mode 100644 index 0000000000..c7e3bef8a4 --- /dev/null +++ b/docs/iris/src/whatsnew/contributions_2.2.0/newfeature_2018-Oct-03_reverse_cube.txt @@ -0,0 +1 @@ +* :func:`iris.util.reverse` can now be used to reverse a cube by specifying one or more coordinates. diff --git a/docs/iris/src/whatsnew/index.rst b/docs/iris/src/whatsnew/index.rst index 104c5074ca..c3a34303f0 100644 --- a/docs/iris/src/whatsnew/index.rst +++ b/docs/iris/src/whatsnew/index.rst @@ -9,6 +9,7 @@ Iris versions. .. toctree:: :maxdepth: 2 + 2.2.rst 2.1.rst 2.0.rst 1.13.rst diff --git a/lib/iris/__init__.py b/lib/iris/__init__.py index 2e7eba5892..882b52aaea 100644 --- a/lib/iris/__init__.py +++ b/lib/iris/__init__.py @@ -121,7 +121,7 @@ def callback(cube, field, filename): # Iris revision. -__version__ = '2.2.0dev0' +__version__ = '2.2.0a0' # Restrict the names imported when using "from iris import *" __all__ = ['load', 'load_cube', 'load_cubes', 'load_raw', diff --git a/lib/iris/_lazy_data.py b/lib/iris/_lazy_data.py index 0613ba57ec..f5312b7d3a 100644 --- a/lib/iris/_lazy_data.py +++ b/lib/iris/_lazy_data.py @@ -23,6 +23,8 @@ from __future__ import (absolute_import, division, print_function) from six.moves import (filter, input, map, range, zip) # noqa +from functools import wraps + import dask import dask.array as da import dask.context @@ -31,6 +33,19 @@ import numpy.ma as ma +def non_lazy(func): + """ + Turn a lazy function into a function that returns a result immediately. + + """ + @wraps(func) + def inner(*args, **kwargs): + """Immediately return the results of a lazy function.""" + result = func(*args, **kwargs) + return dask.compute(result)[0] + return inner + + def is_lazy_data(data): """ Return whether the argument is an Iris 'lazy' data array. 
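The ``non_lazy`` helper added to ``_lazy_data.py`` above simply forces evaluation of a lazily-evaluated result. A minimal self-contained sketch of the same pattern, assuming dask and numpy are installed (``lazy_double`` is a made-up example function, not part of Iris):

```python
from functools import wraps

import dask
import dask.array as da
import numpy as np

def non_lazy(func):
    """Turn a lazy function into a function that returns a result immediately."""
    @wraps(func)
    def inner(*args, **kwargs):
        # Build the lazy graph, then force its evaluation straight away.
        result = func(*args, **kwargs)
        return dask.compute(result)[0]
    return inner

def lazy_double(x):
    # Returns an uncomputed dask array (a deferred computation).
    return da.from_array(x, chunks=2) * 2

eager_double = non_lazy(lazy_double)
print(eager_double(np.array([1, 2, 3])))  # a concrete numpy array, not a dask array
```

This is exactly how the diff below reuses one lazy implementation for both the lazy and the real (eager) aggregator, e.g. ``iris._lazy_data.non_lazy(_lazy_count)``.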
diff --git a/lib/iris/analysis/__init__.py b/lib/iris/analysis/__init__.py index 8cf1a5d2a7..ec4d341d25 100644 --- a/lib/iris/analysis/__init__.py +++ b/lib/iris/analysis/__init__.py @@ -65,7 +65,7 @@ from iris.analysis._regrid import RectilinearRegridder import iris.coords from iris.exceptions import LazyAggregatorError -import iris._lazy_data as iris_lazy_data +import iris._lazy_data __all__ = ('COUNT', 'GMEAN', 'HMEAN', 'MAX', 'MEAN', 'MEDIAN', 'MIN', 'PEAK', 'PERCENTILE', 'PROPORTION', 'RMS', 'STD_DEV', 'SUM', @@ -468,7 +468,7 @@ def lazy_aggregate(self, data, axis, **kwargs): # provided to __init__. kwargs = dict(list(self._kwargs.items()) + list(kwargs.items())) - return self.lazy_func(data, axis, **kwargs) + return self.lazy_func(data, axis=axis, **kwargs) def aggregate(self, data, axis, **kwargs): """ @@ -1015,6 +1015,40 @@ def post_process(self, collapsed_cube, data_result, coords, **kwargs): return result +def _build_dask_mdtol_function(dask_stats_function): + """ + Make a wrapped dask statistic function that supports the 'mdtol' keyword. + + 'dask_stats_function' must be a dask statistical function, compatible with + the call signature : "dask_stats_function(data, axis=axis, **kwargs)". + It must be masked-data tolerant, i.e. it ignores masked input points and + performs a calculation on only the unmasked points. + For example, mean([1, --, 2]) = (1 + 2) / 2 = 1.5. + + The returned value is a new function operating on dask arrays. + It has the call signature `stat(data, axis=-1, mdtol=None, **kwargs)`. + + """ + @wraps(dask_stats_function) + def inner_stat(array, axis=-1, mdtol=None, **kwargs): + # Call the statistic to get the basic result (missing-data tolerant). + dask_result = dask_stats_function(array, axis=axis, **kwargs) + if mdtol is None or mdtol >= 1.0: + result = dask_result + else: + # Build a lazy computation to compare the fraction of missing + # input points at each output point to the 'mdtol' threshold.
+ point_mask_counts = da.sum(da.ma.getmaskarray(array), axis=axis) + points_per_calc = array.size / dask_result.size + masked_point_fractions = point_mask_counts / points_per_calc + boolean_mask = masked_point_fractions > mdtol + # Return an mdtol-masked version of the basic result. + result = da.ma.masked_array(da.ma.getdata(dask_result), + boolean_mask) + return result + return inner_stat + + def _percentile(data, axis, percent, fast_percentile_method=False, **kwargs): """ @@ -1191,20 +1225,24 @@ def _weighted_percentile(data, axis, weights, percent, returned=False, return result -def _count(array, function, axis, **kwargs): - if not callable(function): - raise ValueError('function must be a callable. Got %s.' - % type(function)) - return ma.sum(function(array), axis=axis, **kwargs) +@_build_dask_mdtol_function +def _lazy_count(array, **kwargs): + array = iris._lazy_data.as_lazy_data(array) + func = kwargs.pop('function', None) + if not callable(func): + emsg = 'function must be a callable. Got {}.' + raise TypeError(emsg.format(type(func))) + return da.sum(func(array), **kwargs) def _proportion(array, function, axis, **kwargs): + count = iris._lazy_data.non_lazy(_lazy_count) # if the incoming array is masked use that to count the total number of # values if ma.isMaskedArray(array): # calculate the total number of non-masked values across the given axis - total_non_masked = _count(array.mask, np.logical_not, - axis=axis, **kwargs) + total_non_masked = count( + array.mask, axis=axis, function=np.logical_not, **kwargs) total_non_masked = ma.masked_equal(total_non_masked, 0) else: total_non_masked = array.shape[axis] @@ -1215,7 +1253,7 @@ def _proportion(array, function, axis, **kwargs): # a dtype for its data that is different to the dtype of the fill-value, # which can cause issues outside this function. 
# Reference - tests/unit/analysis/test_PROPORTION.py Test_masked.test_ma - numerator = _count(array, function, axis=axis, **kwargs) + numerator = count(array, axis=axis, function=function, **kwargs) result = ma.asarray(numerator / total_non_masked) return result @@ -1228,21 +1266,23 @@ def _rms(array, axis, **kwargs): return rval -def _sum(array, **kwargs): +@_build_dask_mdtol_function +def _lazy_sum(array, **kwargs): + array = iris._lazy_data.as_lazy_data(array) # weighted or scaled sum axis_in = kwargs.get('axis', None) weights_in = kwargs.pop('weights', None) returned_in = kwargs.pop('returned', False) if weights_in is not None: - wsum = ma.sum(weights_in * array, **kwargs) + wsum = da.sum(weights_in * array, **kwargs) else: - wsum = ma.sum(array, **kwargs) + wsum = da.sum(array, **kwargs) if returned_in: if weights_in is None: - weights = np.ones_like(array) + weights = iris._lazy_data.as_lazy_data(np.ones_like(array)) else: weights = weights_in - rvalue = (wsum, ma.sum(weights, axis=axis_in)) + rvalue = (wsum, da.sum(weights, axis=axis_in)) else: rvalue = wsum return rvalue @@ -1352,8 +1392,9 @@ def interp_order(length): # # Common partial Aggregation class constructors. # -COUNT = Aggregator('count', _count, - units_func=lambda units: 1) +COUNT = Aggregator('count', iris._lazy_data.non_lazy(_lazy_count), + units_func=lambda units: 1, + lazy_func=_lazy_count) """ An :class:`~iris.analysis.Aggregator` instance that counts the number of :class:`~iris.cube.Cube` data occurrences that satisfy a particular @@ -1419,56 +1460,6 @@ def interp_order(length): """ -MAX = Aggregator('maximum', ma.max) -""" -An :class:`~iris.analysis.Aggregator` instance that calculates -the maximum over a :class:`~iris.cube.Cube`, as computed by -:func:`numpy.ma.max`. - -**For example**: - -To compute zonal maximums over the *longitude* axis of a cube:: - - result = cube.collapsed('longitude', iris.analysis.MAX) - -This aggregator handles masked data.
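The mdtol behaviour wired into ``_build_dask_mdtol_function`` can be sketched eagerly with numpy masked arrays: an output point is masked when the fraction of masked input points contributing to it exceeds the tolerance. ``mean_with_mdtol`` is a hypothetical simplification for illustration — the real wrapper stays lazy and operates on dask arrays.

```python
import numpy as np
import numpy.ma as ma

def mean_with_mdtol(array, axis, mdtol=0.5):
    """Collapse a masked array with a missing-data tolerance.

    A simplified, eager analogue of the dask-based mdtol wrapper:
    the statistic itself ignores masked points, then the result is
    masked wherever too many of the inputs were missing."""
    result = ma.mean(array, axis=axis)
    # Fraction of masked input points behind each output point.
    masked_fraction = ma.getmaskarray(array).sum(axis=axis) / array.shape[axis]
    return ma.masked_array(ma.getdata(result), masked_fraction > mdtol)

data = ma.masked_invalid([[1.0, 2.0, np.nan],
                          [np.nan, np.nan, 3.0]])
# Row 0: 1/3 masked (within tolerance) -> mean of [1, 2] = 1.5
# Row 1: 2/3 masked (over tolerance)   -> masked
print(mean_with_mdtol(data, axis=1, mdtol=0.5))
```

As in the dask version, ``mdtol >= 1.0`` would disable the extra masking entirely, since no fraction can exceed 1.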
- -""" - - -def _build_dask_mdtol_function(dask_stats_function): - """ - Make a wrapped dask statistic function that supports the 'mdtol' keyword. - - 'dask_function' must be a dask statistical function, compatible with the - call signature : "dask_stats_function(data, axis, **kwargs)". - It must be masked-data tolerant, i.e. it ignores masked input points and - performs a calculation on only the unmasked points. - For example, mean([1, --, 2]) = (1 + 2) / 2 = 1.5. - - The returned value is a new function operating on dask arrays. - It has the call signature `stat(data, axis=-1, mdtol=None, **kwargs)`. - - """ - @wraps(dask_stats_function) - def inner_stat(array, axis=-1, mdtol=None, **kwargs): - # Call the statistic to get the basic result (missing-data tolerant). - dask_result = dask_stats_function(array, axis=axis, **kwargs) - if mdtol is None or mdtol >= 1.0: - result = dask_result - else: - # Build a lazy computation to compare the fraction of missing - # input points at each output point to the 'mdtol' threshold. - point_mask_counts = da.sum(da.ma.getmaskarray(array), axis=axis) - points_per_calc = array.size / dask_result.size - masked_point_fractions = point_mask_counts / points_per_calc - boolean_mask = masked_point_fractions > mdtol - # Return an mdtol-masked version of the basic result. 
- result = da.ma.masked_array(da.ma.getdata(dask_result), - boolean_mask) - return result - return inner_stat - MEAN = WeightedAggregator('mean', ma.average, lazy_func=_build_dask_mdtol_function(da.mean)) """ @@ -1534,7 +1525,8 @@ def inner_stat(array, axis=-1, mdtol=None, **kwargs): """ -MIN = Aggregator('minimum', ma.min) +MIN = Aggregator('minimum', ma.min, + lazy_func=_build_dask_mdtol_function(da.min)) """ An :class:`~iris.analysis.Aggregator` instance that calculates the minimum over a :class:`~iris.cube.Cube`, as computed by @@ -1551,6 +1543,24 @@ def inner_stat(array, axis=-1, mdtol=None, **kwargs): """ +MAX = Aggregator('maximum', ma.max, + lazy_func=_build_dask_mdtol_function(da.max)) +""" +An :class:`~iris.analysis.Aggregator` instance that calculates +the maximum over a :class:`~iris.cube.Cube`, as computed by +:func:`numpy.ma.max`. + +**For example**: + +To compute zonal maximums over the *longitude* axis of a cube:: + + result = cube.collapsed('longitude', iris.analysis.MAX) + +This aggregator handles masked data. + +""" + + PEAK = Aggregator('peak', _peak) """ An :class:`~iris.analysis.Aggregator` instance that calculates @@ -1700,7 +1710,8 @@ def inner_stat(array, axis=-1, mdtol=None, **kwargs): """ -SUM = WeightedAggregator('sum', _sum) +SUM = WeightedAggregator('sum', iris._lazy_data.non_lazy(_lazy_sum), + lazy_func=_build_dask_mdtol_function(_lazy_sum)) """ An :class:`~iris.analysis.Aggregator` instance that calculates the sum over a :class:`~iris.cube.Cube`, as computed by :func:`numpy.ma.sum`. diff --git a/lib/iris/analysis/_grid_angles.py b/lib/iris/analysis/_grid_angles.py new file mode 100644 index 0000000000..90876840af --- /dev/null +++ b/lib/iris/analysis/_grid_angles.py @@ -0,0 +1,451 @@ +# (C) British Crown Copyright 2010 - 2018, Met Office +# +# This file is part of Iris. 
+# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see <http://www.gnu.org/licenses/>. +""" +Code to implement vector rotation by angles, and inferring gridcell angles +from coordinate points and bounds. + +""" +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +import numpy as np + +import cartopy.crs as ccrs +import iris + + +def _3d_xyz_from_latlon(lon, lat): + """ + Return locations of (lon, lat) in 3D space. + + Args: + + * lon, lat: (float array) + Arrays of longitudes and latitudes, in degrees. + Both the same shape. + + Returns: + + * xyz : (array, dtype=float64) + Cartesian coordinates on a unit sphere. + Shape is (3,) + the input shape. + The x / y / z coordinates are in xyz[0 / 1 / 2]. + + """ + lon1 = np.deg2rad(lon).astype(np.float64) + lat1 = np.deg2rad(lat).astype(np.float64) + + x = np.cos(lat1) * np.cos(lon1) + y = np.cos(lat1) * np.sin(lon1) + z = np.sin(lat1) + + result = np.concatenate([array[np.newaxis] for array in (x, y, z)]) + + return result + + +def _latlon_from_xyz(xyz): + """ + Return arrays of lons+lats angles from xyz locations. + + Args: + + * xyz: (array) + Array of 3-D cartesian coordinates. + Shape (3,) + any shape. + x / y / z values are in xyz[0 / 1 / 2]. + + Returns: + + * lonlat : (array) + longitude and latitude position angles, in degrees. + Shape (2,) + the input shape. + The longitudes / latitudes are in lonlat[0 / 1].
+ + """ + lons = np.rad2deg(np.arctan2(xyz[1], xyz[0])) + radii = np.sqrt(np.sum(xyz * xyz, axis=0)) + lats = np.rad2deg(np.arcsin(xyz[2] / radii)) + return np.array([lons, lats]) + + +def _angle(p, q, r): + """ + Estimate grid-angles to true-Eastward direction from positions in the same + grid row, but at increasing column (grid-Eastward) positions. + + {P, Q, R} are locations of consecutive points in the same grid row. + These could be successive points in a single grid, + e.g. {T(i-1,j), T(i,j), T(i+1,j)} + or a mixture of U/V and T gridpoints if row positions are aligned, + e.g. {v(i,j), f(i,j), v(i+1,j)}. + + Method: + + Calculate dot product of a unit-vector parallel to P-->R, unit-scaled, + with the unit eastward (true longitude) vector at Q. + This value is cos(required angle). + Discriminate between +/- angles by comparing latitudes of P and R. + Return NaN where any P-->R are zero. + + .. NOTE:: + + This method assumes that the vector PR is parallel to the surface + at the longitude of Q, as it uses the length of PR as the basis for + the cosine ratio. + That is only exact when Q is at the same longitude as the midpoint + of PR, and this typically causes errors which grow with increasing + gridcell angle. + However, we retain this method because it reproduces the "standard" + gridcell-orientation-angle arrays found in files output by the CICE + model, which presumably uses an equivalent calculation. + + Args: + + * p, q, r : (float array) + Arrays of angles, in degrees. + All the same shape. + Shape is (2,) + the grid shape. + Longitudes / latitudes are in array[0 / 1]. + + Returns: + + * angle : (float array) + Grid angles relative to true-East, in degrees. + Positive when grid-East is anticlockwise from true-East. + Shape is the same as the input arrays.
+ + """ + mid_lons = np.deg2rad(q[0]) + + pr = _3d_xyz_from_latlon(r[0], r[1]) - _3d_xyz_from_latlon(p[0], p[1]) + pr_norm = np.sqrt(np.sum(pr**2, axis=0)) + pr_top = pr[1] * np.cos(mid_lons) - pr[0] * np.sin(mid_lons) + + index = pr_norm == 0 + pr_norm[index] = 1 + + cosine = np.maximum(np.minimum(pr_top / pr_norm, 1), -1) + cosine[index] = 0 + + psi = np.arccos(cosine) * np.sign(r[1] - p[1]) + psi[index] = np.nan + + return np.rad2deg(psi) + + +def gridcell_angles(x, y=None, cell_angle_boundpoints='mid-lhs, mid-rhs'): + """ + Calculate gridcell orientations for an arbitrary 2-dimensional grid. + + The input grid is defined by two 2-dimensional coordinate arrays with the + same dimensions (ny, nx), specifying the geolocations of a 2D mesh. + + Input values may be coordinate points (ny, nx) or bounds (ny, nx, 4). + However, if points, the edges in the X direction are assumed to be + connected by wraparound. + + Input can be either two arrays, two coordinates, or a single cube + containing two suitable coordinates identified with the 'x' and 'y' axes. + + Args: + + The inputs (x [,y]) can be any of the following : + + * x (:class:`~iris.cube.Cube`): + a grid cube with 2D X and Y coordinates, identified by 'axis'. + The coordinates must be 2-dimensional with the same shape. + The two dimensions represent grid dimensions in the order Y, then X. + + * x, y (:class:`~iris.coords.Coord`): + X and Y coordinates, specifying grid locations on the globe. + The coordinates must be 2-dimensional with the same shape. + The two dimensions represent grid dimensions in the order Y, then X. + If there is no coordinate system, they are assumed to be true + longitudes and latitudes. Units must be convertible to 'degrees'. + + * x, y (2-dimensional arrays of same shape (ny, nx)): + longitude and latitude cell center locations, in degrees. + The two dimensions represent grid dimensions in the order Y, then X.
+ + * x, y (3-dimensional arrays of same shape (ny, nx, 4)): + longitude and latitude cell bounds, in degrees. + The first two dimensions are grid dimensions in the order Y, then X. + The last index maps cell corners anticlockwise from bottom-left. + + Optional Args: + + * cell_angle_boundpoints (string): + Controls which gridcell bounds locations are used to calculate angles, + if the inputs are bounds or bounded coordinates. + Valid values are 'lower-left, lower-right', which takes the angle from + the lower left to the lower right corner, and 'mid-lhs, mid-rhs' which + takes the angle between the averages of the left-hand and right-hand + pairs of corners. The default is 'mid-lhs, mid-rhs'. + + Returns: + + angles : (2-dimensional cube) + + Cube of angles of grid-x vector from true Eastward direction for + each gridcell, in degrees. + It also has "true" longitude and latitude coordinates, with no + coordinate system. + When the input has coords, then the output ones are identical if + the inputs are true-latlons, otherwise they are transformed + true-latlon versions. + When the input has bounded coords, then the output coords have + matching bounds and centrepoints (possibly transformed). + When the input is 2d arrays, or has unbounded coords, then the + output coords have matching points and no bounds. + When the input is 3d arrays, then the output coords have matching + bounds, and the centrepoints are an average of the 4 boundpoints. + + """ + cube = None + if hasattr(x, 'add_aux_coord'): + # Passed a cube : extract 'x' and 'y' axis coordinates. + cube = x # Save for later checking. + x, y = cube.coord(axis='x'), cube.coord(axis='y') + + # Now should have either 2 coords or 2 arrays. + if not hasattr(x, 'shape') or not hasattr(y, 'shape'): + msg = ('Inputs (x,y) must have array shape property. '
+ 'Got type(x)={} and type(y)={}.') + raise ValueError(msg.format(type(x), type(y))) + + x_coord, y_coord = None, None + if hasattr(x, 'bounds') and hasattr(y, 'bounds'): + # x and y are Coords. + x_coord, y_coord = x.copy(), y.copy() + + # They must be angles : convert into degrees + for coord in (x_coord, y_coord): + if not coord.units.is_convertible('degrees'): + msg = ('Input X and Y coordinates must have angular ' + 'units. Got units of "{!s}" and "{!s}".') + raise ValueError(msg.format(x_coord.units, y_coord.units)) + coord.convert_units('degrees') + + if x_coord.ndim != 2 or y_coord.ndim != 2: + msg = ('Coordinate inputs must have 2-dimensional shape. ' + 'Got x-shape of {} and y-shape of {}.') + raise ValueError(msg.format(x_coord.shape, y_coord.shape)) + if x_coord.shape != y_coord.shape: + msg = ('Coordinate inputs must have same shape. ' + 'Got x-shape of {} and y-shape of {}.') + raise ValueError(msg.format(x_coord.shape, y_coord.shape)) + if cube: + x_dims, y_dims = (cube.coord_dims(co) for co in (x, y)) + if x_dims != y_dims: + msg = ('X and Y coordinates must have the same cube ' + 'dimensions. Got x-dims = {} and y-dims = {}.') + raise ValueError(msg.format(x_dims, y_dims)) + cs = x_coord.coord_system + if y_coord.coord_system != cs: + msg = ('Coordinate inputs must have same coordinate system. ' + 'Got x of {} and y of {}.') + raise ValueError(msg.format(cs, y_coord.coord_system)) + + # Base calculation on bounds if we have them, or points as a fallback. + if x_coord.has_bounds() and y_coord.has_bounds(): + x, y = x_coord.bounds, y_coord.bounds + else: + x, y = x_coord.points, y_coord.points + + # Make sure these arrays are ordinary lats+lons, in degrees. + if cs is not None: + # Transform points into true lats + lons. + crs_src = cs.as_cartopy_crs() + crs_pc = ccrs.PlateCarree() + + def transform_xy_arrays(x, y): + # Note: flatten, as transform_points is limited to 2D arrays. 
+ shape = x.shape + x, y = (arr.flatten() for arr in (x, y)) + pts = crs_pc.transform_points(crs_src, x, y) + x = pts[..., 0].reshape(shape) + y = pts[..., 1].reshape(shape) + return x, y + + # Transform the main reference points into standard lats+lons. + x, y = transform_xy_arrays(x, y) + + # Likewise replace the original coordinates with transformed ones, + # because the calculation also needs the centrepoint values. + xpts, ypts = (coord.points for coord in (x_coord, y_coord)) + xbds, ybds = (coord.bounds for coord in (x_coord, y_coord)) + xpts, ypts = transform_xy_arrays(xpts, ypts) + xbds, ybds = transform_xy_arrays(xbds, ybds) + x_coord = iris.coords.AuxCoord( + points=xpts, bounds=xbds, + standard_name='longitude', units='degrees') + y_coord = iris.coords.AuxCoord( + points=ypts, bounds=ybds, + standard_name='latitude', units='degrees') + + elif hasattr(x, 'bounds') or hasattr(y, 'bounds'): + # One was a Coord, and the other not ? + is_and_not = ('x', 'y') + if hasattr(y, 'bounds'): + is_and_not = reversed(is_and_not) + msg = 'Input {!r} is a Coordinate, but {!r} is not.' + raise ValueError(msg.format(*is_and_not)) + + # Now have either 2 points arrays (ny, nx) or 2 bounds arrays (ny, nx, 4). + # Construct (lhs, mid, rhs) where these represent 3 points at increasing + # grid-x indices (columns). + # Also make suitable X and Y coordinates for the result cube. + if x.ndim == 2: + # Data is points arrays. + # Use previous + subsequent points along grid-x-axis as references. + + # PROBLEM: we assume that the rhs connects to the lhs, so we should + # really only use this if data is full-longitudes (as a 'circular' + # coordinate). + # This is mentioned in the docstring, but we have no general means of + # checking it. + + # NOTE: we take the 2d grid as presented, so the second dimension is + # the 'X' dim. Again, that is implicit + can't be checked. 
+        mid = np.array([x, y])
+        lhs = np.roll(mid, 1, 2)
+        rhs = np.roll(mid, -1, 2)
+        if not x_coord:
+            # Create coords for result cube : with no bounds.
+            y_coord = iris.coords.AuxCoord(y, standard_name='latitude',
+                                           units='degrees')
+            x_coord = iris.coords.AuxCoord(x, standard_name='longitude',
+                                           units='degrees')
+    else:
+        # Data is bounds arrays.
+        # Use gridcell corners at different grid-x positions as references.
+        # NOTE: so with bounds, we *don't* need full circular longitudes.
+        xyz = _3d_xyz_from_latlon(x, y)
+        # Support two different choices of reference point locations.
+        angle_boundpoints_vals = {'mid-lhs, mid-rhs': '03_to_12',
+                                  'lower-left, lower-right': '0_to_1'}
+        bounds_pos = angle_boundpoints_vals.get(cell_angle_boundpoints)
+        if bounds_pos == '0_to_1':
+            lhs_xyz = xyz[..., 0]
+            rhs_xyz = xyz[..., 1]
+        elif bounds_pos == '03_to_12':
+            lhs_xyz = 0.5 * (xyz[..., 0] + xyz[..., 3])
+            rhs_xyz = 0.5 * (xyz[..., 1] + xyz[..., 2])
+        else:
+            msg = ('unrecognised cell_angle_boundpoints of "{}", '
+                   'must be one of {}')
+            raise ValueError(msg.format(cell_angle_boundpoints,
+                                        list(angle_boundpoints_vals.keys())))
+        if not x_coord:
+            # Create bounded coords for result cube.
+            # Use average of lhs+rhs points in 3d to get 'mid' points,
+            # as coords without points are not allowed.
+            mid_xyz = 0.5 * (lhs_xyz + rhs_xyz)
+            mid_latlons = _latlon_from_xyz(mid_xyz)
+            # Create coords with given bounds, and averaged centrepoints.
+            x_coord = iris.coords.AuxCoord(
+                points=mid_latlons[0], bounds=x,
+                standard_name='longitude', units='degrees')
+            y_coord = iris.coords.AuxCoord(
+                points=mid_latlons[1], bounds=y,
+                standard_name='latitude', units='degrees')
+
+        # Convert lhs and rhs points back to latlon form -- IN DEGREES !
+        lhs = _latlon_from_xyz(lhs_xyz)
+        rhs = _latlon_from_xyz(rhs_xyz)
+        # 'mid' is coord.points, whether from input or just made up.
+        mid = np.array([x_coord.points, y_coord.points])
+
+    # Do the angle calcs, and return as a suitable cube.
+    angles = _angle(lhs, mid, rhs)
+    result = iris.cube.Cube(angles,
+                            long_name='gridcell_angle_from_true_east',
+                            units='degrees')
+    result.add_aux_coord(x_coord, (0, 1))
+    result.add_aux_coord(y_coord, (0, 1))
+    return result
+
+
+def rotate_grid_vectors(u_cube, v_cube, grid_angles_cube=None,
+                        grid_angles_kwargs=None):
+    """
+    Rotate distance vectors from grid-oriented to true-latlon-oriented.
+
+    Can also rotate by arbitrary angles, if they are passed in.
+
+    .. Note::
+
+        This operation overlaps somewhat in function with
+        :func:`iris.analysis.cartography.rotate_winds`.
+        However, that routine only rotates vectors according to
+        transformations between coordinate systems.
+        This function, by contrast, can rotate vectors by arbitrary angles.
+        Most commonly, the angles are estimated solely from grid sampling
+        points, using :func:`gridcell_angles`. This allows operation on
+        complex meshes defined by two-dimensional coordinates, such as most
+        ocean grids.
+
+    Args:
+
+    * u_cube, v_cube : (cube)
+        Cubes of grid-u and grid-v vector components.
+        Units should be differentials of true-distance, e.g. 'm/s'.
+
+    Optional Args:
+
+    * grid_angles_cube : (cube)
+        Gridcell orientation angles.
+        Units must be angular, i.e. can be converted to 'radians'.
+        If not provided, grid angles are estimated from 'u_cube' using the
+        :func:`gridcell_angles` function.
+
+    * grid_angles_kwargs : (dict or None)
+        Additional keyword args to be passed to the :func:`gridcell_angles`
+        function, if it is used.
+
+    Returns:
+
+        true_u, true_v : (cube)
+            Cubes of true-north oriented vector components.
+            Units are the same as the inputs.
+
+    .. Note::
+
+        Vector magnitudes will always be the same as the inputs.
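+
+    For example, to rotate grid-aligned vector components onto true-east and
+    true-north directions (an illustrative sketch only -- 'u_cube' and
+    'v_cube' here stand for any suitable pair of component cubes, and are not
+    names from this change)::
+
+        angles = gridcell_angles(u_cube)
+        true_u, true_v = rotate_grid_vectors(u_cube, v_cube,
+                                             grid_angles_cube=angles)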
+
+    """
+    u_out, v_out = (cube.copy() for cube in (u_cube, v_cube))
+    if not grid_angles_cube:
+        grid_angles_kwargs = grid_angles_kwargs or {}
+        grid_angles_cube = gridcell_angles(u_cube, **grid_angles_kwargs)
+    gridangles = grid_angles_cube.copy()
+    gridangles.convert_units('radians')
+    uu, vv, aa = (cube.data for cube in (u_out, v_out, gridangles))
+    mags = np.sqrt(uu*uu + vv*vv)
+    angs = np.arctan2(vv, uu) + aa
+    uu, vv = mags * np.cos(angs), mags * np.sin(angs)
+
+    # Promote all to masked arrays, and also apply mask at bad (NaN) angles.
+    mask = np.isnan(aa)
+    for cube in (u_out, v_out, gridangles):
+        if hasattr(cube.data, 'mask'):
+            mask |= cube.data.mask
+    u_out.data = np.ma.masked_array(uu, mask=mask)
+    v_out.data = np.ma.masked_array(vv, mask=mask)
+
+    return u_out, v_out
diff --git a/lib/iris/analysis/_regrid.py b/lib/iris/analysis/_regrid.py
index 71304a5dde..83561a2b94 100644
--- a/lib/iris/analysis/_regrid.py
+++ b/lib/iris/analysis/_regrid.py
@@ -1,4 +1,4 @@
-# (C) British Crown Copyright 2014 - 2017, Met Office
+# (C) British Crown Copyright 2014 - 2018, Met Office
 #
 # This file is part of Iris.
 #
diff --git a/lib/iris/analysis/cartography.py b/lib/iris/analysis/cartography.py
index 4b68dcc949..a778f5f846 100644
--- a/lib/iris/analysis/cartography.py
+++ b/lib/iris/analysis/cartography.py
@@ -1,4 +1,4 @@
-# (C) British Crown Copyright 2010 - 2017, Met Office
+# (C) British Crown Copyright 2010 - 2018, Met Office
 #
 # This file is part of Iris.
 #
@@ -37,7 +37,24 @@
 import iris.coord_systems
 import iris.exceptions
 from iris.util import _meshgrid
-
+from ._grid_angles import gridcell_angles, rotate_grid_vectors
+
+# List of contents to control Sphinx autodocs.
+# Unfortunately essential to get docs for the grid_angles functions.
+__all__ = [ + 'area_weights', + 'cosine_latitude_weights', + 'get_xy_contiguous_bounded_grids', + 'get_xy_grids', + 'gridcell_angles', + 'project', + 'rotate_grid_vectors', + 'rotate_pole', + 'rotate_winds', + 'unrotate_pole', + 'wrap_lons', + 'DistanceDifferential', + 'PartialDifferential'] # This value is used as a fall-back if the cube does not define the earth DEFAULT_SPHERICAL_EARTH_RADIUS = 6367470 @@ -332,7 +349,7 @@ def _quadrant_area(radian_lat_bounds, radian_lon_bounds, radius_of_earth): def area_weights(cube, normalize=False): - """ + r""" Returns an array of area weights, with the same dimensions as the cube. This is a 2D lat/lon area weights array, repeated over the non lat/lon @@ -440,7 +457,7 @@ def area_weights(cube, normalize=False): def cosine_latitude_weights(cube): - """ + r""" Returns an array of latitude weights, with the same dimensions as the cube. The weights are the cosine of latitude. @@ -950,7 +967,7 @@ def _transform_distance_vectors_tolerance_mask(src_crs, x, y, tgt_crs, def rotate_winds(u_cube, v_cube, target_cs): - """ + r""" Transform wind vectors to a different coordinate system. The input cubes contain U and V components parallel to the local X and Y diff --git a/lib/iris/analysis/geometry.py b/lib/iris/analysis/geometry.py index a415e10091..cca5d836ec 100644 --- a/lib/iris/analysis/geometry.py +++ b/lib/iris/analysis/geometry.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2010 - 2015, Met Office +# (C) British Crown Copyright 2010 - 2018, Met Office # # This file is part of Iris. # @@ -223,7 +223,7 @@ def geometry_area_weights(cube, geometry, normalize=False): else: slices.append(slice(None)) - weights[slices] = subweights + weights[tuple(slices)] = subweights # Fix for the limitation of iris.analysis.MEAN weights handling. 
 # Broadcast the array to the full shape of the cube
diff --git a/lib/iris/coords.py b/lib/iris/coords.py
index e4a8eb35f2..6402e73263 100644
--- a/lib/iris/coords.py
+++ b/lib/iris/coords.py
@@ -143,6 +143,47 @@
     'groupby_point, groupby_slice')

+
+def _get_2d_coord_bound_grid(bounds):
+    """
+    Creates a grid using the bounds of a 2D coordinate with 4-sided cells.
+
+    Assumes that the four vertices of the cells are in an anti-clockwise order
+    (bottom-left, bottom-right, top-right, top-left).
+
+    Selects the zeroth vertex of each cell. A final column is added, which
+    contains the first vertex of the cells in the final column. A final row
+    is added, which contains the third vertex of all the cells in the final
+    row, except in the final column, where it uses the second vertex.
+    e.g.
+    # 0-0-0-0-1
+    # 0-0-0-0-1
+    # 3-3-3-3-2
+
+    Args:
+    * bounds: (array)
+        Coordinate bounds array of shape (Y, X, 4)
+
+    Returns:
+    * grid: (array)
+        Grid of shape (Y+1, X+1)
+
+    """
+    # Check that bounds has the shape (ny, nx, 4).
+    if not (bounds.ndim == 3 and bounds.shape[-1] == 4):
+        raise ValueError('Bounds for 2D coordinates must be 3-dimensional and '
+                         'have 4 bounds per point.')
+
+    bounds_shape = bounds.shape
+    result = np.zeros((bounds_shape[0] + 1, bounds_shape[1] + 1))
+
+    result[:-1, :-1] = bounds[:, :, 0]
+    result[:-1, -1] = bounds[:, -1, 1]
+    result[-1, :-1] = bounds[-1, :, 3]
+    result[-1, -1] = bounds[-1, -1, 2]
+
+    return result
+
+
 class Cell(collections.namedtuple('Cell', ['point', 'bound'])):
     """
     An immutable representation of a single cell of a coordinate, including the
@@ -741,7 +782,11 @@ def __str__(self):
         points = self._str_dates(self.points)
         bounds = ''
         if self.has_bounds():
-            bounds = ', bounds=' + self._str_dates(self.bounds)
+            if self.units.is_long_time_interval():
+                bounds_vals = self.bounds
+            else:
+                bounds_vals = self._str_dates(self.bounds)
+            bounds = ', bounds={vals}'.format(vals=bounds_vals)
         result = 
fmt.format(self=self, cls=type(self).__name__, points=points, bounds=bounds, other_metadata=self._repr_other_metadata()) @@ -950,22 +995,101 @@ def cells(self): """ return _CellIterator(self) - def _sanity_check_contiguous(self): - if self.ndim != 1: - raise iris.exceptions.CoordinateMultiDimError( - 'Invalid operation for {!r}. Contiguous bounds are not defined' - ' for multi-dimensional coordinates.'.format(self.name())) - if self.nbounds != 2: - raise ValueError( - 'Invalid operation for {!r}, with {} bounds. Contiguous bounds' - ' are only defined for coordinates with 2 bounds.'.format( - self.name(), self.nbounds)) + def _sanity_check_bounds(self): + if self.ndim == 1: + if self.nbounds != 2: + raise ValueError('Invalid operation for {!r}, with {} ' + 'bound(s). Contiguous bounds are only ' + 'defined for 1D coordinates with 2 ' + 'bounds.'.format(self.name(), self.nbounds)) + elif self.ndim == 2: + if self.nbounds != 4: + raise ValueError('Invalid operation for {!r}, with {} ' + 'bound(s). Contiguous bounds are only ' + 'defined for 2D coordinates with 4 ' + 'bounds.'.format(self.name(), self.nbounds)) + else: + raise ValueError('Invalid operation for {!r}. Contiguous bounds ' + 'are not defined for coordinates with more than ' + '2 dimensions.'.format(self.name())) + + def _discontiguity_in_bounds(self, rtol=1e-05, atol=1e-08): + """ + Checks that the bounds of the coordinate are contiguous. + + Kwargs: + * rtol: (float) + Relative tolerance that is used when checking contiguity. Defaults + to 1e-5. + * atol: (float) + Absolute tolerance that is used when checking contiguity. Defaults + to 1e-8. + + Returns: + * contiguous: (boolean) + True if there are no discontiguities. + * diffs: (array or tuple of arrays) + The diffs along the bounds of the coordinate. 
If self is a 2D + coord of shape (Y, X), a tuple of arrays is returned, where the + first is an array of differences along the x-axis, of the shape + (Y, X-1) and the second is an array of differences along the + y-axis, of the shape (Y-1, X). + + """ + self._sanity_check_bounds() + + if self.ndim == 1: + contiguous = np.allclose(self.bounds[1:, 0], + self.bounds[:-1, 1], + rtol=rtol, atol=atol) + diffs = np.abs(self.bounds[:-1, 1] - self.bounds[1:, 0]) + + elif self.ndim == 2: + def mod360_adjust(compare_axis): + bounds = self.bounds.copy() + + if compare_axis == 'x': + upper_bounds = bounds[:, :-1, 1] + lower_bounds = bounds[:, 1:, 0] + elif compare_axis == 'y': + upper_bounds = bounds[:-1, :, 3] + lower_bounds = bounds[1:, :, 0] + + if self.name() in ['longitude', 'grid_longitude']: + # If longitude, adjust for longitude wrapping + diffs = upper_bounds - lower_bounds + index = diffs > 180 + if index.any(): + sign = np.sign(diffs) + modification = (index.astype(int) * 360) * sign + upper_bounds -= modification + + diffs_along_axis = np.abs(upper_bounds - lower_bounds) + contiguous_along_axis = np.allclose(upper_bounds, lower_bounds, + rtol=rtol, atol=atol) + return diffs_along_axis, contiguous_along_axis + + diffs_along_x, match_cell_x1 = mod360_adjust(compare_axis='x') + diffs_along_y, match_cell_y1 = mod360_adjust(compare_axis='y') + + contiguous = match_cell_x1 and match_cell_y1 + diffs = (diffs_along_x, diffs_along_y) + + return contiguous, diffs def is_contiguous(self, rtol=1e-05, atol=1e-08): """ Return True if, and only if, this Coord is bounded with contiguous bounds to within the specified relative and absolute tolerances. + 1D coords are contiguous if the upper bound of a cell aligns, + within a tolerance, to the lower bound of the next cell along. 
+ + 2D coords, with 4 bounds, are contiguous if the lower right corner of + each cell aligns with the lower left corner of the cell to the right of + it, and the upper left corner of each cell aligns with the lower left + corner of the cell above it. + Args: * rtol: @@ -978,33 +1102,49 @@ def is_contiguous(self, rtol=1e-05, atol=1e-08): """ if self.has_bounds(): - self._sanity_check_contiguous() - return np.allclose(self.bounds[1:, 0], self.bounds[:-1, 1], - rtol=rtol, atol=atol) + contiguous, _ = self._discontiguity_in_bounds(rtol=rtol, atol=atol) else: - return False + contiguous = False + return contiguous def contiguous_bounds(self): """ - Returns the N+1 bound values for a contiguous bounded coordinate - of length N. + Returns the N+1 bound values for a contiguous bounded 1D coordinate + of length N, or the (N+1, M+1) bound values for a contiguous bounded 2D + coordinate of shape (N, M). + + Only 1D or 2D coordinates are supported. .. note:: - If the coordinate is does not have bounds, this method will + If the coordinate has bounds, this method assumes they are + contiguous. + + If the coordinate is 1D and does not have bounds, this method will return bounds positioned halfway between the coordinate's points. + If the coordinate is 2D and does not have bounds, an error will be + raised. + """ if not self.has_bounds(): - warnings.warn('Coordinate {!r} is not bounded, guessing ' - 'contiguous bounds.'.format(self.name())) - bounds = self._guess_bounds() + if self.ndim == 1: + warnings.warn('Coordinate {!r} is not bounded, guessing ' + 'contiguous bounds.'.format(self.name())) + bounds = self._guess_bounds() + elif self.ndim == 2: + raise ValueError('2D coordinate {!r} is not bounded. 
Guessing ' + 'bounds of 2D coords is not currently ' + 'supported.'.format(self.name())) else: - self._sanity_check_contiguous() + self._sanity_check_bounds() bounds = self.bounds - c_bounds = np.resize(bounds[:, 0], bounds.shape[0] + 1) - c_bounds[-1] = bounds[-1, 1] + if self.ndim == 1: + c_bounds = np.resize(bounds[:, 0], bounds.shape[0] + 1) + c_bounds[-1] = bounds[-1, 1] + elif self.ndim == 2: + c_bounds = _get_2d_coord_bound_grid(bounds) return c_bounds def is_monotonic(self): diff --git a/lib/iris/cube.py b/lib/iris/cube.py index 9c42299c42..1e7fcb7e02 100644 --- a/lib/iris/cube.py +++ b/lib/iris/cube.py @@ -994,6 +994,12 @@ def add_aux_factory(self, aux_factory): if not isinstance(aux_factory, iris.aux_factory.AuxCoordFactory): raise TypeError('Factory must be a subclass of ' 'iris.aux_factory.AuxCoordFactory.') + cube_coords = self.coords() + for dependency in aux_factory.dependencies: + ref_coord = aux_factory.dependencies[dependency] + if ref_coord is not None and ref_coord not in cube_coords: + msg = "{} coordinate for factory is not present on cube {}" + raise ValueError(msg.format(ref_coord.name(), self.name())) self._aux_factories.append(aux_factory) def add_cell_measure(self, cell_measure, data_dims=None): @@ -3241,10 +3247,10 @@ def collapsed(self, coords, aggregator, **kwargs): # on the cube lazy array. # NOTE: do not reform the data in this case, as 'lazy_aggregate' # accepts multiple axes (unlike 'aggregate'). 
- collapse_axis = dims_to_collapse + collapse_axis = list(dims_to_collapse) try: data_result = aggregator.lazy_aggregate(self.lazy_data(), - collapse_axis, + axis=collapse_axis, **kwargs) except TypeError: # TypeError - when unexpected keywords passed through (such as diff --git a/lib/iris/fileformats/_pyke_rules/fc_rules_cf.krb b/lib/iris/fileformats/_pyke_rules/fc_rules_cf.krb index 942450847b..5fc68e58c2 100644 --- a/lib/iris/fileformats/_pyke_rules/fc_rules_cf.krb +++ b/lib/iris/fileformats/_pyke_rules/fc_rules_cf.krb @@ -40,7 +40,7 @@ fc_default # # Context: -# This rule will trigger iff a grid_mapping() case specific fact +# This rule will trigger iff a grid_mapping() case specific fact # has been asserted that refers to a rotated pole. # # Purpose: @@ -120,7 +120,7 @@ fc_provides_grid_mapping_mercator # # Context: -# This rule will trigger iff a grid_mapping() case specific fact +# This rule will trigger iff a grid_mapping() case specific fact # has been asserted that refers to a stereographic. # # Purpose: @@ -199,10 +199,10 @@ fc_provides_grid_mapping_albers_equal_area # # Context: # This rule will trigger iff a coordinate() case specific fact -# has been asserted that refers to a CF latitude coordinate. +# has been asserted that refers to a CF latitude coordinate. # # Purpose: -# Assert that the CF latitude coordinate exists. +# Assert that the CF latitude coordinate exists. # fc_provides_coordinate_latitude foreach @@ -435,10 +435,10 @@ fc_build_auxiliary_coordinate_longitude_rotated # # Context: # This rule will trigger for each auxiliary_coordinate() case specific fact -# that is not a spatio-temporal related auxiliary coordinate. +# that is not a spatio-temporal related auxiliary coordinate. # # Purpose: -# Add the auxiliary coordinate to the cube. +# Add the auxiliary coordinate to the cube. # fc_build_auxiliary_coordinate foreach @@ -457,7 +457,7 @@ fc_build_auxiliary_coordinate # This rule will trigger for each cell_measure case specific fact. 
# # Purpose: -# Add the cell measures attribute to the cube. +# Add the cell measures attribute to the cube. # fc_build_cell_measure foreach @@ -549,7 +549,7 @@ fc_build_coordinate_longitude_rotated python build_dimension_coordinate(engine, cf_coord_var, coord_name=CF_VALUE_STD_NAME_GRID_LON, coord_system=engine.provides['coordinate_system']) - python engine.rule_triggered.add(rule.name) + python engine.rule_triggered.add(rule.name) # @@ -597,7 +597,7 @@ fc_build_coordinate_longitude_nocs coord_system=None) python engine.rule_triggered.add(rule.name) - + # # Context: # This rule will trigger iff a projection_x_coordinate coordinate exists and @@ -917,8 +917,8 @@ fc_attribute_ukmo__process_flags python attr_value = engine.cf_var.ukmo__process_flags python engine.cube.attributes['ukmo__process_flags'] = tuple([x.replace("_", " ") for x in attr_value.split(" ")]) python engine.rule_triggered.add(rule.name) - - + + # # Context: # This rule will trigger iff a formula term that refers to a @@ -1070,14 +1070,14 @@ fc_extras import iris.coord_systems import iris.fileformats.cf as cf import iris.fileformats.netcdf - from iris.fileformats.netcdf import parse_cell_methods, UnknownCellMethodWarning + from iris.fileformats.netcdf import _get_cf_var_data, parse_cell_methods, UnknownCellMethodWarning import iris.fileformats.pp as pp import iris.exceptions import iris.std_names import iris.util from iris._lazy_data import as_lazy_data - - + + # # UD Units Constants (based on Unidata udunits.dat definition file) # @@ -1087,7 +1087,7 @@ fc_extras UD_UNITS_LON = ['degrees_east', 'degree_east', 'degree_e', 'degrees_e', 'degreee', 'degreese', 'degrees', 'degrees east', 'degree east', 'degree e', 'degrees e'] - + # # CF Dimensionless Vertical Coordinates # @@ -1119,7 +1119,7 @@ fc_extras CF_GRID_MAPPING_STEREO = 'stereographic' CF_GRID_MAPPING_TRANSVERSE = 'transverse_mercator' CF_GRID_MAPPING_VERTICAL = 'vertical_perspective' - + # # CF Attribute Names. 
# @@ -1149,7 +1149,7 @@ fc_extras CF_ATTR_LONG_NAME = 'long_name' CF_ATTR_UNITS = 'units' CF_ATTR_CELL_METHODS = 'cell_methods' - + # # CF Attribute Value Constants. # @@ -1158,11 +1158,11 @@ fc_extras CF_VALUE_AXIS_Y = 'y' CF_VALUE_AXIS_T = 't' CF_VALUE_AXIS_Z = 'z' - - + + # Attribute - positive. CF_VALUE_POSITIVE = ['down', 'up'] - + # Attribute - standard_name. CF_VALUE_STD_NAME_LAT = 'latitude' CF_VALUE_STD_NAME_LON = 'longitude' @@ -1213,7 +1213,7 @@ fc_extras msg = msg.replace('variable', 'variable {!r}'.format(name)) warnings.warn(message=msg, category=UnknownCellMethodWarning) - # Set the cube global attributes. + # Set the cube global attributes. for attr_name, attr_value in six.iteritems(cf_var.cf_group.global_attributes): try: if six.PY2 and isinstance(attr_value, six.text_type): @@ -1242,7 +1242,7 @@ fc_extras # Check for a default spherical earth. if major is None and minor is None and inverse_flattening is None: - major = getattr(cf_grid_var, CF_ATTR_GRID_EARTH_RADIUS, None) + major = getattr(cf_grid_var, CF_ATTR_GRID_EARTH_RADIUS, None) return major, minor, inverse_flattening @@ -1344,7 +1344,7 @@ fc_extras inverse_flattening is not None: ellipsoid = iris.coord_systems.GeogCS(major, minor, inverse_flattening) - + cs = iris.coord_systems.LambertConformal( latitude_of_projection_origin, longitude_of_central_meridian, false_easting, false_northing, standard_parallel, @@ -1401,7 +1401,7 @@ fc_extras # values for false_easting, false_northing, # scale_factor_at_projection_origin and standard_parallel. These are # checked elsewhere. - + ellipsoid = None if major is not None or minor is not None or \ inverse_flattening is not None: @@ -1621,7 +1621,7 @@ fc_extras attr_units = get_attr_units(cf_coord_var, attributes) points_data = cf_coord_var[:] # Gracefully fill points masked array. 
- if ma.isMaskedArray(points_data): + if ma.is_masked(points_data): points_data = ma.filled(points_data) msg = 'Gracefully filling {!r} dimension coordinate masked points' warnings.warn(msg.format(str(cf_coord_var.cf_name))) @@ -1631,7 +1631,7 @@ fc_extras if cf_bounds_var is not None: bounds_data = cf_bounds_var[:] # Gracefully fill bounds masked array. - if ma.isMaskedArray(bounds_data): + if ma.is_masked(bounds_data): bounds_data = ma.filled(bounds_data) msg = 'Gracefully filling {!r} dimension coordinate masked bounds' warnings.warn(msg.format(str(cf_coord_var.cf_name))) @@ -1696,7 +1696,7 @@ fc_extras else: # Scalar coords are placed in the aux_coords container. cube.add_aux_coord(coord, data_dims) - + # Update the coordinate to CF-netCDF variable mapping. engine.provides['coordinates'].append((coord, cf_coord_var.cf_name)) @@ -1712,25 +1712,16 @@ fc_extras # Get units attr_units = get_attr_units(cf_coord_var, attributes) - def cf_var_as_array(cf_var): - dtype = iris.fileformats.netcdf._get_actual_dtype(cf_var) - fill_value = getattr(cf_var.cf_data, '_FillValue', - netCDF4.default_fillvals[dtype.str[1:]]) - proxy = iris.fileformats.netcdf.NetCDFDataProxy( - cf_var.shape, dtype, engine.filename, - cf_var.cf_name, fill_value) - return as_lazy_data(proxy) - # Get any coordinate point data. if isinstance(cf_coord_var, cf.CFLabelVariable): points_data = cf_coord_var.cf_label_data(cf_var) else: - points_data = cf_var_as_array(cf_coord_var) + points_data = _get_cf_var_data(cf_coord_var, engine.filename) # Get any coordinate bounds. cf_bounds_var = get_cf_bounds_var(cf_coord_var) if cf_bounds_var is not None: - bounds_data = cf_var_as_array(cf_bounds_var) + bounds_data = _get_cf_var_data(cf_bounds_var, engine.filename) # Handle transposed bounds where the vertex dimension is not # the last one. Test based on shape to support different @@ -1748,7 +1739,7 @@ fc_extras # and the coordinate being built. 
common_dims = [dim for dim in cf_coord_var.dimensions if dim in cf_var.dimensions] - data_dims = None + data_dims = None if common_dims: # Calculate the offset of each common dimension. data_dims = [cf_var.dimensions.index(dim) for dim in common_dims] @@ -1772,7 +1763,7 @@ fc_extras # Update the coordinate to CF-netCDF variable mapping. engine.provides['coordinates'].append((coord, cf_coord_var.cf_name)) - + ################################################################################ def build_cell_measures(engine, cf_cm_attr, coord_name=None): """Create a CellMeasure instance and add it to the cube.""" @@ -1783,34 +1774,25 @@ fc_extras # Get units attr_units = get_attr_units(cf_cm_attr, attributes) - def cf_var_as_array(cf_var): - dtype = cf_var.dtype - fill_value = getattr(cf_var.cf_data, '_FillValue', - netCDF4.default_fillvals[dtype.str[1:]]) - proxy = iris.fileformats.netcdf.NetCDFDataProxy( - cf_var.shape, dtype, engine.filename, - cf_var.cf_name, fill_value) - return as_lazy_data(proxy) - - data = cf_var_as_array(cf_cm_attr) + data = _get_cf_var_data(cf_cm_attr, engine.filename) # Determine the name of the dimension/s shared between the CF-netCDF data variable # and the coordinate being built. common_dims = [dim for dim in cf_cm_attr.dimensions if dim in cf_var.dimensions] - data_dims = None + data_dims = None if common_dims: # Calculate the offset of each common dimension. data_dims = [cf_var.dimensions.index(dim) for dim in common_dims] # Determine the standard_name, long_name and var_name standard_name, long_name, var_name = get_names(cf_cm_attr, coord_name, attributes) - + # Obtain the cf_measure. 
measure = cf_cm_attr.cf_measure - + # Create the CellMeasure - cell_measure = iris.coords.CellMeasure(data, + cell_measure = iris.coords.CellMeasure(data, standard_name=standard_name, long_name=long_name, var_name=var_name, @@ -1820,17 +1802,17 @@ fc_extras # Add it to the cube cube.add_cell_measure(cell_measure, data_dims) - + ################################################################################ def _is_lat_lon(cf_var, ud_units, std_name, std_name_grid, axis_name, prefixes): """ Determine whether the CF coordinate variable is a latitude/longitude variable. - + Ref: [CF] Section 4.1 Latitude Coordinate. [CF] Section 4.2 Longitude Coordinate. - + """ is_valid = False attr_units = getattr(cf_var, CF_ATTR_UNITS, None) @@ -1872,7 +1854,7 @@ fc_extras def is_latitude(engine, cf_name): """Determine whether the CF coordinate variable is a latitude variable.""" cf_var = engine.cf_var.cf_group[cf_name] - return _is_lat_lon(cf_var, UD_UNITS_LAT, CF_VALUE_STD_NAME_LAT, + return _is_lat_lon(cf_var, UD_UNITS_LAT, CF_VALUE_STD_NAME_LAT, CF_VALUE_STD_NAME_GRID_LAT, CF_VALUE_AXIS_Y, ['lat', 'rlat']) @@ -1928,7 +1910,7 @@ fc_extras is_time_reference = cf_units.Unit(attr_units or 1).is_time_reference() except ValueError: is_time_reference = False - + return is_time_reference and (attr_std_name=='time' or attr_axis.lower()==CF_VALUE_AXIS_T) @@ -1996,7 +1978,7 @@ fc_extras def has_supported_mercator_parameters(engine, cf_name): """Determine whether the CF grid mapping variable has the supported values for the parameters of the Mercator projection.""" - + is_valid = True cf_grid_var = engine.cf_var.cf_group[cf_name] @@ -2036,8 +2018,8 @@ fc_extras ################################################################################ def has_supported_stereographic_parameters(engine, cf_name): """Determine whether the CF grid mapping variable has a value of 1.0 - for the scale_factor_at_projection_origin attribute.""" - + for the scale_factor_at_projection_origin attribute.""" + 
is_valid = True cf_grid_var = engine.cf_var.cf_group[cf_name] @@ -2120,6 +2102,6 @@ fc_extras if len(d[CM_NAME]) != len(comment) and len(comment) == 1: comment = comment*len(d[CM_NAME]) d[CM_INTERVAL] = tuple(interval) - d[CM_COMMENT] = tuple(comment) + d[CM_COMMENT] = tuple(comment) cell_methods.append(iris.coords.CellMethod(d[CM_METHOD], coords=d[CM_NAME], intervals=d[CM_INTERVAL], comments=d[CM_COMMENT])) return tuple(cell_methods) diff --git a/lib/iris/fileformats/netcdf.py b/lib/iris/fileformats/netcdf.py index 6ac9c26811..8c965ef1a5 100644 --- a/lib/iris/fileformats/netcdf.py +++ b/lib/iris/fileformats/netcdf.py @@ -501,8 +501,8 @@ def _get_actual_dtype(cf_var): return dummy_data.dtype -def _load_cube(engine, cf, cf_var, filename): - """Create the cube associated with the CF-netCDF data variable.""" +def _get_cf_var_data(cf_var, filename): + # Get lazy chunked data out of a cf variable. dtype = _get_actual_dtype(cf_var) # Create cube with deferred data, but no metadata @@ -510,7 +510,16 @@ def _load_cube(engine, cf, cf_var, filename): netCDF4.default_fillvals[cf_var.dtype.str[1:]]) proxy = NetCDFDataProxy(cf_var.shape, dtype, filename, cf_var.cf_name, fill_value) - data = as_lazy_data(proxy) + chunks = cf_var.cf_data.chunking() + # Chunks can be an iterable, None, or `'contiguous'`. + if chunks == 'contiguous': + chunks = None + return as_lazy_data(proxy, chunks=chunks) + + +def _load_cube(engine, cf, cf_var, filename): + """Create the cube associated with the CF-netCDF data variable.""" + data = _get_cf_var_data(cf_var, filename) cube = iris.cube.Cube(data) # Reset the pyke inference engine. 
diff --git a/lib/iris/fileformats/pp_save_rules.py b/lib/iris/fileformats/pp_save_rules.py index 84b4980647..5323311061 100644 --- a/lib/iris/fileformats/pp_save_rules.py +++ b/lib/iris/fileformats/pp_save_rules.py @@ -174,7 +174,14 @@ def _general_time_rules(cube, pp): if (time_coord is not None and time_coord.has_bounds() and clim_season_coord is None and - (fp_coord is not None or frt_coord is not None) and + cm_time_mean is not None): + pp.lbtim.ib = 2 + pp.t1 = time_coord.units.num2date(time_coord.bounds[0, 0]) + pp.t2 = time_coord.units.num2date(time_coord.bounds[0, 1]) + + if (time_coord is not None and + time_coord.has_bounds() and + clim_season_coord is None and cm_time_mean is not None and cm_time_mean.intervals != () and cm_time_mean.intervals[0].endswith('hour')): diff --git a/lib/iris/pandas.py b/lib/iris/pandas.py index 244a7c011b..798e034017 100644 --- a/lib/iris/pandas.py +++ b/lib/iris/pandas.py @@ -115,7 +115,7 @@ def as_cube(pandas_array, copy=True, calendars=None): _add_iris_coord(cube, "index", pandas_array.index, 0, calendars.get(0, None)) if pandas_array.ndim == 2: - _add_iris_coord(cube, "columns", pandas_array.columns, 1, + _add_iris_coord(cube, "columns", pandas_array.columns.values, 1, calendars.get(1, None)) return cube diff --git a/lib/iris/plot.py b/lib/iris/plot.py index 64ce54c04c..e3193e96a4 100644 --- a/lib/iris/plot.py +++ b/lib/iris/plot.py @@ -31,13 +31,13 @@ import cartopy.crs as ccrs import cartopy.mpl.geoaxes from cartopy.geodesic import Geodesic +import cftime import matplotlib.axes import matplotlib.collections as mpl_collections import matplotlib.dates as mpl_dates import matplotlib.pyplot as plt from matplotlib.offsetbox import AnchoredText import matplotlib.ticker as mpl_ticker -import cftime import matplotlib.transforms as mpl_transforms import numpy as np import numpy.ma as ma @@ -51,7 +51,6 @@ import iris.palette from iris.util import _meshgrid - # Cynthia Brewer citation text. 
BREWER_CITE = 'Colours based on ColorBrewer.org' @@ -95,15 +94,17 @@ def get_span(coord): else: span = set(cube.coord_dims(coord)) return span + spans = list(map(get_span, coords)) for span, coord in zip(spans, coords): if not span: msg = 'The coordinate {!r} doesn\'t span a data dimension.' raise ValueError(msg.format(coord.name())) - if mode == iris.coords.BOUND_MODE and len(span) != 1: - raise ValueError('The coordinate {!r} is multi-dimensional and' - ' cannot be used in a cell-based plot.' - .format(coord.name())) + if mode == iris.coords.BOUND_MODE and len(span) not in [1, 2]: + raise ValueError('The coordinate {!r} has {} dimensions. ' + 'Cell-based plotting is only supported for ' + 'coordinates with one or two dimensions.' + .format(coord.name(), len(span))) # Check the combination of coordinates spans enough (ndims) data # dimensions. @@ -129,7 +130,7 @@ def get_span(coord): return PlotDefn(plot_coords, transpose) -def _valid_bound_coord(coord): +def _valid_bound_dim_coord(coord): result = None if coord and coord.ndim == 1 and coord.nbounds: result = coord @@ -154,7 +155,7 @@ def _get_plot_defn(cube, mode, ndims=2): # When appropriate, restrict to 1D with bounds.
if mode == iris.coords.BOUND_MODE: - coords = list(map(_valid_bound_coord, coords)) + coords = list(map(_valid_bound_dim_coord, coords)) def guess_axis(coord): axis = None @@ -172,6 +173,19 @@ def guess_axis(coord): aux_coords.sort(key=lambda coord: coord._as_defn()) coords[dim] = aux_coords[0] + # If making a 2-dimensional plot, check for 2-dimensional coordinates + if ndims == 2: + missing_dims = [dim for dim, coord in enumerate(coords) if + coord is None] + if missing_dims: + # Note that this only picks up coordinates that span the dims + two_dim_coords = cube.coords(dimensions=missing_dims) + two_dim_coords = [coord for coord in two_dim_coords + if coord.ndim == 2] + if len(two_dim_coords) >= 2: + two_dim_coords.sort(key=lambda coord: coord._as_defn()) + coords = two_dim_coords[:2] + if mode == iris.coords.POINT_MODE: # Allow multi-dimensional aux_coords to override the dim_coords # along the Z axis. This results in a preference for using the @@ -195,6 +209,7 @@ def sort_key(coord): return (order.get(axis, 0), coords.index(coord), coord and coord.name()) + sorted_coords = sorted(coords, key=sort_key) transpose = (sorted_coords != coords) @@ -256,6 +271,85 @@ def _invert_yaxis(v_coord, axes=None): axes.invert_yaxis() + +def _check_bounds_contiguity_and_mask(coord, data, atol=None): + """ + Checks that any discontiguities in the bounds of the given coordinate only + occur where the data is masked. + + Where a discontiguity occurs the grid created for plotting will not be + correct. This does not matter if the data is masked in that location as + this is not plotted. + + If a discontiguity occurs where the data is *not* masked, an error is + raised. + + Args: + coord: (iris.coords.Coord) + Coordinate whose bounds will be checked for contiguity + data: (array) + Data of the cube we are plotting + atol: + Absolute tolerance when checking the contiguity. Defaults to None.
+ If an absolute tolerance is not set, 1D coords are not checked (so + as to not introduce a breaking change without a major release) but + 2D coords are always checked, by calling + :meth:`iris.coords.Coord._discontiguity_in_bounds` with its default + tolerance. + + """ + data_is_masked = hasattr(data, 'mask') + if data_is_masked: + # When checking the location of the discontiguities, we check against + # the opposite of the mask, which is True where data exists. + mask_invert = np.logical_not(data.mask) + + if coord.ndim == 1: + # 1D coords are only checked if an absolute tolerance is set, to avoid + # introducing a breaking change. + if atol: + contiguous, diffs = coord._discontiguity_in_bounds(atol=atol) + + if not contiguous and data_is_masked: + not_masked_at_discontiguity = np.any( + np.logical_and(mask_invert[:-1], diffs)) + else: + return + + elif coord.ndim == 2: + if atol: + kwargs = {'atol': atol} + else: + kwargs = {} + contiguous, diffs = coord._discontiguity_in_bounds(**kwargs) + + if not contiguous and data_is_masked: + diffs_along_x, diffs_along_y = diffs + + # Check along both dimensions. + not_masked_at_discontiguity_along_x = np.any( + np.logical_and(mask_invert[:, :-1], diffs_along_x)) + + not_masked_at_discontiguity_along_y = np.any( + np.logical_and(mask_invert[:-1, ], diffs_along_y)) + + not_masked_at_discontiguity = not_masked_at_discontiguity_along_x \ + or not_masked_at_discontiguity_along_y + + # If any discontiguity occurs where the data is not masked the grid will be + # created incorrectly, so raise an error. + if not contiguous: + if not data_is_masked: + raise ValueError('The bounds of the {} coordinate are not ' + 'contiguous. Not able to create a suitable grid ' + 'to plot.'.format(coord.name())) + if not_masked_at_discontiguity: + raise ValueError('The bounds of the {} coordinate are not ' + 'contiguous and data is not masked where the ' + 'discontiguity occurs.
Not able to create a ' + 'suitable grid to plot.'.format( + coord.name())) + + def _draw_2d_from_bounds(draw_method_name, cube, *args, **kwargs): # NB. In the interests of clarity we use "u" and "v" to refer to the # horizontal and vertical axes on the matplotlib plot. @@ -263,10 +357,18 @@ def _draw_2d_from_bounds(draw_method_name, cube, *args, **kwargs): # Get & remove the coords entry from kwargs. coords = kwargs.pop('coords', None) if coords is not None: - plot_defn = _get_plot_defn_custom_coords_picked(cube, coords, mode) + plot_defn = _get_plot_defn_custom_coords_picked(cube, coords, mode, + ndims=2) else: plot_defn = _get_plot_defn(cube, mode, ndims=2) + contig_tol = kwargs.pop('contiguity_tolerance', None) + + for coord in plot_defn.coords: + if hasattr(coord, 'has_bounds') and coord.has_bounds(): + _check_bounds_contiguity_and_mask(coord, data=cube.data, + atol=contig_tol) + if _can_draw_map(plot_defn.coords): result = _map_common(draw_method_name, None, iris.coords.BOUND_MODE, cube, plot_defn, *args, **kwargs) @@ -309,7 +411,15 @@ def _draw_2d_from_bounds(draw_method_name, cube, *args, **kwargs): plot_arrays.append(values) u, v = plot_arrays - u, v = _broadcast_2d(u, v) + + # If the data is transposed, 2D coordinates will also need to be + # transposed. 
+ if plot_defn.transpose is True: + u, v = [coord.T if coord.ndim == 2 else coord for coord in [u, v]] + + if u.ndim == v.ndim == 1: + u, v = _broadcast_2d(u, v) + axes = kwargs.pop('axes', None) draw_method = getattr(axes if axes else plt, draw_method_name) result = draw_method(u, v, data, *args, **kwargs) @@ -434,8 +544,8 @@ def _fixup_dates(coord, values): raise IrisError(msg) r = [nc_time_axis.CalendarDateTime( - cftime.datetime(*date), coord.units.calendar) - for date in dates] + cftime.datetime(*date), coord.units.calendar) + for date in dates] values = np.empty(len(r), dtype=object) values[:] = r return values @@ -675,8 +785,8 @@ def _ensure_cartopy_axes_and_determine_kwargs(x_coord, y_coord, kwargs): axes = kwargs.get('axes') if axes is None: if (isinstance(cs, iris.coord_systems.RotatedGeogCS) and - x_coord.points.max() > 180 and x_coord.points.max() < 360 and - x_coord.points.min() > 0): + x_coord.points.max() > 180 and x_coord.points.max() < 360 and + x_coord.points.min() > 0): # The RotatedGeogCS has 0 - 360 extent, different from the # assumptions made by Cartopy: rebase longitudes for the map axes # to set the datum longitude to the International Date Line. @@ -721,17 +831,22 @@ def _map_common(draw_method_name, arg_func, mode, cube, plot_defn, else: raise ValueError("Expected 1D or 2D XY coords") else: - try: - x, y = _meshgrid(x_coord.contiguous_bounds(), - y_coord.contiguous_bounds()) - # Exception translation. - except iris.exceptions.CoordinateMultiDimError: - raise ValueError("Could not get XY grid from bounds. " - "X or Y coordinate not 1D.") - except ValueError: - raise ValueError("Could not get XY grid from bounds. " - "X or Y coordinate doesn't have 2 bounds " - "per point.") + if not x_coord.ndim == y_coord.ndim == 2: + try: + x, y = _meshgrid(x_coord.contiguous_bounds(), + y_coord.contiguous_bounds()) + # Exception translation. + except iris.exceptions.CoordinateMultiDimError: + raise ValueError("Expected two 1D coords. 
Could not get XY" + " grid from bounds. X or Y coordinate not" + " 1D.") + except ValueError: + raise ValueError("Could not get XY grid from bounds. " + "X or Y coordinate doesn't have 2 bounds " + "per point.") + else: + x = x_coord.contiguous_bounds() + y = y_coord.contiguous_bounds() # Obtain the data array. data = cube.data @@ -1005,7 +1120,10 @@ def outline(cube, coords=None, color='k', linewidth=None, axes=None): def pcolor(cube, *args, **kwargs): """ - Draws a pseudocolor plot based on the given Cube. + Draws a pseudocolor plot based on the given 2-dimensional Cube. + + The cube must have either two 1-dimensional coordinates or two + 2-dimensional coordinates with contiguous bounds to plot the cube against. Kwargs: @@ -1019,6 +1137,9 @@ def pcolor(cube, *args, **kwargs): * axes: the :class:`matplotlib.axes.Axes` to use for drawing. Defaults to the current axes if none provided. + * contiguity_tolerance: The absolute tolerance used when checking for + contiguity between the bounds of the cells. Defaults to None. + See :func:`matplotlib.pyplot.pcolor` for details of other valid keyword arguments. @@ -1031,7 +1152,11 @@ def pcolor(cube, *args, **kwargs): def pcolormesh(cube, *args, **kwargs): """ - Draws a pseudocolor plot based on the given Cube. + Draws a pseudocolor plot based on the given 2-dimensional Cube. + + The cube must have either two 1-dimensional coordinates or two + 2-dimensional coordinates with contiguous bounds to plot against each + other. Kwargs: @@ -1045,6 +1170,9 @@ def pcolormesh(cube, *args, **kwargs): * axes: the :class:`matplotlib.axes.Axes` to use for drawing. Defaults to the current axes if none provided. + * contiguity_tolerance: The absolute tolerance used when checking for + contiguity between the bounds of the cells. Defaults to None. + See :func:`matplotlib.pyplot.pcolormesh` for details of other valid keyword arguments. @@ -1073,6 +1201,7 @@ def points(cube, *args, **kwargs): keyword arguments. 
""" + def _scatter_args(u, v, data, *args, **kwargs): return ((u, v) + args, kwargs) @@ -1080,6 +1209,92 @@ def _scatter_args(u, v, data, *args, **kwargs): *args, **kwargs) +def _vector_component_args(x_points, y_points, u_data, *args, **kwargs): + """ + Callback from _draw_2d_from_points for 'quiver' and 'streamlines'. + + Returns arguments (x, y, u, v), to be passed to the underlying matplotlib + call. + + "u_data" will always be "u_cube.data". + The matching "v_cube.data" component is stored in kwargs['_v_data']. + + """ + v_data = kwargs.pop('_v_data') + + # Rescale u+v values for plot distortion. + crs = kwargs.get('transform', None) + if crs: + if not isinstance(crs, (ccrs.PlateCarree, ccrs.RotatedPole)): + msg = ('Can only plot vectors provided in a lat-lon ' + 'projection, i.e. equivalent to "cartopy.crs.PlateCarree" ' + 'or "cartopy.crs.RotatedPole". This ' + "cube coordinate system translates as Cartopy {}.") + raise ValueError(msg.format(crs)) + # Given the above check, the Y points must be latitudes. + # We therefore **assume** they are in degrees : I'm not sure this + # is wise, but all the rest of this plot code does that, e.g. in + # _map_common. + # TODO: investigate degree units assumptions, here + elsewhere. + + # Implement a latitude scaling, but preserve the given magnitudes. + u_data, v_data = [arr.copy() for arr in (u_data, v_data)] + mags = np.sqrt(u_data * u_data + v_data * v_data) + v_data *= np.cos(np.deg2rad(y_points)) + scales = mags / np.sqrt(u_data * u_data + v_data * v_data) + u_data *= scales + v_data *= scales + + return ((x_points, y_points, u_data, v_data), kwargs) + + +def quiver(u_cube, v_cube, *args, **kwargs): + """ + Draws an arrow plot from two vector component cubes. + + Args: + + * u_cube, v_cube : (:class:`~iris.cube.Cube`) + u and v vector components. Must have same shape and units. + If the cubes have geographic coordinates, the values are treated as + true distance differentials, e.g. 
windspeeds, and *not* map coordinate + vectors. The components are aligned with the North and East of the + cube coordinate system. + + .. note:: + + At present, if u_cube and v_cube have geographic coordinates, then they + must be in a lat-lon coordinate system, though it may be a rotated one. + To transform wind values between coordinate systems, use + :func:`iris.analysis.cartography.rotate_winds`. + To transform coordinate grid points, you will need to create + 2-dimensional arrays of x and y values. These can be transformed with + :meth:`cartopy.crs.CRS.transform_points`. + + Kwargs: + + * coords: (list of :class:`~iris.coords.Coord` or string) + Coordinates or coordinate names. Use the given coordinates as the axes + for the plot. The order of the given coordinates indicates which axis + to use for each, where the first element is the horizontal + axis of the plot and the second element is the vertical axis + of the plot. + + * axes: the :class:`matplotlib.axes.Axes` to use for drawing. + Defaults to the current axes if none provided. + + See :func:`matplotlib.pyplot.quiver` for details of other valid + keyword arguments. + + """ + # + # TODO: check u + v cubes for compatibility. + # + kwargs['_v_data'] = v_cube.data + return _draw_2d_from_points('quiver', _vector_component_args, u_cube, + *args, **kwargs) + + def plot(*args, **kwargs): """ Draws a line plot based on the given cube(s) or coordinate(s).
diff --git a/lib/iris/tests/__init__.py b/lib/iris/tests/__init__.py index 8c64dc15b1..84d4614c0e 100644 --- a/lib/iris/tests/__init__.py +++ b/lib/iris/tests/__init__.py @@ -642,7 +642,7 @@ def assertMaskedArrayAlmostEqual(self, a, b, decimal=6, strict=False): self._assertMaskedArray(np.testing.assert_array_almost_equal, a, b, strict, decimal=decimal) - def assertArrayAllClose(self, a, b, rtol=1.0e-7, atol=0.0, **kwargs): + def assertArrayAllClose(self, a, b, rtol=1.0e-7, atol=1.0e-8, **kwargs): """ Check arrays are equal, within given relative + absolute tolerances. @@ -660,10 +660,35 @@ def assertArrayAllClose(self, a, b, rtol=1.0e-7, atol=0.0, **kwargs): Performs pointwise toleranced comparison, and raises an assertion if the two are not equal 'near enough'. - For full details see underlying routine numpy.testing.assert_allclose. + For full details see underlying routine numpy.allclose. """ - np.testing.assert_allclose(a, b, rtol=rtol, atol=atol, **kwargs) + # Handle the 'err_msg' kwarg, which is the only API difference + # between np.allclose and np.testing.assert_allclose. + msg = kwargs.pop('err_msg', None) + ok = np.allclose(a, b, rtol=rtol, atol=atol, **kwargs) + if not ok: + # Calculate errors above a pointwise tolerance : The method is + # taken from "numpy.core.numeric.isclose". + a, b = np.broadcast_arrays(a, b) + errors = (np.abs(a-b) - (atol + rtol * np.abs(b))) + worst_inds = np.unravel_index(np.argmax(errors.flat), errors.shape) + + if msg is None: + # Build a more useful message than np.testing.assert_allclose.
+ msg = ( + '\nARRAY CHECK FAILED "assertArrayAllClose" :' + '\n with shapes={} {}, atol={}, rtol={}' + '\n worst at element {} : a={} b={}' + '\n absolute error ~{:.3g}, equivalent to rtol ~{:.3e}') + aval, bval = a[worst_inds], b[worst_inds] + absdiff = np.abs(aval - bval) + equiv_rtol = absdiff / bval + msg = msg.format( + a.shape, b.shape, atol, rtol, worst_inds, aval, bval, + absdiff, equiv_rtol) + + raise AssertionError(msg) @contextlib.contextmanager def temp_filename(self, suffix=''): diff --git a/lib/iris/tests/integration/plot/test_plot_2d_coords.py b/lib/iris/tests/integration/plot/test_plot_2d_coords.py new file mode 100644 index 0000000000..5b63b10959 --- /dev/null +++ b/lib/iris/tests/integration/plot/test_plot_2d_coords.py @@ -0,0 +1,67 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +""" +Test plots with two dimensional coordinates. + +""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# import iris tests first so that some things can be initialised before +# importing anything else +import iris.tests as tests + +import cartopy.crs as ccrs +import matplotlib.pyplot as plt + +import iris + +# Run tests in no graphics mode if matplotlib is not available. 
+if tests.MPL_AVAILABLE: + import iris.quickplot as qplt + + +@tests.skip_data +def simple_cube_w_2d_coords(): + path = tests.get_data_path(('NetCDF', 'ORCA2', 'votemper.nc')) + cube = iris.load_cube(path) + return cube + + +@tests.skip_plot +@tests.skip_data +class Test(tests.GraphicsTest): + def test_2d_coord_bounds_platecarree(self): + # To avoid a problem with Cartopy smearing the data where the + # longitude wraps, we set the central_longitude + cube = simple_cube_w_2d_coords()[0, 0] + ax = plt.axes(projection=ccrs.PlateCarree(central_longitude=180)) + qplt.pcolormesh(cube) + ax.coastlines(color='red') + self.check_graphic() + + def test_2d_coord_bounds_northpolarstereo(self): + cube = simple_cube_w_2d_coords()[0, 0] + ax = plt.axes(projection=ccrs.NorthPolarStereo()) + qplt.pcolormesh(cube) + ax.coastlines(color='red') + self.check_graphic() + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/integration/plot/test_vector_plots.py b/lib/iris/tests/integration/plot/test_vector_plots.py new file mode 100644 index 0000000000..00d9d48e8d --- /dev/null +++ b/lib/iris/tests/integration/plot/test_vector_plots.py @@ -0,0 +1,180 @@ +# (C) British Crown Copyright 2014 - 2016, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +""" +Test some key usages of :func:`iris.plot.quiver`. 
+ +""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# import iris tests first so that some things can be initialised before +# importing anything else +import iris.tests as tests + +import numpy as np + +import cartopy.crs as ccrs +from iris.coords import AuxCoord, DimCoord +from iris.coord_systems import Mercator +from iris.cube import Cube +from iris.tests.stock import sample_2d_latlons + +# Run tests in no graphics mode if matplotlib is not available. +if tests.MPL_AVAILABLE: + import matplotlib.pyplot as plt + from iris.plot import quiver + + +@tests.skip_plot +class MixinVectorPlotCases(object): + """ + Test examples mixin, used by separate quiver + streamplot classes. + + NOTE: at present for quiver only, as streamplot does not support arbitrary + coordinates. + + """ + + def plot(self, plotname, *args, **kwargs): + plot_function = self.plot_function_to_test() + plot_function(*args, **kwargs) + plt.suptitle(plotname) + + @staticmethod + def _nonlatlon_xyuv(): + # Create common x, y, u, v arrays for quiver/streamplot testing. + x = np.array([0., 2, 3, 5]) + y = np.array([0., 2.5, 4]) + uv = np.array([[(0., 0), (0, 1), (0, -1), (2, 1)], + [(-1, 0), (-1, -1), (-1, 1), (-2, 1)], + [(1., 0), (1, -1), (1, 1), (-2, 2)]]) + uv = np.array(uv) + u, v = uv[..., 0], uv[..., 1] + return x, y, u, v + + @staticmethod + def _nonlatlon_uv_cubes(x, y, u, v): + # Create u and v test cubes from x, y, u, v arrays. 
+ coord_cls = DimCoord if x.ndim == 1 else AuxCoord + x_coord = coord_cls(x, long_name='x') + y_coord = coord_cls(y, long_name='y') + u_cube = Cube(u, long_name='u', units='ms-1') + if x.ndim == 1: + u_cube.add_dim_coord(y_coord, 0) + u_cube.add_dim_coord(x_coord, 1) + else: + u_cube.add_aux_coord(y_coord, (0, 1)) + u_cube.add_aux_coord(x_coord, (0, 1)) + v_cube = u_cube.copy() + v_cube.rename('v') + v_cube.data = v + return u_cube, v_cube + + def test_non_latlon_1d_coords(self): + # Plot against simple 1D x and y coords. + x, y, u, v = self._nonlatlon_xyuv() + u_cube, v_cube = self._nonlatlon_uv_cubes(x, y, u, v) + self.plot('nonlatlon, 1-d coords', u_cube, v_cube) + plt.xlim(x.min() - 1, x.max() + 2) + plt.ylim(y.min() - 1, y.max() + 2) + self.check_graphic() + + def test_non_latlon_2d_coords(self): + # Plot against expanded 2D x and y coords. + x, y, u, v = self._nonlatlon_xyuv() + x, y = np.meshgrid(x, y) + u_cube, v_cube = self._nonlatlon_uv_cubes(x, y, u, v) + # Call plot : N.B. default gives wrong coords order. + self.plot('nonlatlon_2d', u_cube, v_cube, coords=('x', 'y')) + plt.xlim(x.min() - 1, x.max() + 2) + plt.ylim(y.min() - 1, y.max() + 2) + self.check_graphic() + + @staticmethod + def _latlon_uv_cubes(grid_cube): + # Make a sample grid into u and v data for quiver/streamplot testing. + u_cube = grid_cube.copy() + u_cube.rename('dx') + u_cube.units = 'ms-1' + v_cube = u_cube.copy() + v_cube.rename('dy') + ny, nx = u_cube.shape + nn = nx * ny + angles = np.arange(nn).reshape((ny, nx)) + angles = (angles * 360.0 / 5.5) % 360. + scale = np.arange(nn) % 5 + scale = (scale + 4) / 4 + scale = scale.reshape((ny, nx)) + u_cube.data = scale * np.cos(np.deg2rad(angles)) + v_cube.data = scale * np.sin(np.deg2rad(angles)) + return u_cube, v_cube + + def test_2d_plain_latlon(self): + # Test 2d vector plotting with implicit (PlateCarree) coord system. 
+ u_cube, v_cube = self._latlon_uv_cubes(sample_2d_latlons()) + ax = plt.axes(projection=ccrs.PlateCarree(central_longitude=180)) + self.plot('latlon_2d', u_cube, v_cube, + coords=('longitude', 'latitude')) + ax.coastlines(color='red') + ax.set_global() + self.check_graphic() + + def test_2d_plain_latlon_on_polar_map(self): + # Test 2d vector plotting onto a different projection. + u_cube, v_cube = self._latlon_uv_cubes(sample_2d_latlons()) + ax = plt.axes(projection=ccrs.NorthPolarStereo()) + self.plot('latlon_2d_polar', u_cube, v_cube, + coords=('longitude', 'latitude')) + ax.coastlines(color='red') + self.check_graphic() + + def test_2d_rotated_latlon(self): + # Test plotting vectors in a rotated latlon coord system. + u_cube, v_cube = self._latlon_uv_cubes( + sample_2d_latlons(rotated=True)) + ax = plt.axes(projection=ccrs.PlateCarree(central_longitude=180)) + self.plot('2d_rotated', u_cube, v_cube, + coords=('longitude', 'latitude')) + ax.coastlines(color='red') + ax.set_global() + self.check_graphic() + + def test_fail_unsupported_coord_system(self): + # Test plotting vectors in a rotated latlon coord system. 
+ u_cube, v_cube = self._latlon_uv_cubes(sample_2d_latlons()) + patch_coord_system = Mercator() + for cube in u_cube, v_cube: + for coord in cube.coords(): + coord.coord_system = patch_coord_system + re_msg = ('Can only plot .* lat-lon projection, .* ' + 'This .* translates as Cartopy.*Mercator') + with self.assertRaisesRegexp(ValueError, re_msg): + self.plot('2d_rotated', u_cube, v_cube, + coords=('longitude', 'latitude')) + + +class TestQuiver(MixinVectorPlotCases, tests.GraphicsTest): + def setUp(self): + super(TestQuiver, self).setUp() + + def plot_function_to_test(self): + return quiver + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/integration/test_regridding.py b/lib/iris/tests/integration/test_regridding.py index 47d155f3f2..80878c21b5 100644 --- a/lib/iris/tests/integration/test_regridding.py +++ b/lib/iris/tests/integration/test_regridding.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2013 - 2017, Met Office +# (C) British Crown Copyright 2013 - 2018, Met Office # # This file is part of Iris. 
# diff --git a/lib/iris/tests/results/imagerepo.json b/lib/iris/tests/results/imagerepo.json index 51d37daf3c..ef9bf542de 100644 --- a/lib/iris/tests/results/imagerepo.json +++ b/lib/iris/tests/results/imagerepo.json @@ -160,6 +160,27 @@ "https://scitools.github.io/test-iris-imagehash/images/v4/ea857a81957a857e957ec17e817e6a816a853e817a853e816e818d3a862ad3fe.png", "https://scitools.github.io/test-iris-imagehash/images/v4/be81857ec17e7a81c17e7e81857e3e803e81817a3e81c17e7a81c17ec97e2c2b.png" ], + "iris.tests.integration.plot.test_plot_2d_coords.Test.test_2d_coord_bounds_northpolarstereo.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/e59661969e699659c0f719a6c967339a1992c07f3649c09c3f612669c07b3f66.png" + ], + "iris.tests.integration.plot.test_plot_2d_coords.Test.test_2d_coord_bounds_platecarree.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/ee816299954a1da699b6915ec25b6e419729c42c3f84bd9fe6d262d1d1dac076.png" + ], + "iris.tests.integration.plot.test_vector_plots.TestQuiver.test_2d_plain_latlon.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/fb8d4f21c472b27e919d2e216f216b3178e69c7e961ab39a84696c616d245b94.png" + ], + "iris.tests.integration.plot.test_vector_plots.TestQuiver.test_2d_plain_latlon_on_polar_map.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/e66c6619999933666666c6d99999336663646d9999c1332667b60cf964d8672c.png" + ], + "iris.tests.integration.plot.test_vector_plots.TestQuiver.test_2d_rotated_latlon.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/eba925a5c476d25a95a56b876f3826246a449c6b96a3731ab13f6c656a5cb48a.png" + ], + "iris.tests.integration.plot.test_vector_plots.TestQuiver.test_non_latlon_1d_coords.0": [ + "https://scitools.github.io/test-iris-imagehash/images/v4/a7ac24947259f3493697632df45926b6e126c4f392593b4937266f26ccf032d8.png" + ], + "iris.tests.integration.plot.test_vector_plots.TestQuiver.test_non_latlon_2d_coords.0": [ + 
"https://scitools.github.io/test-iris-imagehash/images/v4/afac26367251d3493617632df45c26a6e126c6f392593b4937266f26ccf232d0.png" + ], "iris.tests.test_analysis.TestProject.test_cartopy_projection.0": [ "https://scitools.github.io/test-iris-imagehash/images/v4/9e1952c9c165b4fc668a9d47c1461d7a60fb2e853eb426bd62fd229c9f04c16d.png" ], diff --git a/lib/iris/tests/results/pandas/as_cube/data_frame_multidim.cml b/lib/iris/tests/results/pandas/as_cube/data_frame_multidim.cml new file mode 100644 index 0000000000..d377fa72d8 --- /dev/null +++ b/lib/iris/tests/results/pandas/as_cube/data_frame_multidim.cml @@ -0,0 +1,15 @@ + + + + + + + + + + + + + + + diff --git a/lib/iris/tests/stock.py b/lib/iris/tests/stock/__init__.py similarity index 74% rename from lib/iris/tests/stock.py rename to lib/iris/tests/stock/__init__.py index 25df8eda33..5965d5a208 100644 --- a/lib/iris/tests/stock.py +++ b/lib/iris/tests/stock/__init__.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2010 - 2017, Met Office +# (C) British Crown Copyright 2010 - 2018, Met Office # # This file is part of Iris. 
# @@ -36,7 +36,8 @@ from iris.coords import DimCoord, AuxCoord import iris.tests as tests from iris.coord_systems import GeogCS, RotatedGeogCS - +from ._stock_2d_latlons import (sample_2d_latlons, + make_bounds_discontiguous_at_point) def lat_lon_cube(): """ @@ -46,12 +47,12 @@ def lat_lon_cube(): """ cube = Cube(np.arange(12, dtype=np.int32).reshape((3, 4))) cs = GeogCS(6371229) - coord = iris.coords.DimCoord(points=np.array([-1, 0, 1], dtype=np.int32), + coord = DimCoord(points=np.array([-1, 0, 1], dtype=np.int32), standard_name='latitude', units='degrees', coord_system=cs) cube.add_dim_coord(coord, 0) - coord = iris.coords.DimCoord(points=np.array([-1, 0, 1, 2], dtype=np.int32), + coord = DimCoord(points=np.array([-1, 0, 1, 2], dtype=np.int32), standard_name='longitude', units='degrees', coord_system=cs) @@ -99,7 +100,7 @@ def simple_1d(with_bounds=True): cube.units = '1' points = np.arange(11, dtype=np.int32) + 1 bounds = np.column_stack([np.arange(11, dtype=np.int32), np.arange(11, dtype=np.int32) + 1]) - coord = iris.coords.DimCoord(points, long_name='foo', units='1', bounds=bounds) + coord = DimCoord(points, long_name='foo', units='1', bounds=bounds) cube.add_dim_coord(coord, 0) return cube @@ -124,12 +125,15 @@ def simple_2d(with_bounds=True): cube = Cube(np.arange(12, dtype=np.int32).reshape((3, 4))) cube.long_name = 'thingness' cube.units = '1' - y_points = np.array([ 2.5, 7.5, 12.5]) + y_points = np.array([2.5, 7.5, 12.5]) y_bounds = np.array([[0, 5], [5, 10], [10, 15]], dtype=np.int32) - y_coord = iris.coords.DimCoord(y_points, long_name='bar', units='1', bounds=y_bounds if with_bounds else None) + y_coord = DimCoord(y_points, long_name='bar', units='1', + bounds=y_bounds if with_bounds else None) x_points = np.array([ -7.5, 7.5, 22.5, 37.5]) - x_bounds = np.array([[-15, 0], [0, 15], [15, 30], [30, 45]], dtype=np.int32) - x_coord = iris.coords.DimCoord(x_points, long_name='foo', units='1', bounds=x_bounds if with_bounds else None) + x_bounds = 
np.array([[-15, 0], [0, 15], [15, 30], [30, 45]], + dtype=np.int32) + x_coord = DimCoord(x_points, long_name='foo', units='1', + bounds=x_bounds if with_bounds else None) cube.add_dim_coord(y_coord, 0) cube.add_dim_coord(x_coord, 1) @@ -191,9 +195,8 @@ def simple_3d_w_multidim_coords(with_bounds=True): [[5, 15], [15, 20], [20, 35], [35, 50]], [[10, 20], [20, 25], [25, 40], [40, 60]]], dtype=np.int32) - y_coord = iris.coords.AuxCoord(points=y_points, long_name='bar', - units='1', - bounds=y_bounds if with_bounds else None) + y_coord = AuxCoord(points=y_points, long_name='bar', units='1', + bounds=y_bounds if with_bounds else None) x_points = np.array([[-7.5, 7.5, 22.5, 37.5], [-12.5, 4., 26.5, 47.5], [2.5, 14., 36.5, 44.]]) @@ -201,12 +204,10 @@ def simple_3d_w_multidim_coords(with_bounds=True): [[-25, 0], [0, 8], [8, 45], [45, 50]], [[-5, 10], [10, 18], [18, 55], [18, 70]]], dtype=np.int32) - x_coord = iris.coords.AuxCoord(points=x_points, long_name='foo', - units='1', - bounds=x_bounds if with_bounds else None) - wibble_coord = iris.coords.DimCoord(np.array([10., 30.], - dtype=np.float32), - long_name='wibble', units='1') + x_coord = AuxCoord(points=x_points, long_name='foo', units='1', + bounds=x_bounds if with_bounds else None) + wibble_coord = DimCoord(np.array([10., 30.], dtype=np.float32), + long_name='wibble', units='1') cube.add_dim_coord(wibble_coord, [0]) cube.add_aux_coord(y_coord, [1, 2]) @@ -238,13 +239,13 @@ def simple_3d(): cube = Cube(np.arange(24, dtype=np.int32).reshape((2, 3, 4))) cube.long_name = 'thingness' cube.units = '1' - wibble_coord = iris.coords.DimCoord(np.array([10., 30.], + wibble_coord = DimCoord(np.array([10., 30.], dtype=np.float32), long_name='wibble', units='1') - lon = iris.coords.DimCoord([-180, -90, 0, 90], + lon = DimCoord([-180, -90, 0, 90], standard_name='longitude', units='degrees', circular=True) - lat = iris.coords.DimCoord([90, 0, -90], + lat = DimCoord([90, 0, -90], standard_name='latitude', units='degrees') 
cube.add_dim_coord(wibble_coord, [0]) cube.add_dim_coord(lat, [1]) @@ -294,39 +295,46 @@ def track_1d(duplicate_x=False): array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]) """ - cube = Cube(np.arange(11, dtype=np.int32), standard_name='air_temperature', units='K') - bounds = np.column_stack([np.arange(11, dtype=np.int32), np.arange(11, dtype=np.int32) + 1]) + cube = Cube(np.arange(11, dtype=np.int32), standard_name='air_temperature', + units='K') + bounds = np.column_stack([np.arange(11, dtype=np.int32), + np.arange(11, dtype=np.int32) + 1]) pts = bounds[:, 1] - coord = iris.coords.AuxCoord(pts, 'projection_x_coordinate', units='1', bounds=bounds) + coord = AuxCoord(pts, 'projection_x_coordinate', units='1', bounds=bounds) cube.add_aux_coord(coord, [0]) if duplicate_x: - coord = iris.coords.AuxCoord(pts, 'projection_x_coordinate', units='1', bounds=bounds) + coord = AuxCoord(pts, 'projection_x_coordinate', units='1', + bounds=bounds) cube.add_aux_coord(coord, [0]) - coord = iris.coords.AuxCoord(pts * 2, 'projection_y_coordinate', units='1', bounds=bounds * 2) + coord = AuxCoord(pts * 2, 'projection_y_coordinate', units='1', + bounds=bounds * 2) cube.add_aux_coord(coord, 0) return cube def simple_2d_w_multidim_and_scalars(): data = np.arange(50, dtype=np.int32).reshape((5, 10)) - cube = iris.cube.Cube(data, long_name='test 2d dimensional cube', units='meters') + cube = iris.cube.Cube(data, long_name='test 2d dimensional cube', + units='meters') # DimCoords - dim1 = iris.coords.DimCoord(np.arange(5, dtype=np.float32) * 5.1 + 3.0, long_name='dim1', units='meters') - dim2 = iris.coords.DimCoord(np.arange(10, dtype=np.int32), long_name='dim2', units='meters', - bounds=np.arange(20, dtype=np.int32).reshape(10, 2)) + dim1 = DimCoord(np.arange(5, dtype=np.float32) * 5.1 + 3.0, + long_name='dim1', units='meters') + dim2 = DimCoord(np.arange(10, dtype=np.int32), + long_name='dim2', units='meters', + bounds=np.arange(20, dtype=np.int32).reshape(10, 2)) # Scalars - an_other = 
iris.coords.AuxCoord(3.0, long_name='an_other', units='meters') - yet_an_other = iris.coords.DimCoord(23.3, standard_name='air_temperature', - long_name='custom long name', - var_name='custom_var_name', - units='K') + an_other = AuxCoord(3.0, long_name='an_other', units='meters') + yet_an_other = DimCoord(23.3, standard_name='air_temperature', + long_name='custom long name', + var_name='custom_var_name', units='K') # Multidim - my_multi_dim_coord = iris.coords.AuxCoord(np.arange(50, dtype=np.int32).reshape(5, 10), - long_name='my_multi_dim_coord', units='1', - bounds=np.arange(200, dtype=np.int32).reshape(5, 10, 4)) + my_multi_dim_coord = AuxCoord(np.arange(50, dtype=np.int32).reshape(5, 10), + long_name='my_multi_dim_coord', units='1', + bounds=np.arange(200, dtype=np.int32). + reshape(5, 10, 4)) cube.add_dim_coord(dim1, 0) cube.add_dim_coord(dim2, 1) @@ -361,18 +369,21 @@ def hybrid_height(): """ data = np.arange(12, dtype='i8').reshape((3, 4)) - orography = icoords.AuxCoord([10, 25, 50, 5], standard_name='surface_altitude', units='m') - model_level = icoords.AuxCoord([2, 1, 0], standard_name='model_level_number') - level_height = icoords.DimCoord([100, 50, 10], long_name='level_height', - units='m', attributes={'positive': 'up'}, - bounds=[[150, 75], [75, 20], [20, 0]]) - sigma = icoords.AuxCoord([0.8, 0.9, 0.95], long_name='sigma', - bounds=[[0.7, 0.85], [0.85, 0.97], [0.97, 1.0]]) - hybrid_height = iris.aux_factory.HybridHeightFactory(level_height, sigma, orography) + orography = AuxCoord([10, 25, 50, 5], standard_name='surface_altitude', + units='m') + model_level = AuxCoord([2, 1, 0], standard_name='model_level_number') + level_height = DimCoord([100, 50, 10], long_name='level_height', + units='m', attributes={'positive': 'up'}, + bounds=[[150, 75], [75, 20], [20, 0]]) + sigma = AuxCoord([0.8, 0.9, 0.95], long_name='sigma', + bounds=[[0.7, 0.85], [0.85, 0.97], [0.97, 1.0]]) + hybrid_height = iris.aux_factory.HybridHeightFactory(level_height, + sigma, 
orography) cube = iris.cube.Cube(data, standard_name='air_temperature', units='K', dim_coords_and_dims=[(level_height, 0)], - aux_coords_and_dims=[(orography, 1), (model_level, 0), (sigma, 0)], + aux_coords_and_dims=[(orography, 1), + (model_level, 0), (sigma, 0)], aux_factories=[hybrid_height]) return cube @@ -381,25 +392,25 @@ def simple_4d_with_hybrid_height(): cube = iris.cube.Cube(np.arange(3*4*5*6, dtype='i8').reshape(3, 4, 5, 6), "air_temperature", units="K") - cube.add_dim_coord(iris.coords.DimCoord(np.arange(3, dtype='i8'), "time", - units="hours since epoch"), 0) - cube.add_dim_coord(iris.coords.DimCoord(np.arange(4, dtype='i8')+10, - "model_level_number", units="1"), 1) - cube.add_dim_coord(iris.coords.DimCoord(np.arange(5, dtype='i8')+20, - "grid_latitude", - units="degrees"), 2) - cube.add_dim_coord(iris.coords.DimCoord(np.arange(6, dtype='i8')+30, - "grid_longitude", - units="degrees"), 3) - - cube.add_aux_coord(iris.coords.AuxCoord(np.arange(4, dtype='i8')+40, - long_name="level_height", - units="m"), 1) - cube.add_aux_coord(iris.coords.AuxCoord(np.arange(4, dtype='i8')+50, - long_name="sigma", units="1"), 1) - cube.add_aux_coord(iris.coords.AuxCoord(np.arange(5*6, dtype='i8').reshape(5, 6)+100, - long_name="surface_altitude", - units="m"), [2, 3]) + cube.add_dim_coord(DimCoord(np.arange(3, dtype='i8'), "time", + units="hours since epoch"), 0) + cube.add_dim_coord(DimCoord(np.arange(4, dtype='i8')+10, + "model_level_number", units="1"), 1) + cube.add_dim_coord(DimCoord(np.arange(5, dtype='i8')+20, + "grid_latitude", + units="degrees"), 2) + cube.add_dim_coord(DimCoord(np.arange(6, dtype='i8')+30, + "grid_longitude", + units="degrees"), 3) + + cube.add_aux_coord(AuxCoord(np.arange(4, dtype='i8')+40, + long_name="level_height", + units="m"), 1) + cube.add_aux_coord(AuxCoord(np.arange(4, dtype='i8')+50, + long_name="sigma", units="1"), 1) + cube.add_aux_coord(AuxCoord(np.arange(5*6, dtype='i8').reshape(5, 6)+100, + long_name="surface_altitude", + 
units="m"), [2, 3]) cube.add_aux_factory(iris.aux_factory.HybridHeightFactory( delta=cube.coord("level_height"), @@ -449,7 +460,8 @@ def realistic_4d(): Returns a realistic 4d cube. >>> print(repr(realistic_4d())) - + """ # the stock arrays were created in Iris 0.8 with: @@ -477,7 +489,8 @@ def realistic_4d(): if not os.path.isfile(data_path): raise IOError('Test data is not available at {}.'.format(data_path)) r = np.load(data_path) - # sort the arrays based on the order they were originally given. The names given are of the form 'arr_1' or 'arr_10' + # sort the arrays based on the order they were originally given. + # The names given are of the form 'arr_1' or 'arr_10' _, arrays = zip(*sorted(six.iteritems(r), key=lambda item: int(item[0][4:]))) lat_pts, lat_bnds, lon_pts, lon_bnds, level_height_pts, \ @@ -486,25 +499,37 @@ def realistic_4d(): ll_cs = RotatedGeogCS(37.5, 177.5, ellipsoid=GeogCS(6371229.0)) - lat = icoords.DimCoord(lat_pts, standard_name='grid_latitude', units='degrees', - bounds=lat_bnds, coord_system=ll_cs) - lon = icoords.DimCoord(lon_pts, standard_name='grid_longitude', units='degrees', - bounds=lon_bnds, coord_system=ll_cs) + lat = icoords.DimCoord(lat_pts, standard_name='grid_latitude', + units='degrees', bounds=lat_bnds, + coord_system=ll_cs) + lon = icoords.DimCoord(lon_pts, standard_name='grid_longitude', + units='degrees', bounds=lon_bnds, + coord_system=ll_cs) level_height = icoords.DimCoord(level_height_pts, long_name='level_height', units='m', bounds=level_height_bnds, attributes={'positive': 'up'}) - model_level = icoords.DimCoord(model_level_pts, standard_name='model_level_number', + model_level = icoords.DimCoord(model_level_pts, + standard_name='model_level_number', units='1', attributes={'positive': 'up'}) - sigma = icoords.AuxCoord(sigma_pts, long_name='sigma', units='1', bounds=sigma_bnds) - orography = icoords.AuxCoord(orography, standard_name='surface_altitude', units='m') - time = icoords.DimCoord(time_pts, 
standard_name='time', units='hours since 1970-01-01 00:00:00') - forecast_period = icoords.DimCoord(forecast_period_pts, standard_name='forecast_period', units='hours') + sigma = icoords.AuxCoord(sigma_pts, long_name='sigma', units='1', + bounds=sigma_bnds) + orography = icoords.AuxCoord(orography, standard_name='surface_altitude', + units='m') + time = icoords.DimCoord(time_pts, standard_name='time', + units='hours since 1970-01-01 00:00:00') + forecast_period = icoords.DimCoord(forecast_period_pts, + standard_name='forecast_period', + units='hours') - hybrid_height = iris.aux_factory.HybridHeightFactory(level_height, sigma, orography) + hybrid_height = iris.aux_factory.HybridHeightFactory(level_height, + sigma, orography) - cube = iris.cube.Cube(data, standard_name='air_potential_temperature', units='K', - dim_coords_and_dims=[(time, 0), (model_level, 1), (lat, 2), (lon, 3)], - aux_coords_and_dims=[(orography, (2, 3)), (level_height, 1), (sigma, 1), + cube = iris.cube.Cube(data, standard_name='air_potential_temperature', + units='K', + dim_coords_and_dims=[(time, 0), (model_level, 1), + (lat, 2), (lon, 3)], + aux_coords_and_dims=[(orography, (2, 3)), + (level_height, 1), (sigma, 1), (forecast_period, None)], attributes={'source': 'Iris test case'}, aux_factories=[hybrid_height]) @@ -516,7 +541,8 @@ def realistic_4d_no_derived(): Returns a realistic 4d cube without hybrid height >>> print(repr(realistic_4d())) - + """ cube = realistic_4d() @@ -534,24 +560,30 @@ def realistic_4d_w_missing_data(): data_archive = np.load(data_path) data = ma.masked_array(data_archive['arr_0'], mask=data_archive['arr_1']) - # sort the arrays based on the order they were originally given. The names given are of the form 'arr_1' or 'arr_10' + # sort the arrays based on the order they were originally given. 
+ # The names given are of the form 'arr_1' or 'arr_10' ll_cs = GeogCS(6371229) - lat = iris.coords.DimCoord(np.arange(20, dtype=np.float32), standard_name='grid_latitude', - units='degrees', coord_system=ll_cs) - lon = iris.coords.DimCoord(np.arange(20, dtype=np.float32), standard_name='grid_longitude', - units='degrees', coord_system=ll_cs) - time = iris.coords.DimCoord([1000., 1003., 1006.], standard_name='time', - units='hours since 1970-01-01 00:00:00') - forecast_period = iris.coords.DimCoord([0.0, 3.0, 6.0], standard_name='forecast_period', units='hours') - pressure = iris.coords.DimCoord(np.array([ 800., 900., 1000.], dtype=np.float32), - long_name='pressure', units='hPa') + lat = DimCoord(np.arange(20, dtype=np.float32), + standard_name='grid_latitude', + units='degrees', coord_system=ll_cs) + lon = DimCoord(np.arange(20, dtype=np.float32), + standard_name='grid_longitude', + units='degrees', coord_system=ll_cs) + time = DimCoord([1000., 1003., 1006.], standard_name='time', + units='hours since 1970-01-01 00:00:00') + forecast_period = DimCoord([0.0, 3.0, 6.0], + standard_name='forecast_period', + units='hours') + pressure = DimCoord(np.array([800., 900., 1000.], dtype=np.float32), + long_name='pressure', units='hPa') cube = iris.cube.Cube(data, long_name='missing data test data', units='K', - dim_coords_and_dims=[(time, 0), (pressure, 1), (lat, 2), (lon, 3)], + dim_coords_and_dims=[(time, 0), (pressure, 1), + (lat, 2), (lon, 3)], aux_coords_and_dims=[(forecast_period, 0)], - attributes={'source':'Iris test case'}) + attributes={'source': 'Iris test case'}) return cube diff --git a/lib/iris/tests/stock/_stock_2d_latlons.py b/lib/iris/tests/stock/_stock_2d_latlons.py new file mode 100644 index 0000000000..395a50d8ea --- /dev/null +++ b/lib/iris/tests/stock/_stock_2d_latlons.py @@ -0,0 +1,350 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. 
+#
+# Iris is free software: you can redistribute it and/or modify it under
+# the terms of the GNU Lesser General Public License as published by the
+# Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# Iris is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU Lesser General Public License for more details.
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with Iris. If not, see <http://www.gnu.org/licenses/>.
+"""
+Extra stock routines for making and manipulating cubes with 2d coordinates,
+to mimic ocean grid data.
+
+"""
+from __future__ import (absolute_import, division, print_function)
+from six.moves import (filter, input, map, range, zip)  # noqa
+
+import numpy as np
+import numpy.ma as ma
+
+from iris.analysis.cartography import unrotate_pole
+from iris.cube import Cube
+from iris.coords import AuxCoord, DimCoord
+from iris.coord_systems import RotatedGeogCS
+
+
+def expand_1d_x_and_y_bounds_to_2d_xy(x_bounds_1d, y_bounds_1d):
+    """
+    Convert bounds for separate 1-D X and Y coords into bounds for the
+    equivalent 2D coordinates.
+
+    The output arrays have 4 points per cell, for the 4 'corners' of a
+    gridcell, in the usual anticlockwise order
+    (bottom-left, bottom-right, top-right, top-left).
+
+    If 1-dimensional X and Y coords have shapes nx and ny, then their bounds
+    have shapes (nx, 2) and (ny, 2).
+    The equivalent 2d coordinates would have values which are a "meshgrid" of
+    the original 1-D points, both having the shape (ny, nx).
+    The outputs are 2d bounds arrays suitable for such 2d coordinates.
+
+    Args:
+
+    * x_bounds_1d, y_bounds_1d : (array)
+        Coordinate bounds arrays, with shapes (nx, 2) and (ny, 2).
+
+    Result:
+
+    * x_bounds_2d, y_bounds_2d : (array)
+        Expanded 2d bounds arrays, both of shape (ny, nx, 4).
+
+    """
+    shapes = [bds.shape for bds in (x_bounds_1d, y_bounds_1d)]
+    for shape in shapes:
+        if len(shape) != 2 or shape[1] != 2:
+            msg = ('One-dimensional bounds arrays must have shapes (nx, 2) '
+                   'and (ny, 2). Got {} and {}.')
+            raise ValueError(msg.format(*shapes))
+
+    # Construct output arrays, which are both (ny, nx, 4).
+    nx, ny = [shape[0] for shape in shapes]
+    bds_2d_x = np.zeros((ny, nx, 4))
+    bds_2d_y = bds_2d_x.copy()
+
+    # Expand x bounds to 2D array (ny, nx, 4) : the same over all 'Y'.
+    # Bottom left+right corners are the original 1-D x bounds.
+    bds_2d_x[:, :, 0] = x_bounds_1d[:, 0].reshape((1, nx))
+    bds_2d_x[:, :, 1] = x_bounds_1d[:, 1].reshape((1, nx))
+    # Top left+right corners are the same as bottom left+right.
+    bds_2d_x[:, :, 2] = bds_2d_x[:, :, 1].copy()
+    bds_2d_x[:, :, 3] = bds_2d_x[:, :, 0].copy()
+
+    # Expand y bounds to 2D array (ny, nx, 4) : the same over all 'X'.
+    # Left-hand lower+upper corners are the original 1-D y bounds.
+    bds_2d_y[:, :, 0] = y_bounds_1d[:, 0].reshape((ny, 1))
+    bds_2d_y[:, :, 3] = y_bounds_1d[:, 1].reshape((ny, 1))
+    # Right-hand lower+upper corners are the same as the left-hand ones.
+    bds_2d_y[:, :, 1] = bds_2d_y[:, :, 0].copy()
+    bds_2d_y[:, :, 2] = bds_2d_y[:, :, 3].copy()
+
+    return bds_2d_x, bds_2d_y
+
+
+def grid_coords_2d_from_1d(x_coord_1d, y_coord_1d):
+    """
+    Calculate a pair of 2d X+Y coordinates from 1d ones, in a "meshgrid"
+    style.
+    If the inputs are bounded, the outputs have 4 bounds per point in the
+    usual way, i.e. points 0,1,2,3 are the gridcell corners anticlockwise
+    from the bottom left.
+
+    """
+    for coord in (x_coord_1d, y_coord_1d):
+        if coord.ndim != 1:
+            msg = ('Input coords must be one-dimensional. '
+                   'Coordinate "{}" has shape {}.')
+            raise ValueError(msg.format(coord.name(), coord.shape))
+
+    # Calculate centre-points as a mesh of the 2 inputs.
+    pts_2d_x, pts_2d_y = np.meshgrid(x_coord_1d.points, y_coord_1d.points)
+    if not x_coord_1d.has_bounds() or not y_coord_1d.has_bounds():
+        bds_2d_x = None
+        bds_2d_y = None
+    else:
+        bds_2d_x, bds_2d_y = expand_1d_x_and_y_bounds_to_2d_xy(
+            x_coord_1d.bounds, y_coord_1d.bounds)
+
+    # Make two new coords + return them.
+    result = []
+    for pts, bds, name in zip((pts_2d_x, pts_2d_y),
+                              (bds_2d_x, bds_2d_y),
+                              ('longitude', 'latitude')):
+        coord = AuxCoord(pts, bounds=bds, standard_name=name, units='degrees')
+        result.append(coord)
+
+    return result
+
+
+def sample_2d_latlons(regional=False, rotated=False, transformed=False):
+    """
+    Construct small 2d cubes with 2d X and Y coordinates.
+
+    This makes cubes with 'expanded' coordinates (4 bounds per cell),
+    analogous to ORCA data.
+    The coordinates are always geographical, so either they have a coord
+    system or they are "true" lats + lons.
+    ( At present, they are always latitudes and longitudes, but maybe in a
+    rotated system. )
+    The results always have fully contiguous bounds.
+
+    Kwargs:
+    * regional (bool):
+        If False (default), results cover the whole globe, and there is
+        implicit connectivity between rhs + lhs of the array.
+        If True, coverage is regional and edges do not connect.
+    * rotated (bool):
+        If False, X and Y coordinates are true latitudes and longitudes, with
+        an implicit coordinate system (i.e. None).
+        If True, the X and Y coordinates are lats + lons in a selected
+        rotated-latlon coordinate system.
+    * transformed (bool):
+        Build coords from rotated coords as for 'rotated', but then replace
+        their values with the equivalent "true" lats + lons, and no
+        coord-system (defaults to true-latlon).
+        In this case, the X and Y coords are no longer 'meshgrid' style,
+        i.e. the points + bounds values vary in *both* dimensions.
+
+    .. note::
+
+        'transformed' is an alternative to 'rotated' : when 'transformed' is
+        set, then 'rotated' has no effect.
+
+    ..
Some sample results printouts :: + + >>> print(sample_2d_latlons()) + test_data / (unknown) (-- : 5; -- : 6) + Auxiliary coordinates: + latitude x x + longitude x x + >>> + >>> print(sample_2d_latlons().coord(axis='x')[0, :2]) + AuxCoord(array([ 37.5 , 93.75]), + bounds=array([[ 0. , 65.625, 65.625, 0. ], + [ 65.625, 121.875, 121.875, 65.625]]), + standard_name='longitude', units=Unit('degrees')) + >>> print(np.round(sample_2d_latlons().coord(axis='x').points, 3)) + [[ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75]] + >>> print(np.round(sample_2d_latlons().coord(axis='y').points, 3)) + [[-85. -85. -85. -85. -85. -85. ] + [-47.5 -47.5 -47.5 -47.5 -47.5 -47.5] + [-10. -10. -10. -10. -10. -10. ] + [ 27.5 27.5 27.5 27.5 27.5 27.5] + [ 65. 65. 65. 65. 65. 65. ]] + + + >>> print(np.round( + sample_2d_latlons(rotated=True).coord(axis='x').points, 3)) + [[ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 206.25 262.5 318.75] + [ 37.5 93.75 150. 
206.25 262.5 318.75]] + >>> print(sample_2d_latlons(rotated=True).coord(axis='y').coord_system) + RotatedGeogCS(75.0, 120.0) + + + >>> print( + sample_2d_latlons(transformed=True).coord(axis='y').coord_system) + None + >>> print(np.round( + sample_2d_latlons(transformed=True).coord(axis='x').points, 3)) + [[ -50.718 -40.983 -46.74 -71.938 -79.293 -70.146] + [ -29.867 17.606 77.936 157.145 -141.037 -93.172] + [ -23.139 31.007 87.699 148.322 -154.639 -100.505] + [ -16.054 41.218 92.761 143.837 -164.738 -108.105] + [ 10.86 61.78 100.236 137.285 175.511 -135.446]] + >>> print(np.round( + sample_2d_latlons(transformed=True).coord(axis='y').points, 3)) + [[-70.796 -74.52 -79.048 -79.26 -74.839 -70.96 ] + [-34.99 -46.352 -59.721 -60.34 -47.305 -35.499] + [ 1.976 -10.626 -22.859 -23.349 -11.595 1.37 ] + [ 38.914 25.531 14.312 13.893 24.585 38.215] + [ 74.197 60.258 51.325 51.016 59.446 73.268]] + >>> + + """ + def sample_cube(xargs, yargs): + # Make a test cube with given latitude + longitude coordinates. + # xargs/yargs are args for np.linspace (start, stop, N), to make the X + # and Y coordinate points. + x0, x1, nx = xargs + y0, y1, ny = yargs + # Data has cycling values, staggered a bit in successive rows. + data = np.zeros((ny, nx)) + data.flat[:] = np.arange(ny * nx) % (nx + 2) + # Build a 2d cube with longitude + latitude coordinates. + cube = Cube(data, long_name='test_data') + x_pts = np.linspace(x0, x1, nx, endpoint=True) + y_pts = np.linspace(y0, y1, ny, endpoint=True) + co_x = DimCoord(x_pts, standard_name='longitude', units='degrees') + co_y = DimCoord(y_pts, standard_name='latitude', units='degrees') + cube.add_dim_coord(co_y, 0) + cube.add_dim_coord(co_x, 1) + return cube + + # Start by making a "normal" cube with separate 1-D X and Y coords. + if regional: + # Make a small regional cube. + cube = sample_cube(xargs=(150., 243.75, 6), yargs=(-10., 40., 5)) + + # Add contiguous bounds. 
+ for ax in ('x', 'y'): + cube.coord(axis=ax).guess_bounds() + else: + # Global data, but at a drastically reduced resolution. + cube = sample_cube(xargs=(37.5, 318.75, 6), yargs=(-85., 65., 5)) + + # Make contiguous bounds and adjust outer edges to ensure it is global. + for name in ('longitude', 'latitude'): + coord = cube.coord(name) + coord.guess_bounds() + bds = coord.bounds.copy() + # Make bounds global, by fixing lowest and uppermost values. + if name == 'longitude': + bds[0, 0] = 0.0 + bds[-1, 1] = 360.0 + else: + bds[0, 0] = -90.0 + bds[-1, 1] = 90.0 + coord.bounds = bds + + # Now convert the 1-d coords to 2-d equivalents. + # Get original 1-d coords. + co_1d_x, co_1d_y = [cube.coord(axis=ax).copy() for ax in ('x', 'y')] + + # Calculate 2-d equivalents. + co_2d_x, co_2d_y = grid_coords_2d_from_1d(co_1d_x, co_1d_y) + + # Remove the old grid coords. + for coord in (co_1d_x, co_1d_y): + cube.remove_coord(coord) + + # Add the new grid coords. + for coord in (co_2d_x, co_2d_y): + cube.add_aux_coord(coord, (0, 1)) + + if transformed or rotated: + # Put the cube locations into a rotated coord system. + pole_lat, pole_lon = 75.0, 120.0 + if transformed: + # Reproject coordinate values from rotated to true lat-lons. + co_x, co_y = [cube.coord(axis=ax) for ax in ('x', 'y')] + # Unrotate points. + lons, lats = co_x.points, co_y.points + lons, lats = unrotate_pole(lons, lats, pole_lon, pole_lat) + co_x.points, co_y.points = lons, lats + # Unrotate bounds. + lons, lats = co_x.bounds, co_y.bounds + # Note: save the shape, flatten + then re-apply the shape, because + # "unrotate_pole" uses "cartopy.crs.CRS.transform_points", which + # only works on arrays of 1 or 2 dimensions. + shape = lons.shape + lons, lats = unrotate_pole(lons.flatten(), lats.flatten(), + pole_lon, pole_lat) + co_x.bounds, co_y.bounds = lons.reshape(shape), lats.reshape(shape) + else: + # "Just" rotate operation : add a coord-system to each coord. 
+        cs = RotatedGeogCS(pole_lat, pole_lon)
+        for coord in cube.coords():
+            coord.coord_system = cs
+
+    return cube
+
+
+def make_bounds_discontiguous_at_point(cube, at_iy, at_ix):
+    """
+    Meddle with the XY grid bounds of a cube to make the grid discontiguous.
+
+    Changes the points and bounds of a single gridcell, so that it becomes
+    discontinuous with the next gridcell to its right.
+    Also masks the cube data at the given point.
+
+    The cube must have bounded 2d 'x' and 'y' coordinates.
+
+    TODO: add a switch to make a discontiguity in the *y*-direction instead ?
+
+    """
+    x_coord = cube.coord(axis='x')
+    y_coord = cube.coord(axis='y')
+    assert x_coord.shape == y_coord.shape
+    # N.B. use all() here : a bare generator expression is always truthy,
+    # so asserting it directly would never fail.
+    assert all(coord.bounds.ndim == 3 and coord.bounds.shape[-1] == 4
+               for coord in (x_coord, y_coord))
+
+    # For both X and Y coord, move points + bounds to create a discontinuity.
+    def adjust_coord(coord):
+        pts, bds = coord.points, coord.bounds
+        # Fetch the 4 bounds (bottom-left, bottom-right, top-right, top-left)
+        bds_bl, bds_br, bds_tr, bds_tl = bds[at_iy, at_ix]
+        # Make a discontinuity "at" (iy, ix), by moving the right-hand edge of
+        # the cell to the midpoint of the existing left+right bounds.
+        new_bds_br = 0.5 * (bds_bl + bds_br)
+        new_bds_tr = 0.5 * (bds_tl + bds_tr)
+        bds_br, bds_tr = new_bds_br, new_bds_tr
+        bds[at_iy, at_ix] = [bds_bl, bds_br, bds_tr, bds_tl]
+        # Also reset the cell midpoint to the middle of the 4 new corners,
+        # in case having a midpoint outside the corners might cause a problem.
+        new_pt = 0.25 * sum([bds_bl, bds_br, bds_tr, bds_tl])
+        pts[at_iy, at_ix] = new_pt
+        # Write back the coord points + bounds (can only assign whole arrays).
+        coord.points, coord.bounds = pts, bds
+
+    adjust_coord(x_coord)
+    adjust_coord(y_coord)
+    # Also mask the relevant data point.
+    data = cube.data  # N.B. fetch all the data.
+    if not ma.isMaskedArray(data):
+        # Promote to masked array, to avoid converting mask to NaN.
+ data = ma.masked_array(data) + data[at_iy, at_ix] = ma.masked + cube.data = data diff --git a/lib/iris/tests/test_coding_standards.py b/lib/iris/tests/test_coding_standards.py index 39c73e8285..c24ac46cb3 100644 --- a/lib/iris/tests/test_coding_standards.py +++ b/lib/iris/tests/test_coding_standards.py @@ -93,7 +93,7 @@ class StandardReportWithExclusions(pep8.StandardReport): '*/iris/io/format_picker.py', '*/iris/tests/__init__.py', '*/iris/tests/pp.py', - '*/iris/tests/stock.py', + '*/iris/tests/stock/__init__.py', '*/iris/tests/system_test.py', '*/iris/tests/test_analysis.py', '*/iris/tests/test_analysis_calculus.py', diff --git a/lib/iris/tests/test_coordsystem.py b/lib/iris/tests/test_coordsystem.py index 6def1a3123..5aef363533 100644 --- a/lib/iris/tests/test_coordsystem.py +++ b/lib/iris/tests/test_coordsystem.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2010 - 2017, Met Office +# (C) British Crown Copyright 2010 - 2018, Met Office # # This file is part of Iris. # diff --git a/lib/iris/tests/test_cube_to_pp.py b/lib/iris/tests/test_cube_to_pp.py index fa78f95166..46045fb55d 100644 --- a/lib/iris/tests/test_cube_to_pp.py +++ b/lib/iris/tests/test_cube_to_pp.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2010 - 2017, Met Office +# (C) British Crown Copyright 2010 - 2018, Met Office # # This file is part of Iris. 
# @@ -294,8 +294,8 @@ def test_lbvc(self): self.assertEqual(field.lbvc, lbvc) self.assertEqual(field.lblev, lblev) self.assertEqual(field.blev, blev) - - + + def fields_from_cube(cubes): """ Return an iterator of PP fields generated from saving the given cube(s) diff --git a/lib/iris/tests/test_pandas.py b/lib/iris/tests/test_pandas.py index aec432692a..51a163b728 100644 --- a/lib/iris/tests/test_pandas.py +++ b/lib/iris/tests/test_pandas.py @@ -398,6 +398,17 @@ def test_data_frame_masked(self): tests.get_result_path(('pandas', 'as_cube', 'data_frame_masked.cml'))) + def test_data_frame_multidim(self): + data_frame = pandas.DataFrame([[0, 1, 2, 3, 4], + [5, 6, 7, 8, 9]], + index=[0, 1], + columns=['col_1', 'col_2', 'col_3', + 'col_4', 'col_5']) + self.assertCML( + iris.pandas.as_cube(data_frame), + tests.get_result_path(('pandas', 'as_cube', + 'data_frame_multidim.cml'))) + def test_data_frame_cftime_360(self): data_frame = pandas.DataFrame( [[0, 1, 2, 3, 4], diff --git a/lib/iris/tests/test_util.py b/lib/iris/tests/test_util.py index 97bb4d627d..d8cfcf648c 100644 --- a/lib/iris/tests/test_util.py +++ b/lib/iris/tests/test_util.py @@ -93,31 +93,6 @@ def test_monotonic_strict(self): self.assertMonotonic(b) -class TestReverse(tests.IrisTest): - def test_simple(self): - a = np.arange(12).reshape(3, 4) - np.testing.assert_array_equal(a[::-1], iris.util.reverse(a, 0)) - np.testing.assert_array_equal(a[::-1, ::-1], iris.util.reverse(a, [0, 1])) - np.testing.assert_array_equal(a[:, ::-1], iris.util.reverse(a, 1)) - np.testing.assert_array_equal(a[:, ::-1], iris.util.reverse(a, [1])) - self.assertRaises(ValueError, iris.util.reverse, a, []) - self.assertRaises(ValueError, iris.util.reverse, a, -1) - self.assertRaises(ValueError, iris.util.reverse, a, 10) - self.assertRaises(ValueError, iris.util.reverse, a, [-1]) - self.assertRaises(ValueError, iris.util.reverse, a, [0, -1]) - - def test_single(self): - a = np.arange(36).reshape(3, 4, 3) - 
np.testing.assert_array_equal(a[::-1], iris.util.reverse(a, 0)) - np.testing.assert_array_equal(a[::-1, ::-1], iris.util.reverse(a, [0, 1])) - np.testing.assert_array_equal(a[:, ::-1, ::-1], iris.util.reverse(a, [1, 2])) - np.testing.assert_array_equal(a[..., ::-1], iris.util.reverse(a, 2)) - self.assertRaises(ValueError, iris.util.reverse, a, -1) - self.assertRaises(ValueError, iris.util.reverse, a, 10) - self.assertRaises(ValueError, iris.util.reverse, a, [-1]) - self.assertRaises(ValueError, iris.util.reverse, a, [0, -1]) - - class TestClipString(tests.IrisTest): def setUp(self): self.test_string = "Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum." diff --git a/lib/iris/tests/unit/analysis/cartography/test_gridcell_angles.py b/lib/iris/tests/unit/analysis/cartography/test_gridcell_angles.py new file mode 100644 index 0000000000..4a1344e83d --- /dev/null +++ b/lib/iris/tests/unit/analysis/cartography/test_gridcell_angles.py @@ -0,0 +1,327 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. 
+#
+# You should have received a copy of the GNU Lesser General Public License
+# along with Iris. If not, see <http://www.gnu.org/licenses/>.
+"""
+Unit tests for the function
+:func:`iris.analysis.cartography.gridcell_angles`.
+
+"""
+from __future__ import (absolute_import, division, print_function)
+from six.moves import (filter, input, map, range, zip)  # noqa
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests
+
+import numpy as np
+
+from cf_units import Unit
+from iris.cube import Cube
+from iris.coords import AuxCoord
+
+from iris.analysis.cartography import gridcell_angles
+from iris.tests.stock import sample_2d_latlons, lat_lon_cube
+
+
+def _2d_multicells_testcube(cellsize_degrees=1.0):
+    """
+    Create a test cube with a grid of X and Y points, where each gridcell
+    is independent (disjoint), arranged at an angle == the x-coord point.
+
+    """
+    # Setup np.linspace arguments to make the coordinate points.
+    x0, x1, nx = -164, 164, 9
+    y0, y1, ny = -75, 75, 7
+
+    lats = np.linspace(y0, y1, ny, endpoint=True)
+    lons_angles = np.linspace(x0, x1, nx, endpoint=True)
+    x_pts_2d, y_pts_2d = np.meshgrid(lons_angles, lats)
+
+    # Make gridcells rectangles surrounding these centrepoints, but also
+    # tilted at various angles (= same as x-point lons, as that's easy).
+
+    # Calculate centrepoint lons + lats : in radians, and shape (ny, nx, 1).
+    xangs, yangs = np.deg2rad(x_pts_2d), np.deg2rad(y_pts_2d)
+    xangs, yangs = [arr[..., None] for arr in (xangs, yangs)]
+    # Program which corners are up+down on each gridcell axis.
+    dx_corners = [[[-1, 1, 1, -1]]]
+    dy_corners = [[[-1, -1, 1, 1]]]
+    # Calculate the relative offsets in x+y at the 4 corners.
+ x_ofs_2d = cellsize_degrees * np.cos(xangs) * dx_corners + x_ofs_2d -= cellsize_degrees * np.sin(xangs) * dy_corners + y_ofs_2d = cellsize_degrees * np.cos(xangs) * dy_corners + y_ofs_2d += cellsize_degrees * np.sin(xangs) * dx_corners + # Apply a latitude stretch to make correct angles on the globe. + y_ofs_2d *= np.cos(yangs) + # Make bounds arrays by adding the corner offsets to the centrepoints. + x_bds_2d = x_pts_2d[..., None] + x_ofs_2d + y_bds_2d = y_pts_2d[..., None] + y_ofs_2d + + # Create a cube with these points + bounds in its 'X' and 'Y' coords. + co_x = AuxCoord(points=x_pts_2d, bounds=x_bds_2d, + standard_name='longitude', units='degrees') + co_y = AuxCoord(points=y_pts_2d, bounds=y_bds_2d, + standard_name='latitude', units='degrees') + cube = Cube(np.zeros((ny, nx))) + cube.add_aux_coord(co_x, (0, 1)) + cube.add_aux_coord(co_y, (0, 1)) + return cube + + +class TestGridcellAngles(tests.IrisTest): + def setUp(self): + # Make a small "normal" contiguous-bounded cube to test on. + # This one is regional. + self.standard_regional_cube = sample_2d_latlons( + regional=True, transformed=True) + # Record the standard correct angle answers. + result_cube = gridcell_angles(self.standard_regional_cube) + result_cube.convert_units('degrees') + self.standard_result_cube = result_cube + self.standard_small_cube_results = result_cube.data + + def _check_multiple_orientations_and_latitudes(self, + method='mid-lhs, mid-rhs', + atol_degrees=0.005, + cellsize_degrees=1.0): + + cube = _2d_multicells_testcube(cellsize_degrees=cellsize_degrees) + + # Calculate gridcell angles at each point. + angles_cube = gridcell_angles(cube, cell_angle_boundpoints=method) + + # Check that the results are a close match to the original intended + # gridcell orientation angles. + # NOTE: neither the above gridcell construction nor the calculation + # itself are exact : Errors scale as the square of gridcell sizes. 
+ angles_cube.convert_units('degrees') + angles_calculated = angles_cube.data + + # Note: the gridcell angles **should** just match the longitudes at + # each point + angles_expected = cube.coord('longitude').points + + # Wrap both into standard range for comparison. + angles_calculated = (angles_calculated + 360.) % 360. + angles_expected = (angles_expected + 360.) % 360. + + # Assert (toleranced) equality, and return results. + self.assertArrayAllClose(angles_calculated, angles_expected, + atol=atol_degrees) + + return angles_calculated, angles_expected + + def test_various_orientations_and_locations(self): + self._check_multiple_orientations_and_latitudes() + + def test_result_form(self): + # Check properties of the result cube *other than* the data values. + test_cube = self.standard_regional_cube + result_cube = self.standard_result_cube + self.assertEqual(result_cube.long_name, + 'gridcell_angle_from_true_east') + self.assertEqual(result_cube.units, Unit('degrees')) + self.assertEqual(len(result_cube.coords()), 2) + self.assertEqual(result_cube.coord(axis='x'), + test_cube.coord(axis='x')) + self.assertEqual(result_cube.coord(axis='y'), + test_cube.coord(axis='y')) + + def test_bottom_edge_method(self): + # Get results with the "other" calculation method + check to tolerance. + # A smallish cellsize should yield similar results in both cases. + r1, _ = self._check_multiple_orientations_and_latitudes() + r2, _ = self._check_multiple_orientations_and_latitudes( + method='lower-left, lower-right', + cellsize_degrees=0.1, atol_degrees=0.1) + + # Not *exactly* the same : this checks we tested the 'other' method ! + self.assertFalse(np.allclose(r1, r2)) + # Note: results are a bit different in places. This is acceptable. + self.assertArrayAllClose(r1, r2, atol=0.1) + + def test_bounded_coord_args(self): + # Check that passing the coords gives the same result as the cube. 
+ co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + result = gridcell_angles(co_x, co_y) + self.assertArrayAllClose(result.data, + self.standard_small_cube_results) + + def test_coords_radians_args(self): + # Check it still works with coords converted to radians. + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + for coord in (co_x, co_y): + coord.convert_units('radians') + result = gridcell_angles(co_x, co_y) + self.assertArrayAllClose(result.data, + self.standard_small_cube_results) + + def test_bounds_array_args(self): + # Check we can calculate from bounds values alone. + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + # Results drawn from coord bounds should be nearly the same, + # but not exactly, because of the different 'midpoint' values. + result = gridcell_angles(co_x.bounds, co_y.bounds) + self.assertArrayAllClose(result.data, + self.standard_small_cube_results, atol=0.1) + + def test_unbounded_regional_coord_args(self): + # Remove the coord bounds to check points-based calculation. + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + for coord in (co_x, co_y): + coord.bounds = None + result = gridcell_angles(co_x, co_y) + # Note: in this case, we can expect the leftmost and rightmost columns + # to be rubbish, because the data is not global. + # But the rest should match okay. + self.assertArrayAllClose(result.data[:, 1:-1], + self.standard_small_cube_results[:, 1:-1]) + + def test_points_array_args(self): + # Check we can calculate from points arrays alone (no coords). + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + # As previous, the leftmost and rightmost columns are not good. 
+ result = gridcell_angles(co_x.points, co_y.points) + self.assertArrayAllClose(result.data[:, 1:-1], + self.standard_small_cube_results[:, 1:-1]) + + def test_unbounded_global(self): + # For a contiguous global grid, a result based on points, i.e. with the + # bounds removed, should be a reasonable match for the 'ideal' one + # based on the bounds. + + # Make a global cube + calculate ideal bounds-based results. + global_cube = sample_2d_latlons(transformed=True) + result_cube = gridcell_angles(global_cube) + result_cube.convert_units('degrees') + global_cube_results = result_cube.data + + # Check a points-based calculation on the same basic grid. + co_x, co_y = (global_cube.coord(axis=ax) + for ax in ('x', 'y')) + for coord in (co_x, co_y): + coord.bounds = None + result = gridcell_angles(co_x, co_y) + # In this case, the match is actually rather poor (!). + self.assertArrayAllClose(result.data, + global_cube_results, + atol=7.5) + # Leaving off first + last columns again gives a decent result. + self.assertArrayAllClose(result.data[:, 1:-1], + global_cube_results[:, 1:-1]) + + # NOTE: although this looks just as bad as 'test_points_array_args', + # maximum errors there in the end columns are actually > 100 degrees ! + + def test_nonlatlon_coord_system(self): + # Check with points specified in an unexpected coord system. + cube = sample_2d_latlons(regional=True, rotated=True) + result = gridcell_angles(cube) + self.assertArrayAllClose(result.data, + self.standard_small_cube_results) + # Check that the result has transformed (true-latlon) coordinates. + self.assertEqual(len(result.coords()), 2) + x_coord = result.coord(axis='x') + y_coord = result.coord(axis='y') + self.assertEqual(x_coord.shape, cube.shape) + self.assertEqual(y_coord.shape, cube.shape) + self.assertIsNotNone(cube.coord_system) + self.assertIsNone(x_coord.coord_system) + self.assertIsNone(y_coord.coord_system) + + def test_fail_coords_bad_units(self): + # Check error with bad coords units. 
+ co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + co_y.units = 'm' + with self.assertRaisesRegexp(ValueError, 'must have angular units'): + gridcell_angles(co_x, co_y) + + def test_fail_nonarraylike(self): + # Check error with bad args. + co_x, co_y = 1, 2 + with self.assertRaisesRegexp(ValueError, + 'must have array shape property'): + gridcell_angles(co_x, co_y) + + def test_fail_non2d_coords(self): + # Check error with bad args. + cube = lat_lon_cube() + with self.assertRaisesRegexp(ValueError, + 'inputs must have 2-dimensional shape'): + gridcell_angles(cube) + + def test_fail_different_shapes(self): + # Check error with mismatched shapes. + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + co_y = co_y[1:] + with self.assertRaisesRegexp(ValueError, 'must have same shape'): + gridcell_angles(co_x, co_y) + + def test_fail_different_coord_system(self): + # Check error with mismatched coord systems. + cube = sample_2d_latlons(regional=True, rotated=True) + cube.coord(axis='x').coord_system = None + with self.assertRaisesRegexp(ValueError, + 'must have same coordinate system'): + gridcell_angles(cube) + + def test_fail_cube_dims(self): + # Check error with mismatched cube dims. + cube = self.standard_regional_cube + # Make 5x6 into 5x5. + cube = cube[:, :-1] + co_x = cube.coord(axis='x') + pts, bds = co_x.points, co_x.bounds + co_new_x = co_x.copy(points=pts.transpose((1, 0)), + bounds=bds.transpose((1, 0, 2))) + cube.remove_coord(co_x) + cube.add_aux_coord(co_new_x, (1, 0)) + with self.assertRaisesRegexp(ValueError, + 'must have the same cube dimensions'): + gridcell_angles(cube) + + def test_fail_coord_noncoord(self): + # Check that passing a coord + an array gives an error. 
+ co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + with self.assertRaisesRegexp(ValueError, + 'is a Coordinate, but .* is not'): + gridcell_angles(co_x, co_y.bounds) + + def test_fail_noncoord_coord(self): + # Check that passing an array + a coord gives an error. + co_x, co_y = (self.standard_regional_cube.coord(axis=ax) + for ax in ('x', 'y')) + with self.assertRaisesRegexp(ValueError, + 'is a Coordinate, but .* is not'): + gridcell_angles(co_x.points, co_y) + + def test_fail_bad_method(self): + with self.assertRaisesRegexp(ValueError, + 'unrecognised cell_angle_boundpoints'): + self._check_multiple_orientations_and_latitudes( + method='something_unknown') + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/analysis/cartography/test_rotate_grid_vectors.py b/lib/iris/tests/unit/analysis/cartography/test_rotate_grid_vectors.py new file mode 100644 index 0000000000..90ec37e162 --- /dev/null +++ b/lib/iris/tests/unit/analysis/cartography/test_rotate_grid_vectors.py @@ -0,0 +1,148 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +""" +Unit tests for the function +:func:`iris.analysis.cartography.rotate_grid_vectors`. 
+
+"""
+from __future__ import (absolute_import, division, print_function)
+from six.moves import (filter, input, map, range, zip)  # noqa
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests
+
+from mock import Mock, call as mock_call
+import numpy as np
+
+from iris.cube import Cube
+from iris.tests.stock import sample_2d_latlons
+
+from iris.analysis.cartography import rotate_grid_vectors
+
+
+class TestRotateGridVectors(tests.IrisTest):
+    def _check_angles_calculation(self, angles_in_degrees=True,
+                                  nan_angles_mask=None):
+        # Check basic maths on a 2d latlon grid.
+        u_cube = sample_2d_latlons(regional=True, transformed=True)
+        u_cube.units = 'ms-1'
+        u_cube.rename('dx')
+        u_cube.data[...] = 0
+        v_cube = u_cube.copy()
+        v_cube.rename('dy')
+
+        # Define 6 different vectors, repeated in each data row.
+        in_vu = np.array([(0, 1), (2, -1), (-1, -1), (-3, 1), (2, 0), (0, 0)])
+        in_angs = np.rad2deg(np.arctan2(in_vu[..., 0], in_vu[..., 1]))
+        in_mags = np.sqrt(np.sum(in_vu * in_vu, axis=1))
+        v_cube.data[...] = in_vu[..., 0]
+        u_cube.data[...] = in_vu[..., 1]
+
+        # Define 5 different test rotation angles, one for each data row.
+        rotation_angles = np.array([0., -45., 135, -140., 90.])
+        ang_cube_data = np.broadcast_to(rotation_angles[:, None],
+                                        u_cube.shape)
+        ang_cube = u_cube.copy()
+        if angles_in_degrees:
+            ang_cube.units = 'degrees'
+        else:
+            ang_cube.units = 'radians'
+            ang_cube_data = np.deg2rad(ang_cube_data)
+        ang_cube.data[:] = ang_cube_data
+
+        if nan_angles_mask is not None:
+            ang_cube.data[nan_angles_mask] = np.nan
+
+        # Rotate all vectors by all the given angles.
+        result = rotate_grid_vectors(u_cube, v_cube, ang_cube)
+        out_u, out_v = [cube.data for cube in result]
+
+        # Check that vector magnitudes were unchanged.
+        out_mags = np.sqrt(out_u * out_u + out_v * out_v)
+        expect_mags = in_mags[None, :]
+        self.assertArrayAllClose(out_mags, expect_mags)
+
+        # Check that vector angles are all as expected.
+        out_angs = np.rad2deg(np.arctan2(out_v, out_u))
+        expect_angs = in_angs[None, :] + rotation_angles[:, None]
+        ang_diffs = out_angs - expect_angs
+        # Fix for null vectors, and +/-360 differences.
+        ang_diffs[np.abs(out_mags) < 0.001] = 0.0
+        ang_diffs = ang_diffs % 360.0
+        # Check that any differences are very small.
+        self.assertArrayAllClose(ang_diffs, 0.0)
+
+        # Check that results are always masked arrays, masked at NaN angles.
+        self.assertTrue(np.ma.isMaskedArray(out_u))
+        self.assertTrue(np.ma.isMaskedArray(out_v))
+        if nan_angles_mask is not None:
+            self.assertArrayEqual(out_u.mask, nan_angles_mask)
+            self.assertArrayEqual(out_v.mask, nan_angles_mask)
+
+    def test_angles_calculation(self):
+        self._check_angles_calculation()
+
+    def test_angles_in_radians(self):
+        self._check_angles_calculation(angles_in_degrees=False)
+
+    def test_angles_from_grid(self):
+        # Check that it gets angles from 'u_cube', and passes any kwargs on
+        # to the angles routine.
+        u_cube = sample_2d_latlons(regional=True, transformed=True)
+        u_cube = u_cube[:2, :3]
+        u_cube.units = 'ms-1'
+        u_cube.rename('dx')
+        u_cube.data[...] = 1.0
+        v_cube = u_cube.copy()
+        v_cube.rename('dy')
+        v_cube.data[...] = 0.0
+
+        # Setup a fake angles result from the inner call to 'gridcell_angles'.
+        angles_result_data = np.array([[0.0, 90.0, 180.0],
+                                       [-180.0, -90.0, 270.0]])
+        angles_result_cube = Cube(angles_result_data, units='degrees')
+        angles_kwargs = {'this': 2}
+        angles_call_patch = self.patch(
+            'iris.analysis._grid_angles.gridcell_angles',
+            Mock(return_value=angles_result_cube))
+
+        # Call the routine.
+ result = rotate_grid_vectors(u_cube, v_cube, + grid_angles_kwargs=angles_kwargs) + + self.assertEqual(angles_call_patch.call_args_list, + [mock_call(u_cube, this=2)]) + + out_u, out_v = [cube.data for cube in result] + # Records what results should be for the various n*90deg rotations. + expect_u = np.array([[1.0, 0.0, -1.0], + [-1.0, 0.0, 0.0]]) + expect_v = np.array([[0.0, 1.0, 0.0], + [0.0, -1.0, -1.0]]) + # Check results are as expected. + self.assertArrayAllClose(out_u, expect_u) + self.assertArrayAllClose(out_v, expect_v) + + def test_nan_vectors(self): + bad_angle_points = np.zeros((5, 6), dtype=bool) + bad_angle_points[2, 3] = True + self._check_angles_calculation(nan_angles_mask=bad_angle_points) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/analysis/regrid/test_RectilinearRegridder.py b/lib/iris/tests/unit/analysis/regrid/test_RectilinearRegridder.py index ca86e5cdd7..6035630aae 100644 --- a/lib/iris/tests/unit/analysis/regrid/test_RectilinearRegridder.py +++ b/lib/iris/tests/unit/analysis/regrid/test_RectilinearRegridder.py @@ -86,7 +86,8 @@ def assert_values(self, values): result = regrid(self.data, self.x_dim, self.y_dim, self.x, self.y, np.array([xs]), np.array([ys])) - self.assertArrayAllClose(result, expecteds, rtol=1e-04) + self.assertArrayAllClose(result, expecteds, rtol=1e-04, + equal_nan=True) # Check that transposing the input data results in the same values ndim = self.data.ndim diff --git a/lib/iris/tests/unit/analysis/test_Aggregator.py b/lib/iris/tests/unit/analysis/test_Aggregator.py index 6895a7c691..e4e7b71656 100644 --- a/lib/iris/tests/unit/analysis/test_Aggregator.py +++ b/lib/iris/tests/unit/analysis/test_Aggregator.py @@ -283,7 +283,7 @@ def test_kwarg_pass_through_no_kwargs(self): axis = mock.sentinel.axis aggregator = Aggregator('', None, lazy_func=lazy_func) aggregator.lazy_aggregate(data, axis) - lazy_func.assert_called_once_with(data, axis) + lazy_func.assert_called_once_with(data, axis=axis) 
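The `axis=axis` changes above are needed because a plain `mock.Mock` records positional and keyword arguments separately: once `lazy_aggregate` passes `axis` by keyword, the old positional assertion no longer matches. A standalone sketch (using stdlib `unittest.mock` rather than the `mock` package the tests import):

```python
from unittest import mock

m = mock.Mock()
m('data', axis=0)

# Mock records this as call('data', axis=0); asserting the positional
# form call('data', 0) fails, even though a real function signature
# might treat the two as equivalent.
try:
    m.assert_called_once_with('data', 0)
    matched_positionally = True
except AssertionError:
    matched_positionally = False

# The keyword form matches exactly.
m.assert_called_once_with('data', axis=0)
print(matched_positionally)
```

Autospeccing the mock against a real signature would normalise the two forms; with a bare `Mock`, the test assertions must mirror the exact call style.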
def test_kwarg_pass_through_call_kwargs(self): lazy_func = mock.Mock() @@ -292,7 +292,7 @@ def test_kwarg_pass_through_call_kwargs(self): kwargs = dict(wibble='wobble', foo='bar') aggregator = Aggregator('', None, lazy_func=lazy_func) aggregator.lazy_aggregate(data, axis, **kwargs) - lazy_func.assert_called_once_with(data, axis, **kwargs) + lazy_func.assert_called_once_with(data, axis=axis, **kwargs) def test_kwarg_pass_through_init_kwargs(self): lazy_func = mock.Mock() @@ -301,7 +301,7 @@ def test_kwarg_pass_through_init_kwargs(self): kwargs = dict(wibble='wobble', foo='bar') aggregator = Aggregator('', None, lazy_func=lazy_func, **kwargs) aggregator.lazy_aggregate(data, axis) - lazy_func.assert_called_once_with(data, axis, **kwargs) + lazy_func.assert_called_once_with(data, axis=axis, **kwargs) def test_kwarg_pass_through_combined_kwargs(self): lazy_func = mock.Mock() @@ -313,7 +313,7 @@ def test_kwarg_pass_through_combined_kwargs(self): aggregator.lazy_aggregate(data, axis, **call_kwargs) expected_kwargs = init_kwargs.copy() expected_kwargs.update(call_kwargs) - lazy_func.assert_called_once_with(data, axis, **expected_kwargs) + lazy_func.assert_called_once_with(data, axis=axis, **expected_kwargs) if __name__ == "__main__": diff --git a/lib/iris/tests/unit/analysis/test_COUNT.py b/lib/iris/tests/unit/analysis/test_COUNT.py index 9248a17910..5422a197e0 100644 --- a/lib/iris/tests/unit/analysis/test_COUNT.py +++ b/lib/iris/tests/unit/analysis/test_COUNT.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2013 - 2017, Met Office +# (C) British Crown Copyright 2013 - 2018, Met Office # # This file is part of Iris. # @@ -23,11 +23,58 @@ # importing anything else. 
import iris.tests as tests +import numpy as np import numpy.ma as ma from iris.analysis import COUNT -import iris.cube +from iris.cube import Cube from iris.coords import DimCoord +from iris._lazy_data import as_lazy_data, is_lazy_data + + +class Test_basics(tests.IrisTest): + def setUp(self): + data = np.array([1, 2, 3, 4, 5]) + coord = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube = Cube(data) + self.cube.add_dim_coord(coord, 0) + self.lazy_cube = Cube(as_lazy_data(data)) + self.lazy_cube.add_dim_coord(coord, 0) + self.func = lambda x: x >= 3 + + def test_name(self): + self.assertEqual(COUNT.name(), 'count') + + def test_no_function(self): + exp_emsg = r"function must be a callable. Got <.* 'NoneType'>" + with self.assertRaisesRegexp(TypeError, exp_emsg): + COUNT.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + + def test_not_callable(self): + with self.assertRaisesRegexp(TypeError, 'function must be a callable'): + COUNT.aggregate(self.cube.data, axis=0, function='wibble') + + def test_lazy_not_callable(self): + with self.assertRaisesRegexp(TypeError, 'function must be a callable'): + COUNT.lazy_aggregate(self.lazy_cube.lazy_data(), + axis=0, + function='wibble') + + def test_collapse(self): + data = COUNT.aggregate(self.cube.data, axis=0, function=self.func) + self.assertArrayEqual(data, [3]) + + def test_lazy(self): + lazy_data = COUNT.lazy_aggregate(self.lazy_cube.lazy_data(), + axis=0, + function=self.func) + self.assertTrue(is_lazy_data(lazy_data)) + + def test_lazy_collapse(self): + lazy_data = COUNT.lazy_aggregate(self.lazy_cube.lazy_data(), + axis=0, + function=self.func) + self.assertArrayEqual(lazy_data.compute(), [3]) class Test_units_func(tests.IrisTest): @@ -39,18 +86,30 @@ def test(self): class Test_masked(tests.IrisTest): def setUp(self): - self.cube = iris.cube.Cube(ma.masked_equal([1, 2, 3, 4, 5], 3)) + self.cube = Cube(ma.masked_equal([1, 2, 3, 4, 5], 3)) self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) 
self.func = lambda x: x >= 3 def test_ma(self): - cube = self.cube.collapsed("foo", COUNT, function=self.func) - self.assertArrayEqual(cube.data, [2]) + data = COUNT.aggregate(self.cube.data, axis=0, function=self.func) + self.assertArrayEqual(data, [2]) -class Test_name(tests.IrisTest): - def test(self): - self.assertEqual(COUNT.name(), 'count') +class Test_lazy_masked(tests.IrisTest): + def setUp(self): + lazy_data = as_lazy_data(ma.masked_equal([1, 2, 3, 4, 5], 3)) + self.lazy_cube = Cube(lazy_data) + self.lazy_cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], + long_name='foo'), + 0) + self.func = lambda x: x >= 3 + + def test_ma(self): + lazy_data = COUNT.lazy_aggregate(self.lazy_cube.lazy_data(), + axis=0, + function=self.func) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [2]) class Test_aggregate_shape(tests.IrisTest): diff --git a/lib/iris/tests/unit/analysis/test_MAX.py b/lib/iris/tests/unit/analysis/test_MAX.py new file mode 100644 index 0000000000..f3a523bc93 --- /dev/null +++ b/lib/iris/tests/unit/analysis/test_MAX.py @@ -0,0 +1,92 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . 
+"""Unit tests for the :data:`iris.analysis.MAX` aggregator.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. +import iris.tests as tests + +import numpy as np +import numpy.ma as ma + +from iris.analysis import MAX +from iris.cube import Cube +from iris.coords import DimCoord +from iris._lazy_data import as_lazy_data, is_lazy_data + + +class Test_basics(tests.IrisTest): + def setUp(self): + data = np.array([1, 2, 3, 4, 5]) + coord = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube = Cube(data) + self.cube.add_dim_coord(coord, 0) + self.lazy_cube = Cube(as_lazy_data(data)) + self.lazy_cube.add_dim_coord(coord, 0) + + def test_name(self): + self.assertEqual(MAX.name(), 'maximum') + + def test_collapse(self): + data = MAX.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [5]) + + def test_lazy(self): + lazy_data = MAX.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + + def test_lazy_collapse(self): + lazy_data = MAX.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertArrayEqual(lazy_data.compute(), [5]) + + +class Test_masked(tests.IrisTest): + def setUp(self): + self.cube = Cube(ma.masked_greater([1, 2, 3, 4, 5], 3)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_ma(self): + data = MAX.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [3]) + + +class Test_lazy_masked(tests.IrisTest): + def setUp(self): + masked_data = ma.masked_greater([1, 2, 3, 4, 5], 3) + self.cube = Cube(as_lazy_data(masked_data)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_lazy_ma(self): + lazy_data = MAX.lazy_aggregate(self.cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), 
[3]) + + +class Test_aggregate_shape(tests.IrisTest): + def test(self): + shape = () + kwargs = dict() + self.assertTupleEqual(MAX.aggregate_shape(**kwargs), shape) + kwargs = dict(wibble='wobble') + self.assertTupleEqual(MAX.aggregate_shape(**kwargs), shape) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/analysis/test_MIN.py b/lib/iris/tests/unit/analysis/test_MIN.py new file mode 100644 index 0000000000..13860f5306 --- /dev/null +++ b/lib/iris/tests/unit/analysis/test_MIN.py @@ -0,0 +1,92 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +"""Unit tests for the :data:`iris.analysis.MIN` aggregator.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. 
+import iris.tests as tests + +import numpy as np +import numpy.ma as ma + +from iris.analysis import MIN +from iris.cube import Cube +from iris.coords import DimCoord +from iris._lazy_data import as_lazy_data, is_lazy_data + + +class Test_basics(tests.IrisTest): + def setUp(self): + data = np.array([1, 2, 3, 4, 5]) + coord = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube = Cube(data) + self.cube.add_dim_coord(coord, 0) + self.lazy_cube = Cube(as_lazy_data(data)) + self.lazy_cube.add_dim_coord(coord, 0) + + def test_name(self): + self.assertEqual(MIN.name(), 'minimum') + + def test_collapse(self): + data = MIN.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [1]) + + def test_lazy(self): + lazy_data = MIN.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + + def test_lazy_collapse(self): + lazy_data = MIN.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertArrayEqual(lazy_data.compute(), [1]) + + +class Test_masked(tests.IrisTest): + def setUp(self): + self.cube = Cube(ma.masked_less([1, 2, 3, 4, 5], 3)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_ma(self): + data = MIN.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [3]) + + +class Test_lazy_masked(tests.IrisTest): + def setUp(self): + masked_data = ma.masked_less([1, 2, 3, 4, 5], 3) + self.cube = Cube(as_lazy_data(masked_data)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_lazy_ma(self): + lazy_data = MIN.lazy_aggregate(self.cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [3]) + + +class Test_aggregate_shape(tests.IrisTest): + def test(self): + shape = () + kwargs = dict() + self.assertTupleEqual(MIN.aggregate_shape(**kwargs), shape) + kwargs = dict(wibble='wobble') + self.assertTupleEqual(MIN.aggregate_shape(**kwargs), shape) + + +if __name__ == "__main__": + 
tests.main() diff --git a/lib/iris/tests/unit/analysis/test_STD_DEV.py b/lib/iris/tests/unit/analysis/test_STD_DEV.py index e1aac6ae63..57cc00132f 100644 --- a/lib/iris/tests/unit/analysis/test_STD_DEV.py +++ b/lib/iris/tests/unit/analysis/test_STD_DEV.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2017, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. # @@ -25,8 +25,35 @@ import numpy as np -from iris._lazy_data import as_concrete_data, as_lazy_data +from iris._lazy_data import as_concrete_data, as_lazy_data, is_lazy_data from iris.analysis import STD_DEV +from iris.cube import Cube +from iris.coords import DimCoord + + +class Test_basics(tests.IrisTest): + def setUp(self): + data = np.array([1, 2, 3, 4, 5]) + coord = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube = Cube(data) + self.cube.add_dim_coord(coord, 0) + self.lazy_cube = Cube(as_lazy_data(data)) + self.lazy_cube.add_dim_coord(coord, 0) + + def test_name(self): + self.assertEqual(STD_DEV.name(), 'standard_deviation') + + def test_collapse(self): + data = STD_DEV.aggregate(self.cube.data, axis=0) + self.assertArrayAlmostEqual(data, [1.58113883]) + + def test_lazy(self): + lazy_data = STD_DEV.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + + def test_lazy_collapse(self): + lazy_data = STD_DEV.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertArrayAlmostEqual(lazy_data.compute(), [1.58113883]) class Test_lazy_aggregate(tests.IrisTest): @@ -56,11 +83,6 @@ def test_ddof_zero(self): self.assertArrayAlmostEqual(result, np.array(2.291287)) -class Test_name(tests.IrisTest): - def test(self): - self.assertEqual(STD_DEV.name(), 'standard_deviation') - - class Test_aggregate_shape(tests.IrisTest): def test(self): shape = () diff --git a/lib/iris/tests/unit/analysis/test_SUM.py b/lib/iris/tests/unit/analysis/test_SUM.py new file mode 100644 index 0000000000..0a0f2332ab --- /dev/null +++ 
b/lib/iris/tests/unit/analysis/test_SUM.py @@ -0,0 +1,153 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +"""Unit tests for the :data:`iris.analysis.SUM` aggregator.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. 
+import iris.tests as tests + +import numpy as np +import numpy.ma as ma + +from iris.analysis import SUM +from iris.cube import Cube +from iris.coords import DimCoord +from iris._lazy_data import as_lazy_data, is_lazy_data + + +class Test_basics(tests.IrisTest): + def setUp(self): + data = np.array([1, 2, 3, 4, 5]) + coord = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube = Cube(data) + self.cube.add_dim_coord(coord, 0) + self.lazy_cube = Cube(as_lazy_data(data)) + self.lazy_cube.add_dim_coord(coord, 0) + + def test_name(self): + self.assertEqual(SUM.name(), 'sum') + + def test_collapse(self): + data = SUM.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [15]) + + def test_lazy(self): + lazy_data = SUM.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + + def test_lazy_collapse(self): + lazy_data = SUM.lazy_aggregate(self.lazy_cube.lazy_data(), axis=0) + self.assertArrayEqual(lazy_data.compute(), [15]) + + +class Test_masked(tests.IrisTest): + def setUp(self): + self.cube = Cube(ma.masked_equal([1, 2, 3, 4, 5], 3)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_ma(self): + data = SUM.aggregate(self.cube.data, axis=0) + self.assertArrayEqual(data, [12]) + + +class Test_lazy_masked(tests.IrisTest): + def setUp(self): + masked_data = ma.masked_equal([1, 2, 3, 4, 5], 3) + self.cube = Cube(as_lazy_data(masked_data)) + self.cube.add_dim_coord(DimCoord([6, 7, 8, 9, 10], long_name='foo'), 0) + + def test_lazy_ma(self): + lazy_data = SUM.lazy_aggregate(self.cube.lazy_data(), axis=0) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [12]) + + +class Test_weights_and_returned(tests.IrisTest): + def setUp(self): + data_2d = np.arange(1, 11).reshape(2, 5) + coord_0 = DimCoord([11, 12], long_name='bar') + coord_1 = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube_2d = Cube(data_2d) + self.cube_2d.add_dim_coord(coord_0, 0) + 
self.cube_2d.add_dim_coord(coord_1, 1) + self.weights = np.array([2, 1, 1, 1, 1] * 2).reshape(2, 5) + + def test_weights(self): + data = SUM.aggregate(self.cube_2d.data, axis=0, weights=self.weights) + self.assertArrayEqual(data, [14, 9, 11, 13, 15]) + + def test_returned(self): + data, weights = SUM.aggregate(self.cube_2d.data, axis=0, returned=True) + self.assertArrayEqual(data, [7, 9, 11, 13, 15]) + self.assertArrayEqual(weights, [2, 2, 2, 2, 2]) + + def test_weights_and_returned(self): + data, weights = SUM.aggregate(self.cube_2d.data, axis=0, + weights=self.weights, + returned=True) + self.assertArrayEqual(data, [14, 9, 11, 13, 15]) + self.assertArrayEqual(weights, [4, 2, 2, 2, 2]) + + +class Test_lazy_weights_and_returned(tests.IrisTest): + def setUp(self): + data_2d = np.arange(1, 11).reshape(2, 5) + coord_0 = DimCoord([11, 12], long_name='bar') + coord_1 = DimCoord([6, 7, 8, 9, 10], long_name='foo') + self.cube_2d = Cube(as_lazy_data(data_2d)) + self.cube_2d.add_dim_coord(coord_0, 0) + self.cube_2d.add_dim_coord(coord_1, 1) + self.weights = np.array([2, 1, 1, 1, 1] * 2).reshape(2, 5) + + def test_weights(self): + lazy_data = SUM.lazy_aggregate(self.cube_2d.lazy_data(), axis=0, + weights=self.weights) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [14, 9, 11, 13, 15]) + + def test_returned(self): + lazy_data, weights = SUM.lazy_aggregate(self.cube_2d.lazy_data(), + axis=0, + returned=True) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [7, 9, 11, 13, 15]) + self.assertArrayEqual(weights, [2, 2, 2, 2, 2]) + + def test_weights_and_returned(self): + lazy_data, weights = SUM.lazy_aggregate(self.cube_2d.lazy_data(), + axis=0, + weights=self.weights, + returned=True) + self.assertTrue(is_lazy_data(lazy_data)) + self.assertArrayEqual(lazy_data.compute(), [14, 9, 11, 13, 15]) + self.assertArrayEqual(weights, [4, 2, 2, 2, 2]) + + +class Test_aggregate_shape(tests.IrisTest): + def 
test(self): + shape = () + kwargs = dict() + self.assertTupleEqual(SUM.aggregate_shape(**kwargs), shape) + kwargs = dict(wibble='wobble') + self.assertTupleEqual(SUM.aggregate_shape(**kwargs), shape) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/coords/test_Coord.py b/lib/iris/tests/unit/coords/test_Coord.py index b43df9447a..27b468044d 100644 --- a/lib/iris/tests/unit/coords/test_Coord.py +++ b/lib/iris/tests/unit/coords/test_Coord.py @@ -24,6 +24,8 @@ import iris.tests as tests import collections +import mock +import warnings import numpy as np @@ -355,6 +357,307 @@ def test_different_array_attrs_incompatible(self): self.assertFalse(self.test_coord.is_compatible(self.other_coord)) +class Test_contiguous_bounds(tests.IrisTest): + def test_1d_coord_no_bounds_warning(self): + coord = DimCoord([0, 1, 2], standard_name='latitude') + msg = "Coordinate 'latitude' is not bounded, guessing contiguous " \ + "bounds." + with warnings.catch_warnings(): + # Cause all warnings to raise Exceptions + warnings.simplefilter("error") + with self.assertRaisesRegexp(Warning, msg): + coord.contiguous_bounds() + + def test_2d_coord_no_bounds_error(self): + coord = AuxCoord(np.array([[0, 0], [5, 5]]), standard_name='latitude') + emsg = 'Guessing bounds of 2D coords is not currently supported' + with self.assertRaisesRegexp(ValueError, emsg): + coord.contiguous_bounds() + + def test__sanity_check_bounds_call(self): + coord = DimCoord([5, 15, 25], bounds=[[0, 10], [10, 20], [20, 30]]) + with mock.patch('iris.coords.Coord._sanity_check_bounds' + ) as bounds_check: + coord.contiguous_bounds() + bounds_check.assert_called_once() + + def test_1d_coord(self): + coord = DimCoord([2, 4, 6], standard_name='latitude', + bounds=[[1, 3], [3, 5], [5, 7]]) + expected = np.array([1, 3, 5, 7]) + result = coord.contiguous_bounds() + self.assertArrayEqual(result, expected) + + def test_1d_coord_discontiguous(self): + coord = DimCoord([2, 4, 6], standard_name='latitude', + 
bounds=[[1, 3], [4, 5], [5, 7]]) + expected = np.array([1, 4, 5, 7]) + result = coord.contiguous_bounds() + self.assertArrayEqual(result, expected) + + def test_2d_lon_bounds(self): + coord = AuxCoord(np.array([[1, 3], [1, 3]]), + bounds=np.array([[[0, 2, 2, 0], [2, 4, 4, 2]], + [[0, 2, 2, 0], [2, 4, 4, 2]]])) + expected = np.array([[0, 2, 4], [0, 2, 4], [0, 2, 4]]) + result = coord.contiguous_bounds() + self.assertArrayEqual(result, expected) + + def test_2d_lat_bounds(self): + coord = AuxCoord(np.array([[1, 1], [3, 3]]), + bounds=np.array([[[0, 0, 2, 2], [0, 0, 2, 2]], + [[2, 2, 4, 4], [2, 2, 4, 4]]])) + expected = np.array([[0, 0, 0], [2, 2, 2], [4, 4, 4]]) + result = coord.contiguous_bounds() + self.assertArrayEqual(result, expected) + + +class Test_is_contiguous(tests.IrisTest): + def test_no_bounds(self): + coord = DimCoord([1, 3]) + result = coord.is_contiguous() + self.assertFalse(result) + + def test__discontiguity_in_bounds_call(self): + # Check that :meth:`iris.coords.Coord._discontiguity_in_bounds` is + # called. + coord = DimCoord([1, 3], bounds=[[0, 2], [2, 4]]) + with mock.patch('iris.coords.Coord._discontiguity_in_bounds' + ) as discontiguity_check: + # Discontiguity returns two objects that are unpacked in + # `coord.is_contiguous`. 
+ discontiguity_check.return_value = [None, None] + coord.is_contiguous(rtol=1e-1, atol=1e-3) + discontiguity_check.assert_called_with(rtol=1e-1, atol=1e-3) + + +class Test__discontiguity_in_bounds(tests.IrisTest): + def setUp(self): + self.points_3by3 = np.array([[1, 2, 3], + [1, 2, 3], + [1, 2, 3]]) + self.lon_bounds_3by3 = np.array( + [[[0, 2, 2, 0], [2, 4, 4, 2], [4, 6, 6, 4]], + [[0, 2, 2, 0], [2, 4, 4, 2], [4, 6, 6, 4]], + [[0, 2, 2, 0], [2, 4, 4, 2], [4, 6, 6, 4]]]) + self.lat_bounds_3by3 = np.array( + [[[0, 0, 2, 2], [0, 0, 2, 2], [0, 0, 2, 2]], + [[2, 2, 4, 4], [2, 2, 4, 4], [2, 2, 4, 4]], + [[4, 4, 6, 6], [4, 4, 6, 6], [4, 4, 6, 6]]]) + + def test_1d_contiguous(self): + coord = DimCoord([-20, 0, 20], + bounds=[[-30, -10], [-10, 10], [10, 30]]) + contiguous, diffs = coord._discontiguity_in_bounds() + self.assertTrue(contiguous) + self.assertArrayEqual(diffs, np.zeros(2)) + + def test_1d_discontiguous(self): + coord = DimCoord([10, 20, 40], + bounds=[[5, 15], [15, 25], [35, 45]]) + contiguous, diffs = coord._discontiguity_in_bounds() + self.assertFalse(contiguous) + self.assertArrayEqual(diffs, np.array([0, 10])) + + def test_1d_one_cell(self): + # Test a 1D coord with a single cell. 
+ coord = DimCoord(20, bounds=[[10, 30]]) + contiguous, diffs = coord._discontiguity_in_bounds() + self.assertTrue(contiguous) + self.assertArrayEqual(diffs, np.array([])) + + def test_2d_contiguous_both_dirs(self): + coord = AuxCoord(self.points_3by3, bounds=self.lon_bounds_3by3) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_discontiguous_along_x(self): + coord = AuxCoord(self.points_3by3[:, ::2], + bounds=self.lon_bounds_3by3[:, ::2, :]) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertFalse(contiguous) + self.assertArrayEqual(diffs_along_x, np.array([2, 2, 2]).reshape(3, 1)) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_discontiguous_along_y(self): + coord = AuxCoord(self.points_3by3[::2, :], + bounds=self.lat_bounds_3by3[::2, :, :]) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertFalse(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertArrayEqual(diffs_along_y, np.array([[2, 2, 2]])) + + def test_2d_discontiguous_along_x_and_y(self): + coord = AuxCoord(np.array([[1, 5], [3, 5]]), + bounds=np.array([[[0, 2, 2, 0], [4, 6, 6, 4]], + [[2, 4, 4, 2], [4, 6, 6, 4]]])) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + exp_x_diffs = np.array([2, 0]).reshape(2, 1) + exp_y_diffs = np.array([2, 0]).reshape(1, 2) + self.assertFalse(contiguous) + self.assertArrayEqual(diffs_along_x, exp_x_diffs) + self.assertArrayEqual(diffs_along_y, exp_y_diffs) + + def test_2d_contiguous_along_x_atol(self): + coord = AuxCoord(self.points_3by3[:, ::2], + bounds=self.lon_bounds_3by3[:, ::2, :]) + # Set a high atol that allows small discontiguities. 
+ contiguous, diffs = coord._discontiguity_in_bounds(atol=2) + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertArrayEqual(diffs_along_x, np.array([2, 2, 2]).reshape(3, 1)) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_one_cell(self): + # Test a 2D coord with a single cell, where the coord has shape (1, 1). + coord = AuxCoord(self.points_3by3[:1, :1], + bounds=self.lon_bounds_3by3[:1, :1, :]) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + expected_diffs = np.array([], dtype=np.int64) + self.assertTrue(contiguous) + self.assertArrayEqual(diffs_along_x, expected_diffs.reshape(1, 0)) + self.assertArrayEqual(diffs_along_y, expected_diffs.reshape(0, 1)) + + def test_2d_one_cell_along_x(self): + # Test a 2D coord with a single cell along the x axis, where the coord + # has shape (3, 1). + coord = AuxCoord(self.points_3by3[:, :1], + bounds=self.lat_bounds_3by3[:, :1, :]) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertArrayEqual(diffs_along_y, np.array([0, 0]).reshape(2, 1)) + + def test_2d_one_cell_along_y(self): + # Test a 2D coord with a single cell along the y axis, where the coord + # has shape (1, 3). + coord = AuxCoord(self.points_3by3[:1, :], + bounds=self.lon_bounds_3by3[:1, :, :]) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_contiguous_mod_360(self): + # Test that longitude coordinates are adjusted by the 360 modulus when + # calculating the discontiguities in contiguous bounds.
+ coord = AuxCoord( + [[175, -175], [175, -175]], standard_name='longitude', + bounds=np.array([[[170, 180, 180, 170], [-180, -170, -170, -180]], + [[170, 180, 180, 170], [-180, -170, -170, -180]]]) + ) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_discontiguous_mod_360(self): + # Test that longitude coordinates are adjusted by the 360 modulus when + # calculating the discontiguities in contiguous bounds. + coord = AuxCoord( + [[175, -175], [175, -175]], standard_name='longitude', + bounds=np.array([[[170, 180, 180, 170], [10, 20, 20, 10]], + [[170, 180, 180, 170], [10, 20, 20, 10]]])) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertFalse(contiguous) + self.assertArrayEqual(diffs_along_x, np.array([[170], [170]])) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_contiguous_mod_360_not_longitude(self): + # Test that non-longitude coordinates are not adjusted by the 360 + # modulus when calculating the discontiguities in contiguous bounds. + coord = AuxCoord( + [[-150, 350], [-150, 350]], standard_name='height', + bounds=np.array([[[-400, 100, 100, -400], [100, 600, 600, 100]], + [[-400, 100, 100, -400], [100, 600, 600, 100]]]) + ) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertTrue(contiguous) + self.assertTrue(not diffs_along_x.any()) + self.assertTrue(not diffs_along_y.any()) + + def test_2d_discontiguous_mod_360_not_longitude(self): + # Test that non-longitude coordinates are not adjusted by the 360 + # modulus when calculating the discontiguities in discontiguous bounds. 
+ coord = AuxCoord( + [[-150, 350], [-150, 350]], standard_name='height', + bounds=np.array([[[-400, 100, 100, -400], [200, 600, 600, 200]], + [[-400, 100, 100, -400], [200, 600, 600, 200]]]) + ) + contiguous, diffs = coord._discontiguity_in_bounds() + diffs_along_x, diffs_along_y = diffs + self.assertFalse(contiguous) + self.assertArrayEqual(diffs_along_x, np.array([[100], [100]])) + self.assertTrue(not diffs_along_y.any()) + + +class Test__sanity_check_bounds(tests.IrisTest): + def test_coord_1d_2_bounds(self): + # Check that a 1d coord with 2 bounds does not raise an error. + coord = iris.coords.DimCoord([0, 1], standard_name='latitude', + bounds=[[0, 1], [1, 2]]) + coord._sanity_check_bounds() + + def test_coord_1d_no_bounds(self): + coord = iris.coords.DimCoord([0, 1], standard_name='latitude') + emsg = "Contiguous bounds are only defined for 1D coordinates with " \ + "2 bounds." + with self.assertRaisesRegexp(ValueError, emsg): + coord._sanity_check_bounds() + + def test_coord_1d_1_bounds(self): + coord = iris.coords.DimCoord([0, 1], standard_name='latitude', + bounds=np.array([[0], [1]])) + emsg = "Contiguous bounds are only defined for 1D coordinates with " \ + "2 bounds." + with self.assertRaisesRegexp(ValueError, emsg): + coord._sanity_check_bounds() + + def test_coord_2d_4_bounds(self): + coord = iris.coords.AuxCoord( + [[0, 0], [1, 1]], standard_name='latitude', + bounds=np.array([[[0, 0, 1, 1], [0, 0, 1, 1]], + [[1, 1, 2, 2], [1, 1, 2, 2]]])) + coord._sanity_check_bounds() + + def test_coord_2d_no_bounds(self): + coord = iris.coords.AuxCoord([[0, 0], [1, 1]], + standard_name='latitude') + emsg = "Contiguous bounds are only defined for 2D coordinates with " \ + "4 bounds." 
+ with self.assertRaisesRegexp(ValueError, emsg): + coord._sanity_check_bounds() + + def test_coord_2d_2_bounds(self): + coord = iris.coords.AuxCoord( + [[0, 0], [1, 1]], standard_name='latitude', + bounds=np.array([[[0, 1], [0, 1]], [[1, 2], [1, 2]]])) + emsg = "Contiguous bounds are only defined for 2D coordinates with " \ + "4 bounds." + with self.assertRaisesRegexp(ValueError, emsg): + coord._sanity_check_bounds() + + def test_coord_3d(self): + coord = iris.coords.AuxCoord(np.zeros((2, 2, 2)), + standard_name='height') + emsg = "Contiguous bounds are not defined for coordinates with more " \ + "than 2 dimensions." + with self.assertRaisesRegexp(ValueError, emsg): + coord._sanity_check_bounds() + + class Test_convert_units(tests.IrisTest): def test_convert_unknown_units(self): coord = iris.coords.AuxCoord(1, units='unknown') @@ -374,6 +677,17 @@ def test_short_time_interval(self): result = coord.__str__() self.assertEqual(expected, result) + def test_short_time_interval__bounded(self): + coord = DimCoord([5, 6], standard_name='time', + units='days since 1970-01-01') + coord.guess_bounds() + expected = ("DimCoord([1970-01-06 00:00:00, 1970-01-07 00:00:00], " + "bounds=[[1970-01-05 12:00:00, 1970-01-06 12:00:00],\n" + " [1970-01-06 12:00:00, 1970-01-07 12:00:00]], " + "standard_name='time', calendar='gregorian')") + result = coord.__str__() + self.assertEqual(expected, result) + def test_long_time_interval(self): coord = DimCoord([5], standard_name='time', units='years since 1970-01-01') @@ -381,6 +695,15 @@ def test_long_time_interval(self): result = coord.__str__() self.assertEqual(expected, result) + def test_long_time_interval__bounded(self): + coord = DimCoord([5, 6], standard_name='time', + units='years since 1970-01-01') + coord.guess_bounds() + expected = ("DimCoord([5 6], bounds=[[4.5 5.5]\n [5.5 6.5]], " + "standard_name='time', calendar='gregorian')") + result = coord.__str__() + self.assertEqual(expected, result) + def test_non_time_unit(self): coord = 
DimCoord([1.]) expected = repr(coord) diff --git a/lib/iris/tests/unit/cube/test_Cube.py b/lib/iris/tests/unit/cube/test_Cube.py index 5633417c80..dcc3eca7cb 100644 --- a/lib/iris/tests/unit/cube/test_Cube.py +++ b/lib/iris/tests/unit/cube/test_Cube.py @@ -34,9 +34,9 @@ import iris.aux_factory import iris.coords import iris.exceptions -from iris import FUTURE from iris.analysis import WeightedAggregator, Aggregator from iris.analysis import MEAN +from iris.aux_factory import HybridHeightFactory from iris.cube import Cube from iris.coords import AuxCoord, DimCoord, CellMeasure from iris.exceptions import (CoordinateNotFoundError, CellMeasureNotFoundError, @@ -1546,6 +1546,31 @@ def test_add_cell_measure(self): cube.add_cell_measure(a_cell_measure, [0, 1]) self.assertEqual(cube.cell_measure('area'), a_cell_measure) + def test_add_valid_aux_factory(self): + cube = Cube(np.arange(8).reshape(2, 2, 2)) + delta = AuxCoord(points=[0, 1], long_name='delta', units='m') + sigma = AuxCoord(points=[0, 1], long_name='sigma') + orog = AuxCoord(np.arange(4).reshape(2, 2), units='m') + cube.add_aux_coord(delta, 0) + cube.add_aux_coord(sigma, 0) + cube.add_aux_coord(orog, (1, 2)) + factory = HybridHeightFactory(delta=delta, sigma=sigma, orography=orog) + self.assertIsNone(cube.add_aux_factory(factory)) + + def test_error_for_add_invalid_aux_factory(self): + cube = Cube(np.arange(8).reshape(2, 2, 2), long_name='bar') + delta = AuxCoord(points=[0, 1], long_name='delta', units='m') + sigma = AuxCoord(points=[0, 1], long_name='sigma') + orog = AuxCoord(np.arange(4).reshape(2, 2), units='m', long_name='foo') + cube.add_aux_coord(delta, 0) + cube.add_aux_coord(sigma, 0) + # Note orography is not added to the cube here + factory = HybridHeightFactory(delta=delta, sigma=sigma, orography=orog) + expected_error = ("foo coordinate for factory is not present on cube " + "bar") + with self.assertRaisesRegexp(ValueError, expected_error): + cube.add_aux_factory(factory) + class 
Test_remove_metadata(tests.IrisTest): def setUp(self): diff --git a/lib/iris/tests/unit/fileformats/netcdf/test__get_cf_var_data.py b/lib/iris/tests/unit/fileformats/netcdf/test__get_cf_var_data.py new file mode 100644 index 0000000000..6a1aeb9d61 --- /dev/null +++ b/lib/iris/tests/unit/fileformats/netcdf/test__get_cf_var_data.py @@ -0,0 +1,83 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see <http://www.gnu.org/licenses/>. +"""Unit tests for the `iris.fileformats.netcdf._get_cf_var_data` function.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else.
+import iris.tests as tests + +from dask.array import Array as dask_array +import numpy as np + +from iris._lazy_data import _limited_shape +import iris.fileformats.cf +from iris.fileformats.netcdf import _get_cf_var_data +from iris.tests import mock + + +class Test__get_cf_var_data(tests.IrisTest): + def setUp(self): + self.filename = 'DUMMY' + self.shape = (3, 240, 200) + self.expected_chunks = _limited_shape(self.shape) + + def _make(self, chunksizes): + cf_data = mock.Mock(_FillValue=None) + cf_data.chunking = mock.MagicMock(return_value=chunksizes) + cf_var = mock.MagicMock(spec=iris.fileformats.cf.CFVariable, + dtype=np.dtype('i4'), + cf_data=cf_data, + cf_name='DUMMY_VAR', + shape=self.shape) + return cf_var + + def test_cf_data_type(self): + chunks = [1, 12, 100] + cf_var = self._make(chunks) + lazy_data = _get_cf_var_data(cf_var, self.filename) + self.assertIsInstance(lazy_data, dask_array) + + def test_cf_data_chunks(self): + chunks = [1, 12, 100] + cf_var = self._make(chunks) + lazy_data = _get_cf_var_data(cf_var, self.filename) + lazy_data_chunks = [c[0] for c in lazy_data.chunks] + self.assertArrayEqual(chunks, lazy_data_chunks) + + def test_cf_data_no_chunks(self): + # No chunks means chunks are calculated from the array's shape by + # `iris._lazy_data._limited_shape()`. + chunks = None + cf_var = self._make(chunks) + lazy_data = _get_cf_var_data(cf_var, self.filename) + lazy_data_chunks = [c[0] for c in lazy_data.chunks] + self.assertArrayEqual(lazy_data_chunks, self.expected_chunks) + + def test_cf_data_contiguous(self): + # Chunks 'contiguous' is equivalent to no chunks. 
+ chunks = 'contiguous' + cf_var = self._make(chunks) + lazy_data = _get_cf_var_data(cf_var, self.filename) + lazy_data_chunks = [c[0] for c in lazy_data.chunks] + self.assertArrayEqual(lazy_data_chunks, self.expected_chunks) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/fileformats/netcdf/test__load_cube.py b/lib/iris/tests/unit/fileformats/netcdf/test__load_cube.py index 1e38baac86..603547b707 100644 --- a/lib/iris/tests/unit/fileformats/netcdf/test__load_cube.py +++ b/lib/iris/tests/unit/fileformats/netcdf/test__load_cube.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2017, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. # @@ -57,6 +57,7 @@ def setUp(self): def _make(self, names, attrs): coords = [DimCoord(i, long_name=name) for i, name in enumerate(names)] + shape = (1,) cf_group = {} for name, cf_attrs in zip(names, attrs): @@ -64,12 +65,14 @@ def _make(self, names, attrs): cf_group[name] = mock.Mock(cf_attrs_unused=cf_attrs_unused) cf = mock.Mock(cf_group=cf_group) + cf_data = mock.Mock(_FillValue=None) + cf_data.chunking = mock.MagicMock(return_value=shape) cf_var = mock.MagicMock(spec=iris.fileformats.cf.CFVariable, dtype=np.dtype('i4'), - cf_data=mock.Mock(_FillValue=None), + cf_data=cf_data, cf_name='DUMMY_VAR', cf_group=coords, - shape=(1,)) + shape=shape) return cf, cf_var def test_flag_pass_thru(self): @@ -129,14 +132,17 @@ def setUp(self): self.valid_max = mock.sentinel.valid_max def _make(self, attrs): + shape = (1,) cf_attrs_unused = mock.Mock(return_value=attrs) + cf_data = mock.Mock(_FillValue=None) + cf_data.chunking = mock.MagicMock(return_value=shape) cf_var = mock.MagicMock(spec=iris.fileformats.cf.CFVariable, dtype=np.dtype('i4'), - cf_data=mock.Mock(_FillValue=None), + cf_data=cf_data, cf_name='DUMMY_VAR', cf_group=mock.Mock(), cf_attrs_unused=cf_attrs_unused, - shape=(1,)) + shape=shape) return cf_var def test_flag_pass_thru(self): diff --git 
a/lib/iris/tests/unit/fileformats/pp/test_save.py b/lib/iris/tests/unit/fileformats/pp/test_save.py index 0784132915..361c01f936 100644 --- a/lib/iris/tests/unit/fileformats/pp/test_save.py +++ b/lib/iris/tests/unit/fileformats/pp/test_save.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2017, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. # @@ -19,6 +19,9 @@ from __future__ import (absolute_import, division, print_function) from six.moves import (filter, input, map, range, zip) # noqa +import cftime +import cf_units + # Import iris.tests first so that some things can be initialised before # importing anything else. import iris.tests as tests @@ -26,7 +29,7 @@ from iris.coords import DimCoord, CellMethod from iris.fileformats._ff_cross_references import STASH_TRANS import iris.fileformats.pp as pp -from iris.fileformats.pp_save_rules import _lbproc_rules +from iris.fileformats.pp_save_rules import _lbproc_rules, verify from iris.tests import mock import iris.tests.stock as stock @@ -198,5 +201,140 @@ def test_maximum(self): self.assertEqual(lbproc, 8192) +class TestTimeMean(tests.IrisTest): + ''' + Tests that time mean cell method is converted to pp appropriately. + + Pattern is pairs of tests - one with time mean method, and one without, to + show divergent behaviour. 
+ + ''' + def test_t1_time_mean(self): + cube = _get_single_time_cube(set_time_mean=True) + tc = cube.coord(axis='t') + expected = tc.units.num2date(0) + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.t1 + + self.assertEqual(expected, actual) + + def test_t1_no_time_mean(self): + cube = _get_single_time_cube() + tc = cube.coord(axis='t') + expected = tc.units.num2date(15) + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.t1 + + self.assertEqual(expected, actual) + + def test_t2_time_mean(self): + cube = _get_single_time_cube(set_time_mean=True) + tc = cube.coord(axis='t') + expected = tc.units.num2date(30) + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.t2 + + self.assertEqual(expected, actual) + + def test_t2_no_time_mean(self): + cube = _get_single_time_cube(set_time_mean=False) + expected = cftime.datetime(0, 0, 0) + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.t2 + self.assertEqual(expected, actual) + + def test_lbft_no_forecast_time(self): + # Different pattern here: checking that lbft hasn't been changed from + # the default value. 
+ cube = _get_single_time_cube() + mock_lbft = mock.sentinel.lbft + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + pp_field.lbft = mock_lbft + verify(cube, pp_field) + actual = pp_field.lbft + + assert(mock_lbft is actual) + + def test_lbtim_no_time_mean(self): + cube = _get_single_time_cube() + expected_ib = 0 + expected_ic = 2 # 360 day calendar + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual_ib = pp_field.lbtim.ib + actual_ic = pp_field.lbtim.ic + + self.assertEqual(expected_ib, actual_ib) + self.assertEqual(expected_ic, actual_ic) + + def test_lbtim_time_mean(self): + cube = _get_single_time_cube(set_time_mean=True) + expected_ib = 2 # Time mean + expected_ic = 2 # 360 day calendar + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual_ib = pp_field.lbtim.ib + actual_ic = pp_field.lbtim.ic + + self.assertEqual(expected_ib, actual_ib) + self.assertEqual(expected_ic, actual_ic) + + def test_lbproc_no_time_mean(self): + cube = _get_single_time_cube() + expected = 0 + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.lbproc + + self.assertEqual(expected, actual) + + def test_lbproc_time_mean(self): + cube = _get_single_time_cube(set_time_mean=True) + expected = 128 + + with mock.patch('iris.fileformats.pp.PPField3', + autospec=True) as pp_field: + verify(cube, pp_field) + actual = pp_field.lbproc + + self.assertEqual(expected, actual) + + +def _get_single_time_cube(set_time_mean=False): + cube = stock.realistic_3d()[0:1, :, :] + cube.remove_coord('time') + cube.remove_coord('forecast_period') + tc = DimCoord( + points=[15, ], + standard_name='time', + units=cf_units.Unit('days since epoch', calendar='360_day'), + bounds=[[0, 30], ], + ) + cube.add_dim_coord(tc, 0) + if set_time_mean: + cube.cell_methods = (CellMethod("mean", 
coords='time'), ) + return cube + + if __name__ == "__main__": tests.main() diff --git a/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_auxiliary_coordinate.py b/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_auxiliary_coordinate.py index a796621cc0..4bb653f23f 100644 --- a/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_auxiliary_coordinate.py +++ b/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_auxiliary_coordinate.py @@ -40,11 +40,13 @@ class TestBoundsVertexDim(tests.IrisTest): def setUp(self): # Create coordinate cf variables and pyke engine. points = np.arange(6).reshape(2, 3) + + cf_data = self._make_cf_data(points) self.cf_coord_var = mock.Mock( spec=CFVariable, dimensions=('foo', 'bar'), cf_name='wibble', - cf_data=mock.Mock(), + cf_data=cf_data, standard_name=None, long_name='wibble', units='m', @@ -54,7 +56,8 @@ def setUp(self): self.engine = mock.Mock( cube=mock.Mock(), - cf_var=mock.Mock(dimensions=('foo', 'bar')), + cf_var=mock.Mock(dimensions=('foo', 'bar'), + cf_data=cf_data), filename='DUMMY', provides=dict(coordinates=[])) @@ -72,14 +75,21 @@ def patched__getitem__(proxy_self, keys): 'iris.fileformats.netcdf.NetCDFDataProxy.__getitem__', new=patched__getitem__) + @staticmethod + def _make_cf_data(vals): + cf_data = mock.Mock(_FillValue=None) + cf_data.chunking = mock.MagicMock(return_value=vals.shape) + return cf_data + def test_slowest_varying_vertex_dim(self): # Create the bounds cf variable. 
bounds = np.arange(24).reshape(4, 2, 3) + cf_data = self._make_cf_data(bounds) self.cf_bounds_var = mock.Mock( spec=CFVariable, dimensions=('nv', 'foo', 'bar'), cf_name='wibble_bnds', - cf_data=mock.Mock(), + cf_data=cf_data, shape=bounds.shape, dtype=bounds.dtype, __getitem__=lambda self, key: bounds[key]) @@ -116,11 +126,12 @@ def test_slowest_varying_vertex_dim(self): def test_fastest_varying_vertex_dim(self): bounds = np.arange(24).reshape(2, 3, 4) + cf_data = self._make_cf_data(bounds) self.cf_bounds_var = mock.Mock( spec=CFVariable, dimensions=('foo', 'bar', 'nv'), cf_name='wibble_bnds', - cf_data=mock.Mock(), + cf_data=cf_data, shape=bounds.shape, dtype=bounds.dtype, __getitem__=lambda self, key: bounds[key]) @@ -155,11 +166,12 @@ def test_fastest_with_different_dim_names(self): # which are 'foo' and 'bar' (as permitted by the cf spec), # this should still work because the vertex dim is the fastest varying. bounds = np.arange(24).reshape(2, 3, 4) + cf_data = self._make_cf_data(bounds) self.cf_bounds_var = mock.Mock( spec=CFVariable, dimensions=('x', 'y', 'nv'), cf_name='wibble_bnds', - cf_data=mock.Mock(), + cf_data=cf_data, shape=bounds.shape, dtype=bounds.dtype, __getitem__=lambda self, key: bounds[key]) @@ -194,11 +206,14 @@ class TestDtype(tests.IrisTest): def setUp(self): # Create coordinate cf variables and pyke engine. 
points = np.arange(6).reshape(2, 3) + cf_data = mock.Mock(_FillValue=None) + cf_data.chunking = mock.MagicMock(return_value=points.shape) + self.cf_coord_var = mock.Mock( spec=CFVariable, dimensions=('foo', 'bar'), cf_name='wibble', - cf_data=mock.Mock(), + cf_data=cf_data, standard_name=None, long_name='wibble', units='m', diff --git a/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_dimension_coordinate.py b/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_dimension_coordinate.py index c645f984cc..38239ea9a0 100644 --- a/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_dimension_coordinate.py +++ b/lib/iris/tests/unit/fileformats/pyke_rules/compiled_krb/fc_rules_cf_fc/test_build_dimension_coordinate.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2015, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. # @@ -112,6 +112,84 @@ def test_dim_coord_construction(self): self.engine.cube.add_dim_coord.assert_called_with( expected_coord, [0]) + def test_dim_coord_construction_masked_array(self): + self._set_cf_coord_var(np.ma.array( + np.arange(6), + mask=[True, False, False, False, False, False], + fill_value=-999, + )) + + expected_coord = DimCoord( + np.array([-999, 1, 2, 3, 4, 5]), + long_name=self.cf_coord_var.long_name, + var_name=self.cf_coord_var.cf_name, + units=self.cf_coord_var.units, + bounds=self.bounds) + + with warnings.catch_warnings(record=True) as w: + # Asserts must lie within context manager because of deferred + # loading. + with self.deferred_load_patch, self.get_cf_bounds_var_patch: + build_dimension_coordinate(self.engine, self.cf_coord_var) + + # Test that expected coord is built and added to cube. 
+ self.engine.cube.add_dim_coord.assert_called_with( + expected_coord, [0]) + + # Assert warning is raised + assert len(w) == 1 + assert 'Gracefully filling' in w[0].message.args[0] + + def test_dim_coord_construction_masked_array_mask_does_nothing(self): + self._set_cf_coord_var(np.ma.array( + np.arange(6), + mask=False, + )) + + expected_coord = DimCoord( + self.cf_coord_var[:], + long_name=self.cf_coord_var.long_name, + var_name=self.cf_coord_var.cf_name, + units=self.cf_coord_var.units, + bounds=self.bounds) + + with warnings.catch_warnings(record=True) as w: + # Asserts must lie within context manager because of deferred + # loading. + with self.deferred_load_patch, self.get_cf_bounds_var_patch: + build_dimension_coordinate(self.engine, self.cf_coord_var) + + # Test that expected coord is built and added to cube. + self.engine.cube.add_dim_coord.assert_called_with( + expected_coord, [0]) + + # Assert no warning is raised + assert len(w) == 0 + + def test_dim_coord_construction_masked_bounds_mask_does_nothing(self): + self.bounds = np.ma.array(np.arange(12).reshape(6, 2), mask=False) + self._set_cf_coord_var(np.arange(6)) + + expected_coord = DimCoord( + self.cf_coord_var[:], + long_name=self.cf_coord_var.long_name, + var_name=self.cf_coord_var.cf_name, + units=self.cf_coord_var.units, + bounds=self.bounds) + + with warnings.catch_warnings(record=True) as w: + # Asserts must lie within context manager because of deferred + # loading. + with self.deferred_load_patch, self.get_cf_bounds_var_patch: + build_dimension_coordinate(self.engine, self.cf_coord_var) + + # Test that expected coord is built and added to cube. + self.engine.cube.add_dim_coord.assert_called_with( + expected_coord, [0]) + + # Assert no warning is raised + assert len(w) == 0 + def test_aux_coord_construction(self): # Use non monotonically increasing coordinates to force aux coord # construction. 
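The masked-array tests above assert that, before a `DimCoord` is built, any masked points are filled with the array's fill value and a "Gracefully filling" warning is raised, while an all-`False` mask passes through silently. A minimal NumPy-only sketch of that fill step, outside of Iris (the helper name `fill_masked_points` is ours, not an Iris API):

```python
import warnings

import numpy as np


def fill_masked_points(points):
    # Hypothetical stand-in for the loader's fill step: DimCoord points must
    # be a plain ndarray, so masked entries are replaced by the array's fill
    # value, with a warning to flag the substitution.
    if np.ma.is_masked(points):
        warnings.warn("Gracefully filling masked points with fill value.")
        return points.filled()
    # An all-False mask (e.g. mask=False) is harmless: no warning, no change.
    return np.asarray(points)


masked = np.ma.array(np.arange(6), mask=[True] + [False] * 5, fill_value=-999)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    filled = fill_masked_points(masked)

unmasked = np.ma.array(np.arange(6), mask=False)
with warnings.catch_warnings(record=True) as caught_none:
    warnings.simplefilter("always")
    untouched = fill_masked_points(unmasked)
```

With the inputs above, `filled` starts with the fill value -999 and exactly one warning is recorded, whereas the `mask=False` case produces no warning, mirroring the paired tests.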
diff --git a/lib/iris/tests/unit/lazy_data/test_non_lazy.py b/lib/iris/tests/unit/lazy_data/test_non_lazy.py new file mode 100644 index 0000000000..7a736a0584 --- /dev/null +++ b/lib/iris/tests/unit/lazy_data/test_non_lazy.py @@ -0,0 +1,51 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see <http://www.gnu.org/licenses/>. +"""Test function :func:`iris._lazy_data.non_lazy`.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. +import iris.tests as tests + +import numpy as np + +from iris._lazy_data import as_lazy_data, is_lazy_data, non_lazy + + +class Test_non_lazy(tests.IrisTest): + def setUp(self): + self.array = np.arange(8).reshape(2, 4) + self.lazy_array = as_lazy_data(self.array) + self.func = non_lazy(lambda array: array.sum(axis=0)) + self.func_result = [4, 6, 8, 10] + + def test_lazy_input(self): + result = self.func(self.lazy_array) + self.assertFalse(is_lazy_data(result)) + self.assertArrayEqual(result, self.func_result) + + def test_non_lazy_input(self): + # Check that a non-lazy input doesn't trip up the functionality.
+ result = self.func(self.array) + self.assertFalse(is_lazy_data(result)) + self.assertArrayEqual(result, self.func_result) + + +if __name__ == '__main__': + tests.main() diff --git a/lib/iris/tests/unit/plot/_blockplot_common.py b/lib/iris/tests/unit/plot/_blockplot_common.py new file mode 100644 index 0000000000..dc69a5c991 --- /dev/null +++ b/lib/iris/tests/unit/plot/_blockplot_common.py @@ -0,0 +1,121 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see <http://www.gnu.org/licenses/>. +"""Common test code for `iris.plot.pcolor` and `iris.plot.pcolormesh`.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. +import iris.tests as tests + +import numpy as np + +from iris.tests import mock +from iris.tests.stock import simple_2d +from iris.tests.unit.plot import MixinCoords + + +class MixinStringCoordPlot(object): + # Mixin for common string-coord tests on pcolor/pcolormesh. + # To use, make a class that inherits from this *and* + # :class:`iris.tests.unit.plot.TestGraphicStringCoord`, + # and defines "self.blockplot_func()", to return the `iris.plot` function.
+ def test_yaxis_labels(self): + self.blockplot_func()(self.cube, coords=('bar', 'str_coord')) + self.assertBoundsTickLabels('yaxis') + + def test_xaxis_labels(self): + self.blockplot_func()(self.cube, coords=('str_coord', 'bar')) + self.assertBoundsTickLabels('xaxis') + + def test_xaxis_labels_with_axes(self): + import matplotlib.pyplot as plt + fig = plt.figure() + ax = fig.add_subplot(111) + ax.set_xlim(0, 3) + self.blockplot_func()(self.cube, coords=('str_coord', 'bar'), axes=ax) + plt.close(fig) + self.assertPointsTickLabels('xaxis', ax) + + def test_yaxis_labels_with_axes(self): + import matplotlib.pyplot as plt + fig = plt.figure() + ax = fig.add_subplot(111) + ax.set_ylim(0, 3) + self.blockplot_func()(self.cube, axes=ax, coords=('bar', 'str_coord')) + plt.close(fig) + self.assertPointsTickLabels('yaxis', ax) + + def test_geoaxes_exception(self): + import matplotlib.pyplot as plt + fig = plt.figure() + ax = fig.add_subplot(111) + self.assertRaises(TypeError, self.blockplot_func(), + self.lat_lon_cube, axes=ax) + plt.close(fig) + + +class Mixin2dCoordsPlot(MixinCoords): + # Mixin for common coordinate tests on pcolor/pcolormesh. + # To use, make a class that inherits from this *and* + # :class:`iris.tests.IrisTest`, + # and defines "self.blockplot_func()", to return the `iris.plot` function. + def blockplot_setup(self): + # We have a 2d cube with dimensionality (bar: 3; foo: 4) + self.cube = simple_2d(with_bounds=True) + coord = self.cube.coord('foo') + self.foo = coord.contiguous_bounds() + self.foo_index = np.arange(coord.points.size + 1) + coord = self.cube.coord('bar') + self.bar = coord.contiguous_bounds() + self.bar_index = np.arange(coord.points.size + 1) + self.data = self.cube.data + self.dataT = self.data.T + self.draw_func = self.blockplot_func() + patch_target_name = 'matplotlib.pyplot.' 
+ self.draw_func.__name__ + self.mpl_patch = self.patch(patch_target_name) + + +class Mixin2dCoordsContigTol(object): + # Mixin for contiguity tolerance argument to pcolor/pcolormesh. + # To use, make a class that inherits from this *and* + # :class:`iris.tests.IrisTest`, + # and defines "self.blockplot_func()", to return the `iris.plot` function, + # and defines "self.additional_kwargs" for expected extra call args. + def test_contig_tol(self): + # Patch the inner call to ensure contiguity_tolerance is passed. + cube_argument = mock.sentinel.passed_arg + expected_result = mock.sentinel.returned_value + blockplot_patch = self.patch( + 'iris.plot._draw_2d_from_bounds', + mock.Mock(return_value=expected_result)) + # Make the call + draw_func = self.blockplot_func() + other_kwargs = self.additional_kwargs + result = draw_func(cube_argument, contiguity_tolerance=0.0123) + drawfunc_name = draw_func.__name__ + # Check details of the call that was made. + self.assertEqual( + blockplot_patch.call_args_list, + [mock.call(drawfunc_name, cube_argument, + contiguity_tolerance=0.0123, **other_kwargs)]) + self.assertEqual(result, expected_result) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/plot/test__check_bounds_contiguity_and_mask.py b/lib/iris/tests/unit/plot/test__check_bounds_contiguity_and_mask.py new file mode 100644 index 0000000000..225aa2890e --- /dev/null +++ b/lib/iris/tests/unit/plot/test__check_bounds_contiguity_and_mask.py @@ -0,0 +1,111 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. 
+# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +"""Unit tests for the `iris.plot._check_bounds_contiguity_and_mask` +function.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. +import iris.tests as tests + +import mock + +import numpy as np +import numpy.ma as ma + +from iris.coords import DimCoord +from iris.tests.stock import (sample_2d_latlons, + make_bounds_discontiguous_at_point) + +if tests.MPL_AVAILABLE: + import iris.plot as iplt + + +@tests.skip_plot +class Test_check_bounds_contiguity_and_mask(tests.IrisTest): + def test_1d_not_checked(self): + # Test a 1D coordinate, which is not checked as atol is not set. + coord = DimCoord([1, 3, 5], bounds=[[0, 2], [2, 4], [5, 6]]) + data = np.array([278, 300, 282]) + iplt._check_bounds_contiguity_and_mask(coord, data) + + def test_1d_contiguous(self): + # Test a 1D coordinate which is contiguous. + coord = DimCoord([1, 3, 5], bounds=[[0, 2], [2, 4], [4, 6]]) + data = np.array([278, 300, 282]) + iplt._check_bounds_contiguity_and_mask(coord, data, atol=1e-3) + + def test_1d_discontigous_masked(self): + # Test a 1D coordinate which is discontiguous but masked at + # discontiguities. + coord = DimCoord([1, 3, 5], bounds=[[0, 2], [2, 4], [5, 6]]) + data = ma.array(np.array([278, 300, 282]), mask=[0, 1, 0]) + iplt._check_bounds_contiguity_and_mask(coord, data, atol=1e-3) + + def test_1d_discontigous_unmasked(self): + # Test a 1D coordinate which is discontiguous and unmasked at + # discontiguities. 
+ coord = DimCoord([1, 3, 5], bounds=[[0, 2], [2, 4], [5, 6]]) + data = ma.array(np.array([278, 300, 282]), mask=[1, 0, 0]) + msg = 'coordinate are not contiguous and data is not masked where ' \ + 'the discontiguity occurs' + with self.assertRaisesRegexp(ValueError, msg): + iplt._check_bounds_contiguity_and_mask(coord, data, atol=1e-3) + + def test_2d_contiguous(self): + # Test a 2D coordinate which is contiguous. + cube = sample_2d_latlons() + iplt._check_bounds_contiguity_and_mask(cube.coord('longitude'), + cube.data) + + def test_2d_contiguous_atol(self): + # Check the atol is passed correctly. + cube = sample_2d_latlons() + with mock.patch('iris.coords.Coord._discontiguity_in_bounds' + ) as discontiguity_check: + # Discontiguity returns two objects that are unpacked in + # `_check_bounds_contiguity_and_mask`. + discontiguity_check.return_value = [True, None] + iplt._check_bounds_contiguity_and_mask(cube.coord('longitude'), + cube.data, + atol=1e-3) + discontiguity_check.assert_called_with(atol=1e-3) + + def test_2d_discontigous_masked(self): + # Test a 2D coordinate which is discontiguous but masked at + # discontiguities. + cube = sample_2d_latlons() + make_bounds_discontiguous_at_point(cube, 3, 4) + iplt._check_bounds_contiguity_and_mask(cube.coord('longitude'), + cube.data) + + def test_2d_discontigous_unmasked(self): + # Test a 2D coordinate which is discontiguous and unmasked at + # discontiguities. 
+ cube = sample_2d_latlons() + make_bounds_discontiguous_at_point(cube, 3, 4) + msg = 'coordinate are not contiguous' + cube.data[3, 4] = ma.nomask + with self.assertRaisesRegexp(ValueError, msg): + iplt._check_bounds_contiguity_and_mask(cube.coord('longitude'), + cube.data) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/plot/test__get_plot_defn.py b/lib/iris/tests/unit/plot/test__get_plot_defn.py index 6dfb6f9f11..2947933c7d 100644 --- a/lib/iris/tests/unit/plot/test__get_plot_defn.py +++ b/lib/iris/tests/unit/plot/test__get_plot_defn.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2017, Met Office +# (C) British Crown Copyright 2017 - 2018, Met Office # # This file is part of Iris. # @@ -24,7 +24,7 @@ import iris.tests as tests import iris.coords -from iris.tests.stock import simple_2d +from iris.tests.stock import simple_2d, simple_2d_w_multidim_coords if tests.MPL_AVAILABLE: import iris.plot as iplt @@ -45,6 +45,12 @@ def test_axis_order_yx(self): self.assertEqual([coord.name() for coord in defn.coords], ['foo', 'bar']) + def test_2d_coords(self): + cube = simple_2d_w_multidim_coords() + defn = iplt._get_plot_defn(cube, iris.coords.BOUND_MODE) + self.assertEqual([coord.name() for coord in defn.coords], + ['bar', 'foo']) + if __name__ == "__main__": tests.main() diff --git a/lib/iris/tests/unit/plot/test__get_plot_defn_custom_coords_picked.py b/lib/iris/tests/unit/plot/test__get_plot_defn_custom_coords_picked.py new file mode 100644 index 0000000000..d0ce55a389 --- /dev/null +++ b/lib/iris/tests/unit/plot/test__get_plot_defn_custom_coords_picked.py @@ -0,0 +1,93 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. 
+# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +"""Unit tests for the `iris.plot._get_plot_defn_custom_coords_picked` +function.""" + + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. +import iris.tests as tests + +from iris.coords import BOUND_MODE, POINT_MODE +from iris.tests.stock import (simple_2d, simple_2d_w_multidim_coords, + hybrid_height) + + +if tests.MPL_AVAILABLE: + import iris.plot as iplt + + +@tests.skip_plot +class Test_get_plot_defn_custom_coords_picked(tests.IrisTest): + def test_1d_coords(self): + cube = simple_2d() + defn = iplt._get_plot_defn_custom_coords_picked(cube, ('foo', 'bar'), + POINT_MODE) + self.assertEqual([coord.name() for coord in defn.coords], + ['bar', 'foo']) + self.assertFalse(defn.transpose) + + def test_1d_coords_swapped(self): + cube = simple_2d() + defn = iplt._get_plot_defn_custom_coords_picked(cube, ('bar', 'foo'), + POINT_MODE) + self.assertEqual([coord.name() for coord in defn.coords], + ['foo', 'bar']) + self.assertTrue(defn.transpose) + + def test_1d_coords_as_integers(self): + cube = simple_2d() + defn = iplt._get_plot_defn_custom_coords_picked(cube, (1, 0), + POINT_MODE) + self.assertEqual([coord for coord in defn.coords], [0, 1]) + self.assertFalse(defn.transpose) + + def test_1d_coords_as_integers_swapped(self): + cube = simple_2d() + defn = iplt._get_plot_defn_custom_coords_picked(cube, (0, 1), + POINT_MODE) + self.assertEqual([coord for coord in defn.coords], [1, 0]) + self.assertTrue(defn.transpose) + + def 
test_2d_coords(self): + cube = simple_2d_w_multidim_coords() + defn = iplt._get_plot_defn_custom_coords_picked(cube, ('foo', 'bar'), + BOUND_MODE) + self.assertEqual([coord.name() for coord in defn.coords], + ['bar', 'foo']) + self.assertFalse(defn.transpose) + + def test_2d_coords_as_integers(self): + cube = simple_2d_w_multidim_coords() + defn = iplt._get_plot_defn_custom_coords_picked(cube, (0, 1), + BOUND_MODE) + self.assertEqual([coord for coord in defn.coords], [1, 0]) + self.assertTrue(defn.transpose) + + def test_span_check(self): + cube = hybrid_height() + emsg = 'don\'t span the 2 data dimensions' + with self.assertRaisesRegexp(ValueError, emsg): + iplt._get_plot_defn_custom_coords_picked( + cube, ('sigma', 'level_height'), POINT_MODE) + + +if __name__ == "__main__": + tests.main() diff --git a/lib/iris/tests/unit/plot/test_pcolor.py b/lib/iris/tests/unit/plot/test_pcolor.py index 46219965f9..da9a461ade 100644 --- a/lib/iris/tests/unit/plot/test_pcolor.py +++ b/lib/iris/tests/unit/plot/test_pcolor.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2016, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. 
# @@ -26,64 +26,38 @@ import numpy as np from iris.tests.stock import simple_2d -from iris.tests.unit.plot import TestGraphicStringCoord, MixinCoords +from iris.tests.unit.plot import TestGraphicStringCoord +from iris.tests.unit.plot._blockplot_common import \ + MixinStringCoordPlot, Mixin2dCoordsPlot, Mixin2dCoordsContigTol + if tests.MPL_AVAILABLE: import iris.plot as iplt + PLOT_FUNCTION_TO_TEST = iplt.pcolor @tests.skip_plot -class TestStringCoordPlot(TestGraphicStringCoord): - def test_yaxis_labels(self): - iplt.pcolor(self.cube, coords=('bar', 'str_coord')) - self.assertBoundsTickLabels('yaxis') - - def test_xaxis_labels(self): - iplt.pcolor(self.cube, coords=('str_coord', 'bar')) - self.assertBoundsTickLabels('xaxis') - - def test_xaxis_labels_with_axes(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - ax.set_xlim(0, 3) - iplt.pcolor(self.cube, coords=('str_coord', 'bar'), axes=ax) - plt.close(fig) - self.assertPointsTickLabels('xaxis', ax) - - def test_yaxis_labels_with_axes(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - ax.set_ylim(0, 3) - iplt.pcolor(self.cube, axes=ax, coords=('bar', 'str_coord')) - plt.close(fig) - self.assertPointsTickLabels('yaxis', ax) - - def test_geoaxes_exception(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - self.assertRaises(TypeError, iplt.pcolor, - self.lat_lon_cube, axes=ax) - plt.close(fig) +class TestStringCoordPlot(MixinStringCoordPlot, TestGraphicStringCoord): + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST @tests.skip_plot -class TestCoords(tests.IrisTest, MixinCoords): +class Test2dCoords(tests.IrisTest, Mixin2dCoordsPlot): def setUp(self): - # We have a 2d cube with dimensionality (bar: 3; foo: 4) - self.cube = simple_2d(with_bounds=True) - coord = self.cube.coord('foo') - self.foo = coord.contiguous_bounds() - self.foo_index = np.arange(coord.points.size + 1) - coord = 
self.cube.coord('bar') - self.bar = coord.contiguous_bounds() - self.bar_index = np.arange(coord.points.size + 1) - self.data = self.cube.data - self.dataT = self.data.T - self.mpl_patch = self.patch('matplotlib.pyplot.pcolor') - self.draw_func = iplt.pcolor + self.blockplot_setup() + + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST + + +@tests.skip_plot +class Test2dContigTol(tests.IrisTest, Mixin2dCoordsContigTol): + # Extra call kwargs expected. + additional_kwargs = dict(antialiased=True, snap=False) + + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST if __name__ == "__main__": diff --git a/lib/iris/tests/unit/plot/test_pcolormesh.py b/lib/iris/tests/unit/plot/test_pcolormesh.py index fb4ceffe28..613a559415 100644 --- a/lib/iris/tests/unit/plot/test_pcolormesh.py +++ b/lib/iris/tests/unit/plot/test_pcolormesh.py @@ -1,4 +1,4 @@ -# (C) British Crown Copyright 2014 - 2016, Met Office +# (C) British Crown Copyright 2014 - 2018, Met Office # # This file is part of Iris. 
# @@ -26,64 +26,38 @@ import numpy as np from iris.tests.stock import simple_2d -from iris.tests.unit.plot import TestGraphicStringCoord, MixinCoords +from iris.tests.unit.plot import TestGraphicStringCoord +from iris.tests.unit.plot._blockplot_common import \ + MixinStringCoordPlot, Mixin2dCoordsPlot, Mixin2dCoordsContigTol + if tests.MPL_AVAILABLE: import iris.plot as iplt + PLOT_FUNCTION_TO_TEST = iplt.pcolormesh @tests.skip_plot -class TestStringCoordPlot(TestGraphicStringCoord): - def test_yaxis_labels(self): - iplt.pcolormesh(self.cube, coords=('bar', 'str_coord')) - self.assertBoundsTickLabels('yaxis') - - def test_xaxis_labels(self): - iplt.pcolormesh(self.cube, coords=('str_coord', 'bar')) - self.assertBoundsTickLabels('xaxis') - - def test_xaxis_labels_with_axes(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - ax.set_xlim(0, 3) - iplt.pcolormesh(self.cube, axes=ax, coords=('str_coord', 'bar'),) - plt.close(fig) - self.assertPointsTickLabels('xaxis', ax) - - def test_yaxis_labels_with_axes(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - ax.set_ylim(0, 3) - iplt.pcolormesh(self.cube, axes=ax, coords=('bar', 'str_coord')) - plt.close(fig) - self.assertPointsTickLabels('yaxis', ax) - - def test_geoaxes_exception(self): - import matplotlib.pyplot as plt - fig = plt.figure() - ax = fig.add_subplot(111) - self.assertRaises(TypeError, iplt.pcolormesh, - self.lat_lon_cube, axes=ax) - plt.close(fig) +class TestStringCoordPlot(MixinStringCoordPlot, TestGraphicStringCoord): + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST @tests.skip_plot -class TestCoords(tests.IrisTest, MixinCoords): +class Test2dCoords(tests.IrisTest, Mixin2dCoordsPlot): def setUp(self): - # We have a 2d cube with dimensionality (bar: 3; foo: 4) - self.cube = simple_2d(with_bounds=True) - coord = self.cube.coord('foo') - self.foo = coord.contiguous_bounds() - self.foo_index = 
np.arange(coord.points.size + 1) - coord = self.cube.coord('bar') - self.bar = coord.contiguous_bounds() - self.bar_index = np.arange(coord.points.size + 1) - self.data = self.cube.data - self.dataT = self.data.T - self.mpl_patch = self.patch('matplotlib.pyplot.pcolormesh') - self.draw_func = iplt.pcolormesh + self.blockplot_setup() + + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST + + +@tests.skip_plot +class Test2dContigTol(tests.IrisTest, Mixin2dCoordsContigTol): + # Extra call kwargs expected -- unlike 'pcolor', there are none. + additional_kwargs = {} + + def blockplot_func(self): + return PLOT_FUNCTION_TO_TEST if __name__ == "__main__": diff --git a/lib/iris/tests/unit/util/test_reverse.py b/lib/iris/tests/unit/util/test_reverse.py new file mode 100644 index 0000000000..f5a21d259b --- /dev/null +++ b/lib/iris/tests/unit/util/test_reverse.py @@ -0,0 +1,185 @@ +# (C) British Crown Copyright 2018, Met Office +# +# This file is part of Iris. +# +# Iris is free software: you can redistribute it and/or modify it under +# the terms of the GNU Lesser General Public License as published by the +# Free Software Foundation, either version 3 of the License, or +# (at your option) any later version. +# +# Iris is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the +# GNU Lesser General Public License for more details. +# +# You should have received a copy of the GNU Lesser General Public License +# along with Iris. If not, see . +"""Test function :func:`iris.util.reverse`.""" + +from __future__ import (absolute_import, division, print_function) +from six.moves import (filter, input, map, range, zip) # noqa + +# Import iris.tests first so that some things can be initialised before +# importing anything else. 
+import iris.tests as tests + +import unittest + +import iris +from iris.util import reverse +import numpy as np + + +class Test_array(tests.IrisTest): + def test_simple_array(self): + a = np.arange(12).reshape(3, 4) + self.assertArrayEqual(a[::-1], reverse(a, 0)) + self.assertArrayEqual(a[::-1, ::-1], reverse(a, [0, 1])) + self.assertArrayEqual(a[:, ::-1], reverse(a, 1)) + self.assertArrayEqual(a[:, ::-1], reverse(a, [1])) + + msg = 'Reverse was expecting a single axis or a 1d array *' + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, []) + + msg = 'An axis value out of range for the number of dimensions *' + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, -1) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, 10) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, [-1]) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, [0, -1]) + + msg = 'To reverse an array, provide an int *' + with self.assertRaisesRegexp(TypeError, msg): + reverse(a, 'latitude') + + def test_single_array(self): + a = np.arange(36).reshape(3, 4, 3) + self.assertArrayEqual(a[::-1], reverse(a, 0)) + self.assertArrayEqual(a[::-1, ::-1], reverse(a, [0, 1])) + self.assertArrayEqual(a[:, ::-1, ::-1], reverse(a, [1, 2])) + self.assertArrayEqual(a[..., ::-1], reverse(a, 2)) + + msg = 'Reverse was expecting a single axis or a 1d array *' + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, []) + + msg = 'An axis value out of range for the number of dimensions *' + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, -1) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, 10) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, [-1]) + with self.assertRaisesRegexp(ValueError, msg): + reverse(a, [0, -1]) + + with self.assertRaisesRegexp( + TypeError, 'To reverse an array, provide an int *'): + reverse(a, 'latitude') + + +class Test_cube(tests.IrisTest): + def setUp(self): + # On this cube pair, the coordinates to 
perform operations on have + # matching long names but the points array on one cube is reversed + # with respect to that on the other. + data = np.arange(12).reshape(3, 4) + self.a1 = iris.coords.DimCoord([1, 2, 3], long_name='a') + self.b1 = iris.coords.DimCoord([1, 2, 3, 4], long_name='b') + a2 = iris.coords.DimCoord([3, 2, 1], long_name='a') + b2 = iris.coords.DimCoord([4, 3, 2, 1], long_name='b') + self.span = iris.coords.AuxCoord(np.arange(12).reshape(3, 4), + long_name='spanning') + + self.cube1 = iris.cube.Cube( + data, dim_coords_and_dims=[(self.a1, 0), (self.b1, 1)], + aux_coords_and_dims=[(self.span, (0, 1))]) + + self.cube2 = iris.cube.Cube( + data, dim_coords_and_dims=[(a2, 0), (b2, 1)]) + + def test_cube_dim(self): + cube1_reverse0 = reverse(self.cube1, 0) + cube1_reverse1 = reverse(self.cube1, 1) + cube1_reverse_both = reverse(self.cube1, (0, 1)) + + self.assertArrayEqual(self.cube1.data[::-1], cube1_reverse0.data) + self.assertArrayEqual(self.cube2.coord('a').points, + cube1_reverse0.coord('a').points) + self.assertArrayEqual(self.cube1.coord('b').points, + cube1_reverse0.coord('b').points) + + self.assertArrayEqual(self.cube1.data[:, ::-1], cube1_reverse1.data) + self.assertArrayEqual(self.cube1.coord('a').points, + cube1_reverse1.coord('a').points) + self.assertArrayEqual(self.cube2.coord('b').points, + cube1_reverse1.coord('b').points) + + self.assertArrayEqual(self.cube1.data[::-1, ::-1], + cube1_reverse_both.data) + self.assertArrayEqual(self.cube2.coord('a').points, + cube1_reverse_both.coord('a').points) + self.assertArrayEqual(self.cube2.coord('b').points, + cube1_reverse_both.coord('b').points) + + def test_cube_coord(self): + cube1_reverse0 = reverse(self.cube1, self.a1) + cube1_reverse1 = reverse(self.cube1, 'b') + cube1_reverse_both = reverse(self.cube1, (self.a1, self.b1)) + cube1_reverse_spanning = reverse(self.cube1, 'spanning') + + self.assertArrayEqual(self.cube1.data[::-1], cube1_reverse0.data) + 
self.assertArrayEqual(self.cube2.coord('a').points, + cube1_reverse0.coord('a').points) + self.assertArrayEqual(self.cube1.coord('b').points, + cube1_reverse0.coord('b').points) + + self.assertArrayEqual(self.cube1.data[:, ::-1], cube1_reverse1.data) + self.assertArrayEqual(self.cube1.coord('a').points, + cube1_reverse1.coord('a').points) + self.assertArrayEqual(self.cube2.coord('b').points, + cube1_reverse1.coord('b').points) + + self.assertArrayEqual(self.cube1.data[::-1, ::-1], + cube1_reverse_both.data) + self.assertArrayEqual(self.cube2.coord('a').points, + cube1_reverse_both.coord('a').points) + self.assertArrayEqual(self.cube2.coord('b').points, + cube1_reverse_both.coord('b').points) + + self.assertArrayEqual(self.cube1.data[::-1, ::-1], + cube1_reverse_spanning.data) + self.assertArrayEqual(self.cube2.coord('a').points, + cube1_reverse_spanning.coord('a').points) + self.assertArrayEqual(self.cube2.coord('b').points, + cube1_reverse_spanning.coord('b').points) + self.assertArrayEqual( + self.span.points[::-1, ::-1], + cube1_reverse_spanning.coord('spanning').points) + + msg = 'Expected to find exactly 1 latitude coordinate, but found none.' + with self.assertRaisesRegexp( + iris.exceptions.CoordinateNotFoundError, msg): + reverse(self.cube1, 'latitude') + + msg = 'Reverse was expecting a single axis or a 1d array *' + with self.assertRaisesRegexp(ValueError, msg): + reverse(self.cube1, []) + + msg = ('coords_or_dims must be int, str, coordinate or sequence of ' + 'these. Got cube.') + with self.assertRaisesRegexp(TypeError, msg): + reverse(self.cube1, self.cube1) + + msg = ('coords_or_dims must be int, str, coordinate or sequence of ' + 'these.') + with self.assertRaisesRegexp(TypeError, msg): + reverse(self.cube1, 3.) 
+ + +if __name__ == '__main__': + unittest.main() diff --git a/lib/iris/util.py b/lib/iris/util.py index 4f4c42dc4b..7e5b2c36a8 100644 --- a/lib/iris/util.py +++ b/lib/iris/util.py @@ -41,6 +41,7 @@ import iris import iris.exceptions +import iris.cube def broadcast_to_shape(array, shape, dim_map): @@ -406,16 +407,19 @@ def between(lh, rh, lh_inclusive=True, rh_inclusive=True): return lambda c: lh < c < rh -def reverse(array, axes): +def reverse(cube_or_array, coords_or_dims): """ - Reverse the array along the given axes. + Reverse the cube or array along the given dimensions. Args: - * array - The array to reverse - * axes - A single value or array of values of axes to reverse + * cube_or_array: :class:`iris.cube.Cube` or :class:`numpy.ndarray` + The cube or array to reverse. + * coords_or_dims: int, str, :class:`iris.coords.Coord` or sequence of these + Identify one or more dimensions to reverse. If cube_or_array is a + numpy array, use int or a sequence of ints, as in the examples below. + If cube_or_array is a Cube, a Coord or coordinate name (or sequence of + these) may be specified instead. :: @@ -447,20 +451,42 @@ def reverse(array, axes): [15 14 13 12]]] """ - index = [slice(None, None)] * array.ndim - axes = np.array(axes, ndmin=1) - if axes.ndim != 1: + index = [slice(None, None)] * cube_or_array.ndim + + if isinstance(coords_or_dims, iris.cube.Cube): + raise TypeError('coords_or_dims must be int, str, coordinate or ' + 'sequence of these. 
Got cube.') + + if iris.cube._is_single_item(coords_or_dims): + coords_or_dims = [coords_or_dims] + + axes = set() + for coord_or_dim in coords_or_dims: + if isinstance(coord_or_dim, int): + axes.add(coord_or_dim) + elif isinstance(cube_or_array, np.ndarray): + raise TypeError( + 'To reverse an array, provide an int or sequence of ints.') + else: + try: + axes.update(cube_or_array.coord_dims(coord_or_dim)) + except AttributeError: + raise TypeError('coords_or_dims must be int, str, coordinate ' + 'or sequence of these.') + + axes = np.array(list(axes), ndmin=1) + if axes.ndim != 1 or axes.size == 0: raise ValueError('Reverse was expecting a single axis or a 1d array ' 'of axes, got %r' % axes) - if np.min(axes) < 0 or np.max(axes) > array.ndim-1: + if np.min(axes) < 0 or np.max(axes) > cube_or_array.ndim-1: raise ValueError('An axis value out of range for the number of ' 'dimensions from the given array (%s) was received. ' - 'Got: %r' % (array.ndim, axes)) + 'Got: %r' % (cube_or_array.ndim, axes)) for axis in axes: index[axis] = slice(None, None, -1) - return array[tuple(index)] + return cube_or_array[tuple(index)] def monotonic(array, strict=False, return_direction=False): diff --git a/requirements/all.txt b/requirements/all.txt index bf9ba34e4e..657be9ce19 100644 --- a/requirements/all.txt +++ b/requirements/all.txt @@ -5,7 +5,7 @@ #conda: esmpy>=7.0 (only python=2) #gdal : under review -- not tested at present mo_pack -nc-time-axis # conda: nc_time_axis +nc-time-axis pandas stratify #conda: python-stratify pyugrid diff --git a/requirements/core.txt b/requirements/core.txt index c28cdb04a1..09ffb8954c 100644 --- a/requirements/core.txt +++ b/requirements/core.txt @@ -4,10 +4,11 @@ # Without these, iris won't even import. 
cartopy +#conda: proj4<5 cf_units>=2 cftime -dask[array]>=0.17.1 #conda: dask>=0.17.1 -matplotlib>=2 +dask[array] #conda: dask +matplotlib>=2,<3 netcdf4 numpy>=1.14 scipy diff --git a/requirements/gen_conda_requirements.py b/requirements/gen_conda_requirements.py index a6a5fd84aa..e7ad2654d5 100644 --- a/requirements/gen_conda_requirements.py +++ b/requirements/gen_conda_requirements.py @@ -28,7 +28,7 @@ def read_conda_reqs(fname, options): with open(fname, 'r') as fh: for line in fh: line = line.strip() - if '#conda:' in line: + if CONDA_PATTERN in line: line_start = line.index(CONDA_PATTERN) + len(CONDA_PATTERN) line = line[line_start:].strip() if 'only python=2' in line:
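The headline change in this patch, the reworked `iris.util.reverse`, boils down to one slice trick: build a full-slice index for every dimension, then give each requested axis a step of -1. A minimal standalone sketch of that mechanism in plain NumPy (the helper name `reverse_array` and its error messages are illustrative, not Iris's own):

```python
import numpy as np


def reverse_array(array, axes):
    # Sketch of the slice trick used by iris.util.reverse: start from a
    # full slice on every dimension, then give the requested axes a
    # step of -1 so that indexing flips them.
    if isinstance(axes, int):
        axes = [axes]
    axes = set(axes)
    if not axes:
        raise ValueError('expected a single axis or a sequence of axes')
    if any(axis < 0 or axis >= array.ndim for axis in axes):
        raise ValueError('axis out of range for a %d-d array' % array.ndim)
    index = [slice(None)] * array.ndim
    for axis in axes:
        index[axis] = slice(None, None, -1)
    return array[tuple(index)]


a = np.arange(12).reshape(3, 4)
assert (reverse_array(a, 0) == a[::-1]).all()
assert (reverse_array(a, [0, 1]) == a[::-1, ::-1]).all()
```

As in the patched `reverse`, negative axes are rejected rather than wrapped, so the error cases exercised in `test_reverse.py` (`-1`, `[]`, out-of-range axes) map directly onto the two `ValueError` branches above.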