Merge remote-tracking branch 'upstream/main' into feature/rolling-pad
* upstream/main:
  Save groupby codes after factorizing, pass to flox (pydata#7206)
  [skip-ci] Add compute to groupby benchmarks (pydata#7690)
  Delete built-in cfgrib backend (pydata#7670)
  Added a pronunciation guide to the word Xarray in the README.MD fil… (pydata#7677)
  boundarynorm fix (pydata#7553)
  Fix lazy negative slice rewriting. (pydata#7586)
  [pre-commit.ci] pre-commit autoupdate (pydata#7687)
  Adjust sidebar font colors (pydata#7674)
  Bump pypa/gh-action-pypi-publish from 1.8.1 to 1.8.3 (pydata#7682)
  Raise PermissionError when insufficient permissions (pydata#7629)
dcherian committed Mar 29, 2023
2 parents b963b29 + 0ac5541 commit d02d2fd
Showing 38 changed files with 467 additions and 531 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/pypi-release.yaml
@@ -72,7 +72,7 @@ jobs:
- name: Publish package to TestPyPI
if: github.event_name == 'push'
-uses: pypa/[email protected]
+uses: pypa/[email protected]
with:
user: __token__
password: ${{ secrets.TESTPYPI_TOKEN }}
@@ -90,7 +90,7 @@ jobs:
name: releases
path: dist
- name: Publish package to PyPI
-uses: pypa/[email protected]
+uses: pypa/[email protected]
with:
user: __token__
password: ${{ secrets.PYPI_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/testpypi-release.yaml
@@ -78,7 +78,7 @@ jobs:
- name: Publish package to TestPyPI
if: github.event_name == 'push'
-uses: pypa/[email protected]
+uses: pypa/[email protected]
with:
user: __token__
password: ${{ secrets.TESTPYPI_TOKEN }}
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -16,7 +16,7 @@ repos:
files: ^xarray/
- repo: https://github.com/charliermarsh/ruff-pre-commit
# Ruff version.
-rev: 'v0.0.257'
+rev: 'v0.0.259'
hooks:
- id: ruff
args: ["--fix"]
2 changes: 1 addition & 1 deletion README.md
@@ -11,7 +11,7 @@
[![Examples on binder](https://img.shields.io/badge/launch-binder-579ACA.svg?logo=data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFkAAABZCAMAAABi1XidAAAB8lBMVEX///9XmsrmZYH1olJXmsr1olJXmsrmZYH1olJXmsr1olJXmsrmZYH1olL1olJXmsr1olJXmsrmZYH1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olJXmsrmZYH1olL1olL0nFf1olJXmsrmZYH1olJXmsq8dZb1olJXmsrmZYH1olJXmspXmspXmsr1olL1olJXmsrmZYH1olJXmsr1olL1olJXmsrmZYH1olL1olLeaIVXmsrmZYH1olL1olL1olJXmsrmZYH1olLna31Xmsr1olJXmsr1olJXmsrmZYH1olLqoVr1olJXmsr1olJXmsrmZYH1olL1olKkfaPobXvviGabgadXmsqThKuofKHmZ4Dobnr1olJXmsr1olJXmspXmsr1olJXmsrfZ4TuhWn1olL1olJXmsqBi7X1olJXmspZmslbmMhbmsdemsVfl8ZgmsNim8Jpk8F0m7R4m7F5nLB6jbh7jbiDirOEibOGnKaMhq+PnaCVg6qWg6qegKaff6WhnpKofKGtnomxeZy3noG6dZi+n3vCcpPDcpPGn3bLb4/Mb47UbIrVa4rYoGjdaIbeaIXhoWHmZYHobXvpcHjqdHXreHLroVrsfG/uhGnuh2bwj2Hxk17yl1vzmljzm1j0nlX1olL3AJXWAAAAbXRSTlMAEBAQHx8gICAuLjAwMDw9PUBAQEpQUFBXV1hgYGBkcHBwcXl8gICAgoiIkJCQlJicnJ2goKCmqK+wsLC4usDAwMjP0NDQ1NbW3Nzg4ODi5+3v8PDw8/T09PX29vb39/f5+fr7+/z8/Pz9/v7+zczCxgAABC5JREFUeAHN1ul3k0UUBvCb1CTVpmpaitAGSLSpSuKCLWpbTKNJFGlcSMAFF63iUmRccNG6gLbuxkXU66JAUef/9LSpmXnyLr3T5AO/rzl5zj137p136BISy44fKJXuGN/d19PUfYeO67Znqtf2KH33Id1psXoFdW30sPZ1sMvs2D060AHqws4FHeJojLZqnw53cmfvg+XR8mC0OEjuxrXEkX5ydeVJLVIlV0e10PXk5k7dYeHu7Cj1j+49uKg7uLU61tGLw1lq27ugQYlclHC4bgv7VQ+TAyj5Zc/UjsPvs1sd5cWryWObtvWT2EPa4rtnWW3JkpjggEpbOsPr7F7EyNewtpBIslA7p43HCsnwooXTEc3UmPmCNn5lrqTJxy6nRmcavGZVt/3Da2pD5NHvsOHJCrdc1G2r3DITpU7yic7w/7Rxnjc0kt5GC4djiv2Sz3Fb2iEZg41/ddsFDoyuYrIkmFehz0HR2thPgQqMyQYb2OtB0WxsZ3BeG3+wpRb1vzl2UYBog8FfGhttFKjtAclnZYrRo9ryG9uG/FZQU4AEg8ZE9LjGMzTmqKXPLnlWVnIlQQTvxJf8ip7VgjZjyVPrjw1te5otM7RmP7xm+sK2Gv9I8Gi++BRbEkR9EBw8zRUcKxwp73xkaLiqQb+kGduJTNHG72zcW9LoJgqQxpP3/Tj//c3yB0tqzaml05/+orHLksVO+95kX7/7qgJvnjlrfr2Ggsyx0eoy9uPzN5SPd86aXggOsEKW2Prz7du3VID3/tzs/sSRs2w7ovVHKtjrX2pd7ZMlTxAYfBAL9jiDwfLkq55Tm7ifhMlTGPyCAs7RFRhn47JnlcB9RM5T97ASuZXIcVNuUDIndpDbdsfrqsOppeXl5Y+XVKdjFCTh+zGaVuj0d9zy05PPK3QzBamxdwtTCrzyg/2Rvf2EstUjordGwa/kx9mSJLr8mLLtCW8HHGJc2R5hS219IiF6PnTusOqcMl57gm0Z8kanKMAQ
g0qSyuZfn7zItsbGyO9QlnxY0eCuD1XL2ys/MsrQhltE7Ug0uFOzufJFE2PxBo/YAx8XPPdDwWN0MrDRYIZF0mSMKCNHgaIVFoBbNoLJ7tEQDKxGF0kcLQimojCZopv0OkNOyWCCg9XMVAi7ARJzQdM2QUh0gmBozjc3Skg6dSBRqDGYSUOu66Zg+I2fNZs/M3/f/Grl/XnyF1Gw3VKCez0PN5IUfFLqvgUN4C0qNqYs5YhPL+aVZYDE4IpUk57oSFnJm4FyCqqOE0jhY2SMyLFoo56zyo6becOS5UVDdj7Vih0zp+tcMhwRpBeLyqtIjlJKAIZSbI8SGSF3k0pA3mR5tHuwPFoa7N7reoq2bqCsAk1HqCu5uvI1n6JuRXI+S1Mco54YmYTwcn6Aeic+kssXi8XpXC4V3t7/ADuTNKaQJdScAAAAAElFTkSuQmCC)](https://mybinder.org/v2/gh/pydata/xarray/main?urlpath=lab/tree/doc/examples/weather-data.ipynb)
[![Twitter](https://img.shields.io/twitter/follow/xarray_dev?style=social)](https://twitter.com/xarray_dev)

-**xarray** (formerly **xray**) is an open source project and Python
+**xarray** (pronounced "ex-array", formerly known as **xray**) is an open source project and Python
package that makes working with labelled multi-dimensional arrays
simple, efficient, and fun!

24 changes: 13 additions & 11 deletions asv_bench/benchmarks/groupby.py
@@ -1,3 +1,5 @@
+# import flox to avoid the cost of first import
+import flox.xarray  # noqa
import numpy as np
import pandas as pd

@@ -27,24 +29,24 @@ def time_init(self, ndim):
@parameterized(["method", "ndim"], [("sum", "mean"), (1, 2)])
def time_agg_small_num_groups(self, method, ndim):
ds = getattr(self, f"ds{ndim}d")
-getattr(ds.groupby("a"), method)()
+getattr(ds.groupby("a"), method)().compute()

@parameterized(["method", "ndim"], [("sum", "mean"), (1, 2)])
def time_agg_large_num_groups(self, method, ndim):
ds = getattr(self, f"ds{ndim}d")
-getattr(ds.groupby("b"), method)()
+getattr(ds.groupby("b"), method)().compute()

def time_binary_op_1d(self):
-self.ds1d.groupby("b") - self.ds1d_mean
+(self.ds1d.groupby("b") - self.ds1d_mean).compute()

def time_binary_op_2d(self):
-self.ds2d.groupby("b") - self.ds2d_mean
+(self.ds2d.groupby("b") - self.ds2d_mean).compute()

def peakmem_binary_op_1d(self):
-self.ds1d.groupby("b") - self.ds1d_mean
+(self.ds1d.groupby("b") - self.ds1d_mean).compute()

def peakmem_binary_op_2d(self):
-self.ds2d.groupby("b") - self.ds2d_mean
+(self.ds2d.groupby("b") - self.ds2d_mean).compute()


class GroupByDask(GroupBy):
@@ -56,8 +58,8 @@ def setup(self, *args, **kwargs):
self.ds1d["c"] = self.ds1d["c"].chunk({"dim_0": 50})
self.ds2d = self.ds2d.sel(dim_0=slice(None, None, 2))
self.ds2d["c"] = self.ds2d["c"].chunk({"dim_0": 50, "z": 5})
-self.ds1d_mean = self.ds1d.groupby("b").mean()
-self.ds2d_mean = self.ds2d.groupby("b").mean()
+self.ds1d_mean = self.ds1d.groupby("b").mean().compute()
+self.ds2d_mean = self.ds2d.groupby("b").mean().compute()


class GroupByPandasDataFrame(GroupBy):
@@ -88,7 +90,7 @@ def setup(self, *args, **kwargs):
requires_dask()
super().setup(**kwargs)
self.ds1d = self.ds1d.chunk({"dim_0": 50}).to_dataframe()
-self.ds1d_mean = self.ds1d.groupby("b").mean()
+self.ds1d_mean = self.ds1d.groupby("b").mean().compute()

def time_binary_op_2d(self):
raise NotImplementedError
@@ -116,12 +118,12 @@ def time_init(self, ndim):
@parameterized(["method", "ndim"], [("sum", "mean"), (1, 2)])
def time_agg_small_num_groups(self, method, ndim):
ds = getattr(self, f"ds{ndim}d")
-getattr(ds.resample(time="3M"), method)()
+getattr(ds.resample(time="3M"), method)().compute()

@parameterized(["method", "ndim"], [("sum", "mean"), (1, 2)])
def time_agg_large_num_groups(self, method, ndim):
ds = getattr(self, f"ds{ndim}d")
-getattr(ds.resample(time="48H"), method)()
+getattr(ds.resample(time="48H"), method)().compute()


class ResampleDask(Resample):
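The `.compute()` calls added throughout these benchmarks matter because dask-backed xarray objects are lazy: a groupby reduction or binary op only builds a task graph, so timing the bare expression measures graph construction rather than the actual work. A minimal stand-in sketch (plain Python, not xarray's real classes; the `Lazy` class here is hypothetical) illustrates the pitfall:

```python
# Illustrative sketch of lazy evaluation: building an expression is
# cheap, and no real work happens until .compute() is called.
class Lazy:
    def __init__(self, func):
        self._func = func  # deferred work

    def __sub__(self, other):
        # Combining lazy objects just builds a bigger deferred expression.
        return Lazy(lambda: self._func() - other._func())

    def compute(self):
        # Only here does the actual computation run.
        return self._func()


a = Lazy(lambda: sum(range(1_000)))
b = Lazy(lambda: 1_000)

expr = a - b             # instant: nothing is computed yet
result = expr.compute()  # 498500

# A benchmark that times only `a - b` measures expression construction,
# not the computation itself -- hence the added `.compute()` calls.
```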
1 change: 0 additions & 1 deletion ci/requirements/all-but-dask.yml
@@ -10,7 +10,6 @@ dependencies:
- bottleneck
- cartopy
- cdms2
-- cfgrib
- cftime
- coveralls
- flox
1 change: 0 additions & 1 deletion ci/requirements/doc.yml
@@ -7,7 +7,6 @@ dependencies:
- python=3.10
- bottleneck
- cartopy
-- cfgrib>=0.9
- dask-core>=2022.1
- h5netcdf>=0.13
- ipykernel
1 change: 0 additions & 1 deletion ci/requirements/environment-py311.yml
@@ -8,7 +8,6 @@ dependencies:
- bottleneck
- cartopy
# - cdms2
-- cfgrib
- cftime
- dask-core
- distributed
1 change: 0 additions & 1 deletion ci/requirements/environment-windows-py311.yml
@@ -6,7 +6,6 @@ dependencies:
- bottleneck
- cartopy
# - cdms2 # Not available on Windows
-# - cfgrib # Causes Python interpreter crash on Windows: https://github.com/pydata/xarray/pull/3340
- cftime
- dask-core
- distributed
1 change: 0 additions & 1 deletion ci/requirements/environment-windows.yml
@@ -6,7 +6,6 @@ dependencies:
- bottleneck
- cartopy
# - cdms2 # Not available on Windows
-# - cfgrib # Causes Python interpreter crash on Windows: https://github.com/pydata/xarray/pull/3340
- cftime
- dask-core
- distributed
1 change: 0 additions & 1 deletion ci/requirements/environment.yml
@@ -8,7 +8,6 @@ dependencies:
- bottleneck
- cartopy
- cdms2
-- cfgrib
- cftime
- dask-core
- distributed
1 change: 0 additions & 1 deletion ci/requirements/min-all-deps.yml
@@ -12,7 +12,6 @@ dependencies:
- bottleneck=1.3
- cartopy=0.20
- cdms2=3.1
-- cfgrib=0.9
- cftime=1.5
- coveralls
- dask-core=2022.1
5 changes: 0 additions & 5 deletions doc/_static/style.css
@@ -172,19 +172,16 @@ https://github.com/bokeh/bokeh/blob/branch-2.4/sphinx/source/bokeh/static/custom
display: block;
padding: 0.25rem 1.5rem;
font-size: 90%;
-color: rgba(0, 0, 0, 0.65);
}

.bd-sidebar .nav > li > a:hover {
-color: rgba(0, 0, 0, 0.85);
text-decoration: none;
background-color: transparent;
}

.bd-sidebar .nav > .active > a,
.bd-sidebar .nav > .active:hover > a {
font-weight: 400;
-color: #130654;
/* adjusted from original
color: rgba(0, 0, 0, 0.85);
background-color: transparent; */
@@ -199,13 +196,11 @@ https://github.com/bokeh/bokeh/blob/branch-2.4/sphinx/source/bokeh/static/custom
display: block;
padding: 0.25rem 1.5rem;
font-size: 90%;
-color: rgba(0, 0, 0, 0.65);
}

.bd-sidebar .nav > li > ul > .active > a,
.bd-sidebar .nav > li > ul > .active:hover > a {
font-weight: 400;
-color: #542437;
}

dt:target {
2 changes: 0 additions & 2 deletions doc/getting-started-guide/installing.rst
@@ -45,8 +45,6 @@ For netCDF and IO
other gridded raster datasets.
- `iris <https://github.com/scitools/iris>`__: for conversion to and from iris'
Cube objects
-- `cfgrib <https://github.com/ecmwf/cfgrib>`__: for reading GRIB files via the
-*ECMWF ecCodes* library.

For accelerating xarray
~~~~~~~~~~~~~~~~~~~~~~~
2 changes: 1 addition & 1 deletion doc/user-guide/io.rst
@@ -1257,7 +1257,7 @@ GRIB format via cfgrib

Xarray supports reading GRIB files via ECMWF cfgrib_ python driver,
if it is installed. To open a GRIB file supply ``engine='cfgrib'``
-to :py:func:`open_dataset`:
+to :py:func:`open_dataset` after installing cfgrib_:

.. ipython::
:verbatim:
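The documentation hunk above points at an elided ipython example. A hedged usage sketch of what that call looks like (assumes `xarray`, `cfgrib`, and the ecCodes library are installed; the filename is made up):

```python
# Hypothetical sketch: opening a GRIB file with the external cfgrib
# backend. After this commit, engine="cfgrib" resolves through cfgrib's
# own backend entrypoint rather than a built-in xarray backend.
def open_grib(path):
    import xarray as xr  # imported lazily so the sketch stays importable

    # Returns an xarray.Dataset backed by the GRIB file.
    return xr.open_dataset(path, engine="cfgrib")


# ds = open_grib("example.grib")  # then use ds like any other Dataset
```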
8 changes: 8 additions & 0 deletions doc/whats-new.rst
@@ -39,6 +39,10 @@ Bug fixes

- Fix :py:meth:`xr.polyval` with non-system standard integer coeffs (:pull:`7619`).
By `Shreyal Gupta <https://github.com/Ravenin7>`_ and `Michael Niklas <https://github.com/headtr1ck>`_.
+- Improve error message when trying to open a file which you do not have permission to read (:issue:`6523`, :pull:`7629`).
+  By `Thomas Coleman <https://github.com/ColemanTom>`_.
+- Proper plotting when passing :py:class:`~matplotlib.colors.BoundaryNorm` type argument in :py:meth:`DataArray.plot` (:issue:`4061`, :issue:`7014`, :pull:`7553`).
+  By `Jelmer Veenstra <https://github.com/veenstrajelmer>`_.

Documentation
~~~~~~~~~~~~~
@@ -49,6 +53,10 @@ Documentation
Internal Changes
~~~~~~~~~~~~~~~~

+- Remove internal support for reading GRIB files through the ``cfgrib`` backend. ``cfgrib`` now uses the external
+  backend interface, so no existing code should break.
+  By `Deepak Cherian <https://github.com/dcherian>`_.

.. _whats-new.2023.03.0:

v2023.03.0 (March 22, 2023)
1 change: 0 additions & 1 deletion setup.cfg
@@ -89,7 +89,6 @@ io =
fsspec
cftime
rasterio
-cfgrib
pooch
## Scitools packages & dependencies (e.g: cartopy, cf-units) can be hard to install
# scitools-iris
2 changes: 0 additions & 2 deletions xarray/backends/__init__.py
@@ -3,7 +3,6 @@
DataStores provide a uniform interface for saving and loading data in different
formats. They should not be used directly, but rather through Dataset objects.
"""
-from xarray.backends.cfgrib_ import CfGribDataStore
from xarray.backends.common import AbstractDataStore, BackendArray, BackendEntrypoint
from xarray.backends.file_manager import (
CachingFileManager,
@@ -30,7 +29,6 @@
"BackendEntrypoint",
"FileManager",
"CachingFileManager",
-"CfGribDataStore",
"DummyFileManager",
"InMemoryDataStore",
"NetCDF4DataStore",
13 changes: 6 additions & 7 deletions xarray/backends/api.py
@@ -43,7 +43,7 @@
T_NetcdfEngine = Literal["netcdf4", "scipy", "h5netcdf"]
T_Engine = Union[
T_NetcdfEngine,
-Literal["pydap", "pynio", "pseudonetcdf", "cfgrib", "zarr"],
+Literal["pydap", "pynio", "pseudonetcdf", "zarr"],
type[BackendEntrypoint],
str, # no nice typing support for custom backends
None,
@@ -64,7 +64,6 @@
"h5netcdf": backends.H5NetCDFStore.open,
"pynio": backends.NioDataStore,
"pseudonetcdf": backends.PseudoNetCDFDataStore.open,
-"cfgrib": backends.CfGribDataStore,
"zarr": backends.ZarrStore.open_group,
}

@@ -387,7 +386,7 @@ def open_dataset(
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
+engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine
@@ -479,7 +478,7 @@ def open_dataset(
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-"scipy", "pynio", "pseudonetcdf", "cfgrib".
+"scipy", "pynio", "pseudonetcdf".
See engine open function for kwargs accepted by each specific engine.
@@ -576,7 +575,7 @@ def open_dataarray(
ends with .gz, in which case the file is gunzipped and opened with
scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like
objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
-engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
+engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine
@@ -666,7 +665,7 @@ def open_dataarray(
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
-"scipy", "pynio", "pseudonetcdf", "cfgrib".
+"scipy", "pynio", "pseudonetcdf".
See engine open function for kwargs accepted by each specific engine.
@@ -803,7 +802,7 @@ def open_mfdataset(
If provided, call this function on each dataset prior to concatenation.
You can find the file-name from which each dataset was loaded in
``ds.encoding["source"]``.
-engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", "cfgrib", \
+engine : {"netcdf4", "scipy", "pydap", "h5netcdf", "pynio", \
"pseudonetcdf", "zarr", None}, installed backend \
or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine