Add intake_xarray_kwargs to ThreddsCatalog #52
Conversation
andersy005 commented Sep 15, 2021 • edited
- Fixes #51 (add xarray_kwargs to ThreddsCatalog)
@raybellwaves, this is my attempt at addressing #51. I opted for intake_xarray_kwargs. Right now it appears that things are broken due to how fsspec is interfering with the netCDF4 engine:

In [1]: import intake

In [2]: cat_url = "https://thredds.ucar.edu/thredds/catalog/grib/NCEP/GFS/Global_0p25deg/GFS_Global_0p25deg_20210913_1800.grib2/catalog.xml"

In [4]: catalog = intake.open_thredds_cat(cat_url, driver="netcdf", intake_xarray_kwargs={'xarray_kwargs': {'engine': "netcdf4"}})

In [5]: source = catalog["GFS_Global_0p25deg_20210913_1800.grib2"]

In [7]: source.to_dask()
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-7-75a159db6bd7> in <module>
----> 1 source.to_dask()
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/intake_xarray/base.py in to_dask(self)
67 def to_dask(self):
68 """Return xarray object where variables are dask arrays"""
---> 69 return self.read_chunked()
70
71 def close(self):
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/intake_xarray/base.py in read_chunked(self)
42 def read_chunked(self):
43 """Return xarray object (which will have chunks)"""
---> 44 self._load_metadata()
45 return self._ds
46
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/intake/source/base.py in _load_metadata(self)
234 """load metadata only if needed"""
235 if self._schema is None:
--> 236 self._schema = self._get_schema()
237 self.dtype = self._schema.dtype
238 self.shape = self._schema.shape
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/intake_xarray/base.py in _get_schema(self)
16
17 if self._ds is None:
---> 18 self._open_dataset()
19
20 metadata = {
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/intake_xarray/netcdf.py in _open_dataset(self)
90 url = fsspec.open(self.urlpath, **self.storage_options).open()
91
---> 92 self._ds = _open_dataset(url, chunks=self.chunks, **kwargs)
93
94 def _add_path_to_ds(self, ds):
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/xarray/backends/api.py in open_dataset(filename_or_obj, engine, chunks, cache, decode_cf, mask_and_scale, decode_times, decode_timedelta, use_cftime, concat_characters, decode_coords, drop_variables, backend_kwargs, *args, **kwargs)
495
496 overwrite_encoded_chunks = kwargs.pop("overwrite_encoded_chunks", None)
--> 497 backend_ds = backend.open_dataset(
498 filename_or_obj,
499 drop_variables=drop_variables,
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/xarray/backends/netCDF4_.py in open_dataset(self, filename_or_obj, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, use_cftime, decode_timedelta, group, mode, format, clobber, diskless, persist, lock, autoclose)
549
550 filename_or_obj = _normalize_path(filename_or_obj)
--> 551 store = NetCDF4DataStore.open(
552 filename_or_obj,
553 mode=mode,
~/.mambaforge/envs/intake-thredds-dev/lib/python3.9/site-packages/xarray/backends/netCDF4_.py in open(cls, filename, mode, format, group, clobber, diskless, persist, lock, lock_maker, autoclose)
351
352 if not isinstance(filename, str):
--> 353 raise ValueError(
354 "can only read bytes or file-like objects "
355 "with engine='scipy' or 'h5netcdf'"
ValueError: can only read bytes or file-like objects with engine='scipy' or 'h5netcdf'
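
For context, the traceback shows where things go wrong: intake_xarray's NetCDFSource wraps the remote URL in a file-like object via fsspec.open(...).open(), but xarray's netcdf4 engine only accepts string paths. Below is a minimal sketch of the failure mode, and of the workaround the error message itself suggests; the URL is an illustrative placeholder (not the GFS catalog above), and it assumes h5netcdf is installed and the bytes are a netCDF4/HDF5 file:

import fsspec
import xarray as xr

# Roughly what intake_xarray does under the hood (see netcdf.py line 90 in the traceback):
url = "https://example.com/data.nc"  # illustrative placeholder
f = fsspec.open(url).open()  # returns a file-like object, not a path string

# The netcdf4 engine rejects file-like objects and raises the ValueError above:
# xr.open_dataset(f, engine="netcdf4")

# Engines that can read file-like objects work, provided the bytes really are
# netCDF4/HDF5 (for h5netcdf) or netCDF3 (for scipy):
ds = xr.open_dataset(f, engine="h5netcdf")

This is only a sketch of the failure mode, not of what this PR changes.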
aaronspring commented
Just fixed that one test. Can this PR be merged?
andersy005 commented
Thank you, @aaronspring! Let's go ahead and merge this as is. If there is any issue, we can address it later.