Commit

wip
tkknight committed Jul 24, 2023
1 parent 5cfbfc6 commit 1fe628d
Showing 2 changed files with 15 additions and 18 deletions.
2 changes: 1 addition & 1 deletion docs/src/conf.py
@@ -173,7 +173,7 @@ def _dotv(version):
# See https://numpydoc.readthedocs.io/en/latest/install.html#configuration
# TREMTEST
# numpydoc_use_plots = True # ?
-numpydoc_show_class_members = False # False stops lots of warnings
+numpydoc_show_class_members = False # False stops lots of warnings: https://stackoverflow.com/questions/65198998/sphinx-warning-autosummary-stub-file-not-found-for-the-methods-of-the-class-c
# numpydoc_show_inherited_class_members
# numpydoc_class_members_toctree
# numpydoc_citation_re
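
For context, a minimal sketch of how this option sits in a Sphinx conf.py; the extensions list here is illustrative, not the full Iris docs configuration:

    # docs/src/conf.py (minimal sketch, not the full Iris configuration)
    extensions = [
        "sphinx.ext.autodoc",
        "sphinx.ext.autosummary",
        "numpydoc",  # provides the numpydoc_* options used below
    ]

    # False suppresses the "autosummary: stub file not found" warnings for
    # class methods discussed in the Stack Overflow link above.
    numpydoc_show_class_members = False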
31 changes: 14 additions & 17 deletions lib/iris/fileformats/netcdf/saver.py
@@ -2582,7 +2582,7 @@ def save(
compute=True,
):
"""
-Save cube(s) to a netCDF file, given the cube and the filename.
+Save cube(s) to a netCDF file, given the cube and the filename. TREMTEST
* Iris will write CF 1.7 compliant NetCDF files.
* The attributes dictionaries on each cube in the saved cube list
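
For orientation, the simplest use of this function looks like the sketch below; the cube construction and file name are illustrative:

    import numpy as np
    import iris
    from iris.cube import Cube

    # A trivial in-memory cube stands in for real data.
    cube = Cube(np.arange(12.0).reshape(3, 4), var_name="tas")

    # Writes a CF-compliant netCDF file (NETCDF4 format by default).
    iris.save(cube, "example.nc")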
@@ -2609,9 +2609,7 @@ def save(
When saving to a dataset, ``compute`` **must** be ``False`` :
See the ``compute`` parameter.
Kwargs:
-* netcdf_format (string):
+netcdf_format : string
Underlying netCDF file format, one of 'NETCDF4', 'NETCDF4_CLASSIC',
'NETCDF3_CLASSIC' or 'NETCDF3_64BIT'. Default is 'NETCDF4' format.
local_keys : iterable of str
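
A minimal sketch of the netcdf_format keyword in use; the file names are illustrative:

    import iris

    cubes = iris.load("input.nc")  # any source Iris can load

    # Write the netCDF-3-compatible variant instead of the default NETCDF4.
    iris.save(cubes, "classic.nc", netcdf_format="NETCDF4_CLASSIC")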
@@ -2644,7 +2642,7 @@
chunksizes : tuple of int
Used to manually specify the HDF5 chunksizes for each dimension of the
variable. A detailed discussion of HDF chunking and I/O performance is
-available here: https://www.unidata.ucar.edu/software/netcdf/documentation/NUG/netcdf_perf_chunking.html.
+available here: TREMTEST
Basically, you want the chunk size for each dimension to match as
closely as possible the size of the data block that users will read
from the file. `chunksizes` cannot be set if `contiguous=True`.
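
As a sketch, assuming a 3-D (time, lat, lon) cube where users typically read one full horizontal field at a time:

    import iris

    cube = iris.load_cube("input.nc")  # assume dimensions (time, lat, lon)

    # One timestep per chunk, each horizontal field kept whole, to match
    # the expected "read a 2-D slice per request" access pattern.
    iris.save(cube, "chunked.nc",
              chunksizes=(1, cube.shape[1], cube.shape[2]))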
@@ -2656,15 +2654,15 @@
computer with the opposite format as the one used to create the file,
there may be some performance advantage to be gained by setting the
endian-ness.
-least_significant_digit : int)
+least_significant_digit : int
If `least_significant_digit` is specified, variable data will be
truncated (quantized). In conjunction with `zlib=True` this produces
'lossy', but significantly more efficient compression. For example, if
`least_significant_digit=1`, data will be quantized using
`numpy.around(scale*data)/scale`, where `scale = 2**bits`, and `bits`
is determined so that a precision of 0.1 is retained (in this case
`bits=4`). From
-http://www.esrl.noaa.gov/psd/data/gridded/conventions/cdc_netcdf_standard.shtml:
+TREMTEST:
"least_significant_digit -- power of ten of the smallest decimal place
in unpacked data that is a reliable value". Default is `None`, or no
quantization, or 'lossless' compression.
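
The quantization arithmetic above can be reproduced directly; the data values here are illustrative:

    import numpy as np

    data = np.array([273.15, 274.267, 275.01])

    # least_significant_digit=1 -> retain a precision of 0.1, so bits=4
    # (2**4 = 16 is the smallest power of two that resolves 0.1).
    scale = 2.0 ** 4
    quantized = np.around(scale * data) / scale
    print(quantized)  # [273.125 274.25 275.0] -- reliable only to ~0.1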
@@ -2673,7 +2671,7 @@
describes a numpy integer dtype (i.e. 'i2', 'short', 'u4') or a dict
of packing parameters as described below or an iterable of such types,
strings, or dicts. This provides support for netCDF data packing as
-described in https://www.unidata.ucar.edu/software/netcdf/documentation/NUG/best_practices.html#bp_Packed-Data-Values
+described in TREMTEST
If this argument is a type (or type string), appropriate values of
scale_factor and add_offset will be automatically calculated based
on `cube.data` and possible masking. For more control, pass a dict with
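
A sketch of both forms of the packing argument, with illustrative values:

    import iris

    cube = iris.load_cube("input.nc")

    # Simplest form: Iris derives scale_factor and add_offset for int16.
    iris.save(cube, "packed_auto.nc", packing="i2")

    # Explicit form: a dict of packing parameters (values are illustrative).
    iris.save(cube, "packed_manual.nc",
              packing=dict(dtype="i2", scale_factor=0.01, add_offset=273.15))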
@@ -2693,7 +2691,7 @@
:class:`iris.cube.CubeList`, or a single element, and each element of
this argument will be applied to each cube separately.
-compute: bool:
+compute: bool
Default is ``True``, meaning complete the file immediately, and return ``None``.
When ``False``, create the output file but don't write any lazy array content to
@@ -2716,11 +2714,12 @@
must (re-)open the dataset for writing, which will fail if the file is
still open for writing by the caller.
-Returns:
-result (None, or dask.delayed.Delayed):
-If `compute=True`, returns `None`.
-Otherwise returns a :class:`dask.delayed.Delayed`, which implements delayed
-writing to fill in the variables data.
+Returns
+-------
+result: None or dask.delayed.Delayed
+If `compute=True`, returns `None`.
+Otherwise returns a :class:`dask.delayed.Delayed`, which implements delayed
+writing to fill in the variables data.
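
A sketch of the deferred-write pattern this describes; the file names are illustrative:

    import iris

    cube = iris.load_cube("input.nc")  # lazy data remains lazy

    # Create the file, writing only the non-lazy content for now.
    delayed = iris.save(cube, "deferred.nc", compute=False)

    # Later, on a dask scheduler of our choosing, stream the lazy arrays in.
    delayed.compute()  # completes the deferred writes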
.. note::
@@ -2734,9 +2733,7 @@
`chunksizes` and `endian` keywords are silently ignored for netCDF 3
files that do not use HDF5.
See Also
--------
NetCDF Context manager (:class:`~Saver`).
"""
from iris.cube import Cube, CubeList
