
Parts of satpy fail with the dask distributed scheduler (geotiff writer, nearest neighbor resampler) #1762

gerritholl opened this issue Jul 14, 2021 · 4 comments


gerritholl commented Jul 14, 2021

Describe the bug

When a dask.distributed.Client() is active, Scene.save_datasets() with the geotiff writer fails with TypeError: self._hds cannot be converted to a Python object for pickling.
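For context (not stated in the issue itself): the distributed scheduler pickles every task, function plus arguments, before shipping it to worker processes, so any object wrapping C-level state in the task graph breaks graph submission. A minimal stdlib sketch of the same failure mode, using a thread lock as a stand-in for rasterio's GDAL dataset handle (the `self._hds` in the traceback below):

```python
import pickle
import threading

# Objects that wrap C-level state -- a thread lock, an open file, or a
# rasterio dataset handle -- cannot be pickled, so sending them to a
# distributed worker fails before any computation starts:
lock = threading.Lock()
try:
    pickle.dumps(lock)
    picklable = True
except TypeError:
    picklable = False

print(picklable)  # False: the C-level handle cannot be serialized
```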

To Reproduce

if __name__ == "__main__":
    from dask.distributed import Client, LocalCluster
    cluster = LocalCluster()
    client = Client(cluster)
    from glob import glob
    from satpy import Scene
    from satpy.utils import debug_on; debug_on()
    seviri_files = glob("/media/nas/x21308/scratch/SEVIRI/202103300900/H-000*")
    sc = Scene(filenames=seviri_files, reader=["seviri_l1b_hrit"])
    sc.load(["IR_108"])
    ls = sc.resample("nqceur3km")
    ls.save_datasets()

Expected behavior

I expect the datasets to be written without error messages.

Actual results

Full console output:

[DEBUG: 2021-07-14 17:40:43 : satpy.readers.yaml_reader] Reading ('/home/gholl/checkouts/satpy/satpy/etc/readers/seviri_l1b_hrit.yaml',)
[DEBUG: 2021-07-14 17:40:43 : satpy.readers.yaml_reader] Assigning to seviri_l1b_hrit: ['/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000014___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000013___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000019___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000018___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000010___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000016___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000021___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000020___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000022___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000023___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000017___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000015___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000011___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000012___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000009___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000024___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-HRV______-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_016___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000006___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_039___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_087___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000002___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_097___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_108___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000006___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_120___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-IR_134___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS006___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000006___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-VIS008___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000006___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_062___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000003___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000008___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000006___-202103300900-__', 
'/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000007___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000001___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000002___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000004___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-WV_073___-000005___-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-_________-PRO______-202103300900-__', '/media/nas/x21308/scratch/SEVIRI/202103300900/H-000-MSG4__-MSG4________-_________-EPI______-202103300900-__']
[INFO: 2021-07-14 17:40:43 : hrit_msg] No IMPF configuration field found in prologue.
[DEBUG: 2021-07-14 17:40:43 : satpy.composites.config_loader] Looking for composites config file seviri.yaml
[DEBUG: 2021-07-14 17:40:43 : satpy.composites.config_loader] Looking for composites config file visir.yaml
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.009674
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.008834
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.009094
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.009363
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.010535
[DEBUG: 2021-07-14 17:40:43 : hrit_msg] Calibration time 0:00:00.012282
[DEBUG: 2021-07-14 17:40:44 : hrit_msg] Calibration time 0:00:00.009674
[DEBUG: 2021-07-14 17:40:44 : hrit_msg] Calibration time 0:00:00.009000
[DEBUG: 2021-07-14 17:40:44 : satpy.readers.yaml_reader] Requested orientation for Dataset None is 'native' (default). No flipping is applied.
[DEBUG: 2021-07-14 17:40:44 : satpy.scene] Resampling DataID(name='IR_108', wavelength=WavelengthRange(min=9.8, central=10.8, max=11.8, unit='µm'), resolution=3000.403165817, calibration=<calibration.brightness_temperature>, modifiers=())
[INFO: 2021-07-14 17:40:45 : satpy.resample] Using default KDTree resampler
[DEBUG: 2021-07-14 17:40:45 : satpy.resample] Check if ./resample_lut-f0da637997f064985a17ecef7f75545ad310ab2f.npz exists
[DEBUG: 2021-07-14 17:40:45 : satpy.resample] Computing kd-tree parameters
/home/gholl/checkouts/pyresample/pyresample/kd_tree.py:1025: DeprecationWarning: `np.float` is a deprecated alias for the builtin `float`. To silence this warning, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  input_coords = input_coords.astype(np.float)
/home/gholl/checkouts/pyresample/pyresample/kd_tree.py:1047: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  radius=self.radius_of_influence, dtype=np.int,
[DEBUG: 2021-07-14 17:40:45 : satpy.resample] Resampling reshape-606c5991c3b0e92e59563fb65c7f9b3b
[DEBUG: 2021-07-14 17:40:45 : satpy.writers] Reading ['/home/gholl/checkouts/satpy/satpy/etc/writers/geotiff.yaml']
[DEBUG: 2021-07-14 17:40:45 : satpy.writers] Enhancement configuration options: [{'name': 'stretch', 'method': <function stretch at 0x7fb9c5579ee0>, 'kwargs': {'stretch': 'linear'}}]
[DEBUG: 2021-07-14 17:40:45 : trollimage.xrimage] Applying stretch linear with parameters {}
[DEBUG: 2021-07-14 17:40:45 : trollimage.xrimage] Perform a linear contrast stretch.
[DEBUG: 2021-07-14 17:40:45 : trollimage.xrimage] Calculate the histogram quantiles: 
[DEBUG: 2021-07-14 17:40:45 : trollimage.xrimage] Left and right quantiles: 0.005 0.005
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Entering env context: <rasterio.env.Env object at 0x7fb9403e33d0>
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Starting outermost env
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] No GDAL environment exists
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] New GDAL environment <rasterio._env.GDALEnv object at 0x7fb9403e3040> created
[DEBUG: 2021-07-14 17:40:45 : rasterio._env] GDAL_DATA found in environment: '/data/gholl/miniconda3/envs/py39/share/gdal'.
[DEBUG: 2021-07-14 17:40:45 : rasterio._env] PROJ_LIB found in environment: '/data/gholl/miniconda3/envs/py39/share/proj'.
[DEBUG: 2021-07-14 17:40:45 : rasterio._env] Started GDALEnv <rasterio._env.GDALEnv object at 0x7fb9403e3040>.
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Entered env context: <rasterio.env.Env object at 0x7fb9403e33d0>
[DEBUG: 2021-07-14 17:40:45 : rasterio._io] Path: UnparsedPath(path='IR_108_20210330_090010.tif'), mode: w, driver: GTiff
[DEBUG: 2021-07-14 17:40:45 : rasterio._io] Option: ('COMPRESS', b'DEFLATE')
[DEBUG: 2021-07-14 17:40:45 : rasterio._io] Option: ('ZLEVEL', b'6')
[DEBUG: 2021-07-14 17:40:45 : rasterio._base] Nodata success: 0, Nodata value: -10000000000.000000
[DEBUG: 2021-07-14 17:40:45 : rasterio._base] Nodata success: 0, Nodata value: -10000000000.000000
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Exiting env context: <rasterio.env.Env object at 0x7fb9403e33d0>
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Cleared existing <rasterio._env.GDALEnv object at 0x7fb9403e3040> options
[DEBUG: 2021-07-14 17:40:45 : rasterio._env] Stopped GDALEnv <rasterio._env.GDALEnv object at 0x7fb9403e3040>.
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Exiting outermost env
[DEBUG: 2021-07-14 17:40:45 : rasterio.env] Exited env context: <rasterio.env.Env object at 0x7fb9403e33d0>
[INFO: 2021-07-14 17:40:45 : satpy.writers] Computing and writing results...
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in rasterio._io.DatasetWriterBase.__reduce_cython__
TypeError: self._hds cannot be converted to a Python object for pickling

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/gholl/checkouts/protocode/test-dask-distributed.py", line 12, in <module>
    ls.save_datasets()
  File "/home/gholl/checkouts/satpy/satpy/scene.py", line 1041, in save_datasets
    return writer.save_datasets(dataarrays, compute=compute, **save_kwargs)
  File "/home/gholl/checkouts/satpy/satpy/writers/__init__.py", line 689, in save_datasets
    return compute_writer_results([results])
  File "/home/gholl/checkouts/satpy/satpy/writers/__init__.py", line 534, in compute_writer_results
    da.compute(delayeds)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/dask/base.py", line 567, in compute
    results = schedule(dsk, keys, **kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/client.py", line 2687, in get
    futures = self._graph_to_futures(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/client.py", line 2614, in _graph_to_futures
    dsk = dsk.__dask_distributed_pack__(self, keyset, annotations)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/dask/highlevelgraph.py", line 1046, in __dask_distributed_pack__
    "state": layer.__dask_distributed_pack__(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/dask/highlevelgraph.py", line 425, in __dask_distributed_pack__
    dsk = toolz.valmap(dumps_task, dsk)
  File "cytoolz/dicttoolz.pyx", line 181, in cytoolz.dicttoolz.valmap
  File "cytoolz/dicttoolz.pyx", line 206, in cytoolz.dicttoolz.valmap
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3856, in dumps_task
    return {"function": dumps_function(task[0]), "args": warn_dumps(task[1:])}
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3865, in warn_dumps
    b = dumps(obj, protocol=4)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in rasterio._io.DatasetWriterBase.__reduce_cython__
TypeError: self._hds cannot be converted to a Python object for pickling

An image file is written, but it is entirely black. (Converted to PNG here because GitHub does not allow TIF uploads.)

[attached image: IR_108_20210330_090010]

Environment Info:

  • OS: openSUSE 15.0
  • Satpy Version: 0.29.1.dev67+g6b38b935
  • PyResample Version: 1.20.0
  • Readers and writers dependencies (when relevant): rasterio 1.2.1

Additional context

Other writers also fail. The geotiff writer fails as described above. The simple_image writer gets stuck in what appears to be an endless loop; when interrupted, the output begins with:

[INFO: 2021-07-14 17:49:00 : satpy.writers] Computing and writing results...
distributed.protocol.core - CRITICAL - Failed to deserialize
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 105, in loads
    return msgpack.loads(
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 97, in _decode_default
    return merge_and_deserialize(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 472, in merge_and_deserialize
    return deserialize(header, merged_frames, deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 406, in deserialize
    return loads(header, frames)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 169, in serialization_error_loads
    raise TypeError(msg)
TypeError: Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

distributed.worker - ERROR - Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 2334, in gather_dep
    response = await get_data_from_worker(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3753, in get_data_from_worker
    return await retry_operation(_get_data, operation="get_data_from_worker")
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/utils_comm.py", line 385, in retry_operation
    return await retry(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/utils_comm.py", line 370, in retry
    return await coro()
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3733, in _get_data
    response = await send_recv(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/core.py", line 647, in send_recv
    response = await comm.read(deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/tcp.py", line 218, in read
    msg = await from_frames(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/utils.py", line 79, in from_frames
    res = _from_frames()
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/utils.py", line 62, in _from_frames
    return protocol.loads(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 105, in loads
    return msgpack.loads(
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 97, in _decode_default
    return merge_and_deserialize(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 472, in merge_and_deserialize
    return deserialize(header, merged_frames, deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 406, in deserialize
    return loads(header, frames)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 169, in serialization_error_loads
    raise TypeError(msg)
TypeError: Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

distributed.utils - ERROR - Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

With the NetCDF writer, it gets stuck in a different endless loop, repeatedly logging the same serialization failure:

[INFO: 2021-07-14 17:51:13 : satpy.writers.cf_writer] Saving datasets to NetCDF4/CF.
/home/gholl/checkouts/satpy/satpy/writers/cf_writer.py:739: FutureWarning: The default behaviour of the CF writer will soon change to not compress data by default.
  warnings.warn("The default behaviour of the CF writer will soon change to not compress data by default.",
[WARNING: 2021-07-14 17:51:13 : satpy.writers.cf_writer] No time dimension in datasets, skipping time bounds creation.
distributed.protocol.core - CRITICAL - Failed to deserialize
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 105, in loads
    return msgpack.loads(
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 97, in _decode_default
    return merge_and_deserialize(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 472, in merge_and_deserialize
    return deserialize(header, merged_frames, deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 406, in deserialize
    return loads(header, frames)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 169, in serialization_error_loads
    raise TypeError(msg)
TypeError: Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

distributed.worker - ERROR - Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 329, in serialize
    header, frames = dumps(x, context=context) if wants_context else dumps(x)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 52, in pickle_dumps
    frames[0] = pickle.dumps(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 60, in dumps
    result = cloudpickle.dumps(x, **dump_kwargs)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py", line 563, in dump
    return Pickler.dump(self, obj)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 2334, in gather_dep
    response = await get_data_from_worker(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3753, in get_data_from_worker
    return await retry_operation(_get_data, operation="get_data_from_worker")
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/utils_comm.py", line 385, in retry_operation
    return await retry(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/utils_comm.py", line 370, in retry
    return await coro()
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/worker.py", line 3733, in _get_data
    response = await send_recv(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/core.py", line 647, in send_recv
    response = await comm.read(deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/tcp.py", line 218, in read
    msg = await from_frames(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/utils.py", line 79, in from_frames
    res = _from_frames()
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/comm/utils.py", line 62, in _from_frames
    return protocol.loads(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 105, in loads
    return msgpack.loads(
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/core.py", line 97, in _decode_default
    return merge_and_deserialize(
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 472, in merge_and_deserialize
    return deserialize(header, merged_frames, deserializers=deserializers)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 406, in deserialize
    return loads(header, frames)
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/serialize.py", line 169, in serialization_error_loads
    raise TypeError(msg)
TypeError: Could not serialize object of type KDTree.
Traceback (most recent call last):
  File "/data/gholl/miniconda3/envs/py39/lib/python3.9/site-packages/distributed/protocol/pickle.py", line 49, in dumps
    result = pickle.dumps(x, **dump_kwargs)
  File "stringsource", line 2, in pykdtree.kdtree.KDTree.__reduce_cython__
TypeError: no default __reduce__ due to non-trivial __cinit__
@djhoese (Member) commented Jul 14, 2021

Many parts of satpy do not support the distributed scheduler. The geotiff writer is a big one; it could be improved by switching to rioxarray: https://corteva.github.io/rioxarray/stable/examples/dask_read_write.html

@mraspaud has played around with this idea in the past and may have other ideas. Other parts of this come down to how the data is opened in the reader. Also, pykdtree (as your last error shows) is not serializable and can't be sent over distributed. Not resampling, or using the 'native' resampler, are the only options right now. I believe we'd have to update pykdtree to allow serializing these trees... or maybe the gradient search works. That's another thing @mraspaud has played with, I think.
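The fix described above — teaching pykdtree's KDTree to serialize — usually comes down to giving the Cython class a `__reduce__` that rebuilds the tree from its input data instead of trying to pickle the C internals. A minimal pure-Python sketch of that pattern (the `TreeIndex` classes below are invented for illustration and are not pykdtree's API):

```python
import pickle
import threading

import numpy as np


class TreeIndex:
    """Stand-in for a Cython-backed spatial index: it holds an unpicklable
    C-level handle, simulated here with a threading.Lock."""

    def __init__(self, points):
        self.points = np.asarray(points)
        self._handle = threading.Lock()  # cannot be pickled, like a C struct


class SerializableTreeIndex(TreeIndex):
    """Same index, but picklable: on unpickle, rebuild from the raw points."""

    def __reduce__(self):
        # Return (callable, args): pickle calls SerializableTreeIndex(points)
        # again on load, so the C internals never need to cross the wire.
        return (self.__class__, (self.points,))


pts = np.arange(10.0).reshape(5, 2)

try:
    pickle.dumps(TreeIndex(pts))
except TypeError as err:
    print("plain index fails:", err)

restored = pickle.loads(pickle.dumps(SerializableTreeIndex(pts)))
print("roundtrip ok:", np.array_equal(restored.points, pts))
```

The trade-off is that the tree is rebuilt from scratch on every worker that unpickles it, which costs CPU but keeps the serialized payload small and avoids touching the Cython internals.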

@gerritholl (Collaborator, Author)
Hmm, OK. Then the update to trollflow2 that @pnuu made in pytroll/trollflow2#83 will not be useful until there are major updates to satpy, trollimage, and pykdtree? For some reason I thought some centres were using the distributed scheduler operationally, across multiple servers, but apparently not?

@djhoese (Member) commented Jul 14, 2021

I could of course be completely wrong, but that's how I've understood a lot of this stuff. I also think @pnuu sometimes "cheats" and generates his KDTree indexes in a threaded scheduler, caches them as zarr arrays, and then uses that cache when he runs them in a distributed manner.
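That "compute once with a threaded scheduler, cache, reuse under distributed" trick can be sketched without any satpy machinery. Everything below is invented for illustration: a brute-force nearest-neighbour search stands in for the expensive kd-tree query, and a `.npy` file stands in for the zarr cache.

```python
import os
import tempfile

import numpy as np


def get_nn_index(src, dst, cache_file):
    """Load nearest-neighbour indices from cache, or compute and store them."""
    if os.path.exists(cache_file):
        return np.load(cache_file)
    # Toy stand-in for the expensive kd-tree query: for every target point,
    # find the index of the closest source point by brute force.
    d2 = ((dst[:, None, :] - src[None, :, :]) ** 2).sum(axis=-1)
    idx = d2.argmin(axis=1)
    np.save(cache_file, idx)
    return idx


rng = np.random.default_rng(0)
src = rng.random((50, 2))
dst = rng.random((20, 2))

with tempfile.TemporaryDirectory() as d:
    f = os.path.join(d, "nn_idx.npy")
    i1 = get_nn_index(src, dst, f)   # first call: computed and cached
    i2 = get_nn_index(src, dst, f)   # second call: loaded from cache
    print("cached indices match:", np.array_equal(i1, i2))
```

The cached indices are plain arrays, so they serialize fine over distributed; only the step that builds them has to stay on a scheduler that never pickles the tree.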

@gerritholl (Collaborator, Author)
It works when using resampler="gradient_search" and writer="simple_image" in combination.

@gerritholl gerritholl changed the title With dask distributed scheduler, save_datasets() fails with TypeError With dask distributed scheduler, save_datasets() fails with TypeError when using the geotiff writer Jun 20, 2024
@gerritholl gerritholl changed the title With dask distributed scheduler, save_datasets() fails with TypeError when using the geotiff writer Parts of satpy fail with the dask distributed scheduler (geotiff writer, nearest neighbor resampler) Jun 20, 2024