Serializing attrs classes does not work? #4226
Comments
Upon further investigation I note that cloudpickle does not work on `my_class_attrs`:

```python
import cloudpickle
cloudpickle.loads(cloudpickle.dumps(my_class_attrs))
```

This raises the same error encountered previously: `TypeError: cannot pickle '_thread._local' object`
This appears to be a cloudpickle bug: Cloudpickle issue 320. As such, I will close this issue.
Note that, as mentioned in the linked issues, this will work if you set …
With attrs release 21.4.0, that workaround is no longer needed; attrs classes can now be pickled with cloudpickle (e.g. via Dask).
Working minimal example using Dask's default scheduler:
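The original snippet was lost in extraction; below is a minimal sketch of what such an example might look like. The class and function names (`MyClass`, `MyClassAttrs`, `increment`) are assumptions, not from the original report:

```python
import attr
import dask

# Plain class and attrs class; names are hypothetical.
class MyClass:
    def __init__(self, i):
        self.i = i

@attr.s
class MyClassAttrs:
    i = attr.ib()

@dask.delayed
def increment(obj):
    # Increment the instance's counter inside a Dask task.
    obj.i += 1
    return obj

# .compute() with no client configured uses Dask's default scheduler.
result = increment(MyClass(1)).compute()
result_attrs = increment(MyClassAttrs(1)).compute()
```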
This code correctly returns 2 for `result.i` and `result_attrs.i` using Dask's default scheduler.
Failing minimal example using Dask's distributed scheduler:
This code correctly returns 2 for `result.i`, but raises `TypeError: cannot pickle '_thread._local' object` when using the Dask distributed scheduler.
attrs is a widely used package, so I'm not sure why using it here causes a failure.
Attrs classes do pickle properly by themselves. For example, this does work:
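The working snippet was stripped during extraction; a minimal sketch with the stdlib `pickle` module might look like this (the class name is an assumption):

```python
import pickle
import attr

@attr.s
class MyClassAttrs:
    i = attr.ib()

# Round-trip an attrs instance through the standard pickle module.
obj = MyClassAttrs(1)
roundtripped = pickle.loads(pickle.dumps(obj))
```

Since `attr.s` generates `__eq__` by default, the round-tripped object compares equal to the original.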
I have also successfully used the standard multiprocessing library to do multiprocessing with attrs classes.
Environment
Any advice on how to deal with this issue would be welcome.