ddtrace fails to serialize Python objects by value with cloudpickle #5889
Comments
Hey @lukesturgis, thanks for opening this issue. I was able to work around this issue by deleting the
Would this solution work for your application?
Hey @mabdinur, thanks for looking into this. I don't believe the workaround would work well enough in practice, as we might have many nested modules: this is happening in a larger library with lots of imports, including ones users might not even know are being imported. Is there any chance this could be made compatible with cloudpickle? I read a good amount of the module watchdog code and it doesn't seem like this would be easy to support, but it would be great if possible. I will likely open a feature request with cloudpickle as well, to see if they can do anything on their side. Thanks again!
This seems to be caused by these lines: dd-trace-py/ddtrace/internal/module.py, lines 171 to 178 at 148c93c,
which we might be able to remove, or at least put behind a private env var if they are really needed. However, even after doing so, it seems that we would still get an issue, this time from six:
We are planning to drop Python 2 support with the 2.0 release of
This issue has been automatically closed after a period of inactivity. If it's a
Summary of problem
ddtrace breaks cloudpickle's serialization of Python objects registered by value. This worked with a previous version of ddtrace (0.55.1) but broke when we upgraded to 1.5.0. As a sanity check, I also tried 1.12.6 with the latest version of cloudpickle (2.2.1), which did not resolve the issue.
When broken down, it takes the following route: cloudpickle/cloudpickle_fast.py:632 attempts to dump the object, which goes through cpython/blob/3.11/Lib/pickle.py:476 and then attempts to save the object at cpython/blob/3.11/Lib/pickle.py#L535, which calls save_reduce at cpython/blob/3.11/Lib/pickle.py#L603, where the object turns out to be somehow malformed.
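For context, the validation that ends this route can be reproduced with the standard library alone: before dispatching to save_reduce, pickle rejects any `__reduce__` value that is neither a string nor a tuple. This is a generic sketch with a made-up `Malformed` class, not the actual object from the report:

```python
import pickle


class Malformed:
    """A class whose __reduce__ violates the pickle protocol.

    __reduce__ must return a string or a tuple of 2 to 6 items
    (callable, args, ...); anything else is rejected before
    save_reduce is ever called.
    """

    def __reduce__(self):
        return 42  # invalid: neither a string nor a tuple


try:
    pickle.dumps(Malformed())
except pickle.PicklingError as e:
    # pickle refuses the malformed reduce value with a PicklingError
    print("PicklingError:", e)
```

Both the pure-Python and C implementations of pickle raise `PicklingError` here, which matches the failure point the traceback above walks into.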
I've confirmed this only happens when ddtrace is imported before cloudpickle; somewhere in ddtrace's imports, the import machinery for class objects seems to be modified in a way that did not happen in earlier versions. Please let me know if I can provide any more information. Thanks!
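A plausible mechanism for the import-order sensitivity (my own sketch with made-up names, not the actual ddtrace code) is that replacing a module's entry in sys.modules after classes from it have been created breaks pickle's by-reference lookup for those classes. That much can be shown with the standard library alone:

```python
import pickle
import sys
import types

# Build a synthetic module and define a class inside it. The module name
# "fake_mod" and the class "Thing" are invented for this sketch.
mod = types.ModuleType("fake_mod")
exec("class Thing:\n    pass", mod.__dict__)
sys.modules["fake_mod"] = mod
Thing = mod.Thing

# While sys.modules["fake_mod"].Thing is still the same object as Thing,
# pickling the class by reference succeeds.
pickle.dumps(Thing)

# Replace the sys.modules entry -- roughly what happens when a module is
# dropped and re-imported behind pickle's back. The original class is no
# longer reachable under its recorded name, so pickle raises PicklingError.
sys.modules["fake_mod"] = types.ModuleType("fake_mod")
try:
    pickle.dumps(Thing)
    print("pickled")
except pickle.PicklingError:
    print("PicklingError")
```

Pickle-by-reference requires that looking up `obj.__module__` in sys.modules and then `obj.__qualname__` on that module yields the very same object; any machinery that swaps module objects after classes escape them can break that invariant.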
Which version of dd-trace-py are you using?
1.12.6
Which version of pip are you using?
22.0.4
Which libraries and their versions are you using?
How can we reproduce your problem?
main.py
test.py
What is the result that you get?
What is the result that you expected?