Provide default argument to json.dumps in compute_tracestate_value #1318
Granted, this is a bit of an edge case, but I'll explain the rationale.

In our Django application, we define `MERCURIAL_TAG` in our settings. To compute it we subprocess out to `hg`. As you can imagine, this is quite expensive and has a significant impact on application startup time. So we've wrapped it in `django.utils.functional.lazy()` so that it's only evaluated when it's needed, not every time the app starts or every time a uwsgi worker is recycled.
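For illustration, here's a minimal sketch of that setup (the helper name and the exact `hg` invocation are made up for this example; only the `lazy()` wrapping matters):

```python
# settings.py -- illustrative sketch, not the actual application code
import subprocess

from django.utils.functional import lazy


def get_mercurial_tag():
    # Shelling out to hg is slow, so we only want this to run on first use.
    return subprocess.check_output(["hg", "identify", "--tags"], text=True).strip()


# lazy(...) returns a callable; calling it yields a __proxy__ object that
# defers get_mercurial_tag() until the value is actually used as a string.
MERCURIAL_TAG = lazy(get_mercurial_tag, str)()
```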
The issue is, we primarily use this setting by passing it to `sentry_sdk.init()`. When the Sentry SDK goes to report an error, `compute_tracestate_value()` calls `json.dumps()`, which blows up with a whole stack of `TypeError: Object of type __proxy__ is not JSON serializable` as the `_got_request_exception` signal fires over and over.

I've added `default=safe_str` to the `json.dumps()` call to make sure that whatever weird thing some random idiot (read: me) passes to `sentry_sdk.init()` is properly converted to a string.
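To show the effect of `default=safe_str` in isolation, here's a small sketch using a stand-in proxy class (the real offender is Django's lazy `__proxy__`; this is not the SDK's code, just a demonstration of the keyword argument):

```python
import json

from sentry_sdk.utils import safe_str


class FakeProxy:
    """Stand-in for django.utils.functional's lazy __proxy__ in this sketch."""

    def __str__(self):
        return "v1.2.3"


data = {"release": FakeProxy()}

# Without a default, json.dumps raises:
#   TypeError: Object of type FakeProxy is not JSON serializable
try:
    json.dumps(data)
except TypeError as exc:
    print(exc)

# With default=safe_str, unknown objects are coerced to their string form.
print(json.dumps(data, default=safe_str))  # {"release": "v1.2.3"}
```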
I've left the type hint intact as `(typing.Mapping[str, str]) -> str`, as it's probably worth maintaining the expectation that this is the case.