Thread starvation on high load with System.Text.Json #40072
Comments
Your repro is creating a new options instance each time. Does the problem go away if you instead cache and reuse a single instance of it?
Thanks, after changing the options to a single cached instance the issue could no longer be observed during a 2-hour load test.
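The fix described above can be sketched as follows (illustrative only; the `Payloads` class name and the naming policy are hypothetical, not from the issue):

```csharp
using System.Text.Json;

// Cache a single options instance for the lifetime of the app. The serializer
// caches per-type metadata (including emitted accessor IL) on the options
// instance, so reusing one instance avoids rebuilding that state on every call.
public static class Payloads
{
    private static readonly JsonSerializerOptions s_options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase
    };

    public static string Serialize<T>(T value) =>
        JsonSerializer.Serialize(value, s_options);
}
```

Any code path that previously did `new JsonSerializerOptions { ... }` per call would instead route through the shared instance.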
@steveharter, can we handle this better? While we encourage folks to cache, not doing so leads to really bad perf with all the code generation and JIT'ing.
Add an analyzer for this?
This is
That would be good.
We should doc this better. Changing to a doc issue.

If just the default options are used (no explicit options instance passed in), then the default options instance is automatically cached. It is also cached under ASP.NET. We also plan to make it easier in 6.0 to specify the default options, so one can use the serialization APIs that don't take an options instance but still specify a custom options instance and its location.

Also, the stack appears to enter the critical section on calling the "getter" delegate. This is likely the runtime JIT'ing the IL that the serializer emitted to call the getter. With the code-gen effort that will ship in 6.0, there will be no IL generated for the getter, since the code for the POCOs is generated ahead of time; that should avoid the critical section and significantly improve startup time. However, even with that, there is metadata that needs to be built up and cached, so caching the options is still recommended even with code-gen.
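The distinction drawn above between the cached default options and a per-call instance can be illustrated with a short sketch:

```csharp
using System.Text.Json;

var value = new { Id = 1 };

// No options argument: the library's internal default options instance is
// used, and that instance is cached automatically across calls.
string viaDefaults = JsonSerializer.Serialize(value);

// A freshly constructed options instance starts with empty caches, so the
// serializer must rebuild (and JIT) its per-type metadata for that instance.
string viaFreshOptions = JsonSerializer.Serialize(value, new JsonSerializerOptions());

// Both calls produce the same JSON; only the metadata-caching behavior differs.
```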
Also dup of #38982.
Triage: moving this issue to 6.0. For 5.0 we should document best practices for using. Mitigations to consider in 6.0/future:
We might want to consider additional mitigations, such as a global MRU cache. The issue as it stands today is pretty bad.
Updated triage above with this comment. Tracking issue - #35084.
Just to add some customer feedback: I only learned that I needed to cache
Triage - for .NET 6.0, we've provided documentation about perf caveats with using
Wouldn't this require defining a notion of structural equality for

```csharp
for (int i = 0; i < 1_000_000; i++)
{
    var options = new JsonSerializerOptions { Converters = { new MyCoolCustomConverter() } };
    JsonSerializer.Serialize(values[i], options);
}
```

How can we determine if two lists of custom converters are equivalent? A conservative approach might be to check for reference equality, but in this case it would clearly result in false negatives. We certainly want to avoid false positives, as we'd be exposing ourselves to concurrency bugs impacting serialization settings. I'm also not a huge fan of authoring mitigating APIs a la
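The false-negative concern can be made concrete with a sketch (the `UpperCaseStringConverter` below is a hypothetical stateless converter, not from the issue):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// A stateless custom converter: two instances behave identically, but the
// serializer has no cheap, reliable way to prove it. Comparing Converters
// lists by reference equality would therefore report two functionally
// identical options instances as different (a false negative).
public sealed class UpperCaseStringConverter : JsonConverter<string>
{
    public override string Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        => reader.GetString()!.ToUpperInvariant();

    public override void Write(Utf8JsonWriter writer, string value, JsonSerializerOptions options)
        => writer.WriteStringValue(value.ToUpperInvariant());
}
```

Two options instances each configured with `new UpperCaseStringConverter()` serialize identically, yet `ReferenceEquals` on their converter entries returns false, so a reference-based cache key would miss the sharing opportunity.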
Why? We generate IL for some Type. A cache could be at the type level, keyed off Type identity plus any other values we use as part of the IL generation. It also needn't be perfect, just good enough to mitigate the pit of failure it currently is. The solution needn't be this if there's something better, but a solution is needed.
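The type-keyed cache idea could be sketched roughly like this (hypothetical; names are illustrative and not actual System.Text.Json internals):

```csharp
using System;
using System.Collections.Concurrent;

// Hypothetical sketch of a process-wide, Type-keyed metadata cache: consulted
// only when the options are "default-like" (nothing custom in the mix), so
// that Type identity alone is a safe cache key.
public static class GlobalTypeCache<TMetadata>
{
    private static readonly ConcurrentDictionary<Type, TMetadata> s_cache = new();

    // The factory (standing in for per-type IL generation) runs once per Type
    // in the common case; subsequent lookups reuse the cached result, even
    // across distinct options instances.
    public static TMetadata GetOrAdd(Type type, Func<Type, TMetadata> factory)
        => s_cache.GetOrAdd(type, factory);
}
```

The point of such a cache is that creating a throwaway options instance would no longer force regeneration for types already seen elsewhere in the process.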
We could definitely try caching the generated IL globally, but I'm assuming that a substantial part of the cost stems from all the reflection code run at converter construction time, which we can't avoid unless we can determine that the converter caches scoped to a particular options instance are functionally identical.
That may contribute as well, but all the stacks from all the repros I've seen are due to JIT'ing IL over and over and over. Like I said, it doesn't need to be perfect. I bet you'd get 90% of the way there if you just handled default cases where there's nothing custom in the mix, e.g. only using the default set of converters we supply.
My only concern is that converters are one of the most common customizations applied to serialization. For the X% of users that modify anything but converters, we could be setting them up for major performance regressions just because they happen to apply a custom converter in later iterations of their app. I'd err towards predictability here.
Agreed. If it does turn out that caching dynamic methods eliminates 80% of the performance issues with 20% of the effort, we should just do that. But there are diminishing returns in reusing equivalent
@eiriktsarpalis and I chatted offline. We agreed that at a minimum we need an analyzer, e.g. one that warns about any JsonSerializerOptions construction that flows only to a call to a JsonSerializer method argument.

Beyond that, he's going to profile to get a strong sense for where the actual costs are, and based on that we can decide how much further to go. If the primary costs are in JIT'ing, for example, we could look to employ some kind of tiering strategy, where we start with a purely reflection-based approach and only upgrade to reflection emit after N uses. We could look at having a copy-on-write solution for the existing caching, so that at least
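The tiering strategy mentioned above could look roughly like this (a hypothetical sketch; `TieredGetter` and the threshold are illustrative, and a compiled expression stands in for Reflection.Emit):

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;
using System.Threading;

// Hypothetical tiering sketch (not the actual System.Text.Json implementation):
// start with a cheap reflection-based getter, and only pay the compilation
// cost (which is what triggers the JIT contention in this issue) once the
// member has been accessed Threshold times.
public sealed class TieredGetter
{
    private const int Threshold = 100;
    private readonly PropertyInfo _property;
    private Func<object, object?> _getter;
    private int _calls;

    public TieredGetter(PropertyInfo property)
    {
        _property = property;
        _getter = obj => property.GetValue(obj); // tier 0: plain reflection
    }

    public object? Get(object obj)
    {
        if (Interlocked.Increment(ref _calls) == Threshold)
        {
            // tier 1: upgrade hot members to a compiled delegate
            ParameterExpression p = Expression.Parameter(typeof(object));
            Expression body = Expression.Convert(
                Expression.Property(
                    Expression.Convert(p, _property.DeclaringType!), _property),
                typeof(object));
            _getter = Expression.Lambda<Func<object, object?>>(body, p).Compile();
        }
        return _getter(obj);
    }
}
```

The payoff is that short-lived, uncached options instances would mostly stay in the cheap reflection tier and never reach the expensive codegen path.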
FWIW we recently added a
I knocked together a prototype implementing memoization for dynamic methods and compared it with
Description
After changing the serializer from Newtonsoft.Json to System.Text.Json in a streaming application, where a high number of objects are serialized concurrently, we observe a thread starvation issue.
After a few hours under heavy load, the number of native threads in the process slowly increases to ~500, which results in severe throughput degradation of the application. As a consequence we had to switch back to Newtonsoft.Json.
A memory dump revealed that most of the threads wait on a critical section, see attached data.
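The application's full serialization code is not reproduced here, but a minimal sketch of the pattern at issue (a fresh options instance per call, from many threads) might look like:

```csharp
using System.Text.Json;
using System.Threading.Tasks;

var results = new string[1_000];

// Anti-pattern: constructing a fresh JsonSerializerOptions per call from many
// threads. Every new instance starts with empty caches, so each serialization
// re-emits and re-JITs accessor IL, and the concurrent JIT requests contend
// on the runtime critical section visible in the dump below.
Parallel.For(0, results.Length, i =>
{
    var options = new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase };
    results[i] = JsonSerializer.Serialize(new { Message = "hello" }, options);
});
```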
Configuration
Data
Callstack:
ntdll!NtWaitForAlertByThreadId+14
ntdll!RtlpWaitOnAddressWithTimeout+43
ntdll!RtlpWaitOnAddress+b2
ntdll!RtlpWaitOnCriticalSection+db
ntdll!RtlpEnterCriticalSectionContended+cc
ntdll!RtlEnterCriticalSection+40
coreclr!CEEInfo::reportInliningDecision+51a [f:\workspace_work\1\s\src\vm\jitinterface.cpp @ 8292 + 87]
clrjit!InlineResult::Report+de [f:\workspace_work\1\s\src\jit\inline.cpp @ 749]
clrjit!Compiler::fgInline+3b0 [f:\workspace_work\1\s\src\jit\flowgraph.cpp @ 22013 + a]
clrjit!Compiler::fgMorph+131 [f:\workspace_work\1\s\src\jit\morph.cpp @ 17025]
clrjit!Compiler::compCompile+4f2 [f:\workspace_work\1\s\src\jit\compiler.cpp @ 4483]
clrjit!Compiler::compCompileHelper+296 [f:\workspace_work\1\s\src\jit\compiler.cpp @ 6035]
clrjit!Compiler::compCompile+21d [f:\workspace_work\1\s\src\jit\compiler.cpp @ 5367 + 20]
clrjit!jitNativeCode+273 [f:\workspace_work\1\s\src\jit\compiler.cpp @ 6663 + 53]
clrjit!CILJit::compileMethod+92 [f:\workspace_work\1\s\src\jit\ee_il_dll.cpp @ 319]
coreclr!invokeCompileMethod+dd [f:\workspace_work\1\s\src\vm\jitinterface.cpp @ 12496 + 6e]
coreclr!CallCompileMethodWithSEHWrapper+50 [f:\workspace_work\1\s\src\vm\jitinterface.cpp @ 12550 + 25]
coreclr!UnsafeJitFunction+96b [f:\workspace_work\1\s\src\vm\jitinterface.cpp @ 13039 + 6a]
coreclr!MethodDesc::JitCompileCodeLocked+2f9 [f:\workspace_work\1\s\src\vm\prestub.cpp @ 965 + 5c]
coreclr!MethodDesc::JitCompileCodeLockedEventWrapper+32d [f:\workspace_work\1\s\src\vm\prestub.cpp @ 827 + 18]
coreclr!MethodDesc::JitCompileCode+15f [f:\workspace_work\1\s\src\vm\prestub.cpp @ 773 + f]
coreclr!MethodDesc::PrepareILBasedCode+185 [f:\workspace_work\1\s\src\vm\prestub.cpp @ 432 + b]
coreclr!MethodDesc::PrepareInitialCode+40 [f:\workspace_work\1\s\src\vm\prestub.cpp @ 321]
coreclr!MethodDesc::DoPrestub+44b [f:\workspace_work\1\s\src\vm\prestub.cpp @ 2025 + 8]
coreclr!PreStubWorker+24c [f:\workspace_work\1\s\src\vm\prestub.cpp @ 1781 + d]
coreclr!ThePreStub+55 [F:\workspace_work\1\s\src\vm\amd64\ThePreStubAMD64.asm @ 22]
System.Text.Json.JsonPropertyInfoCommon`4[[System.__Canon, System.Private.CoreLib],[System.__Canon, System.Private.CoreLib],[System.__Canon, System.Private.CoreLib],[System.__Canon, System.Private.CoreLib]].GetValueAsObject(System.Object)+23
System.Text.Json.JsonSerializer.HandleEnumerable(System.Text.Json.JsonClassInfo, System.Text.Json.JsonSerializerOptions, System.Text.Json.Utf8JsonWriter, System.Text.Json.WriteStack ByRef)+33
System.Text.Json.JsonSerializer.Write(System.Text.Json.Utf8JsonWriter, Int32, Int32, System.Text.Json.JsonSerializerOptions, System.Text.Json.WriteStack ByRef)+33c
System.Text.Json.JsonSerializer.WriteCore(System.Text.Json.Utf8JsonWriter, System.Object, System.Type, System.Text.Json.JsonSerializerOptions)+b9
System.Text.Json.JsonSerializer.WriteCore(System.Text.Json.PooledByteBufferWriter, System.Object, System.Type, System.Text.Json.JsonSerializerOptions)+86
System.Text.Json.JsonSerializer.WriteCoreBytes(System.Object, System.Type, System.Text.Json.JsonSerializerOptions)+7a
Analysis
Serialization code in the application looks like this:
The code for CustomJsonStringEnumConverter can be found here.