DeserializeAsync #1193
They do: JToken.LoadAsync. I'm not going to think about async serialization this release. |
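For context, a minimal sketch of what that token-level async API allows today; stream, cancellationToken, serializer and MyObject are illustrative names, not from this thread, and the final ToObject step is still synchronous.
// Sketch (illustrative names): async token loading, then synchronous materialisation to a type.
// Requires Newtonsoft.Json, Newtonsoft.Json.Linq and System.IO.
using (var streamReader = new StreamReader(stream))
using (var jsonReader = new JsonTextReader(streamReader))
{
    JToken token = await JToken.LoadAsync(jsonReader, cancellationToken);
    MyObject result = token.ToObject<MyObject>(serializer);
}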
I'll be submitting a PR for this soon, for when you do decide to think about it 😄 @benaadams Do you have any thoughts on #1161? I'm personally okay with the way things are, but thought it better to consider a way it could have been improved than to just plow on through. |
Was thinking specifically of |
Underway. |
@benaadams What is the recommendation on exposing a value-type Task on a public API? The serializer will call many |
As it will be mostly sync with occasional async, the fastest way is to split the two: one path for when the data is already buffered and one for when it isn't; something like:
public ValueTask<T> ReadThingAsync()
{
    T obj;
    // Fast path: parse from data that is already available, no Task allocation.
    return TryParse(out obj) ?
        new ValueTask<T>(obj) :
        new ValueTask<T>(ReadThingAsyncAwaited());
}
// Slow path: the async state machine is only paid for when more data must be awaited.
private async Task<T> ReadThingAsyncAwaited()
{
    T obj;
    do
    {
        await ReadMoreData();
    } while (!TryParse(out obj));
    return obj;
} |
If you want to go C#7 then you can combine the two, though it is slightly slower (middle box in the examples: https://github.com/ljw1004/roslyn/blob/features/async-return/docs/specs/feature%20-%20arbitrary%20async%20returns.md). Otherwise, ValueTask is netstandard1.0 (et al.) https://www.nuget.org/packages/System.Threading.Tasks.Extensions/ so I think it's fine to expose. |
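A sketch of that combined C# 7 form, reusing the hypothetical TryParse/ReadMoreData helpers from the snippet above, so it is an illustration rather than Json.NET code:
// Combined form: the method is itself async and returns ValueTask<T> directly.
// Needs C# 7 generalized async return types plus the System.Threading.Tasks.Extensions package.
public async ValueTask<T> ReadThingAsync()
{
    T obj;
    while (!TryParse(out obj))   // fast path: try to parse already-buffered data first
    {
        await ReadMoreData();    // slow path: fetch more data, then try again
    }
    return obj;
}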
Is this still in progress? We deserialize directly to types from async I/O so |
@manigandham Version 10 brought async with … Are there any plans for this? |
I also have encountered the need for this (async deserialization using JsonSerializer when reading from a large file). Is there any update on the status? I am happy to help contribute, but it seems like some work is already ongoing (@JonHanna mentioned a PR in early 2017), and I would rather not start from scratch. |
FYI: after investigating this further, it seems deserialization to a type needs the entire JSON payload buffered anyway to be able to properly construct the object, so … We originally used a |
@manigandham Is the need to buffer the entire payload a requirement of the algorithm, or a detail of the current implementation? I'm not an expert™, but it's not obvious to me why the entire payload would need to be available in order to parse JSON into a statically typed model. |
@eatdrinksleepcode You're right, I missed the code here. It does support reading from a stream incrementally, and all the necessary methods are in the JsonReader.Async.cs class. The missing part is an async version of … It seems to cascade through the internal JsonSerializerInternalReader.cs and JsonReader.cs classes, but I'm not sure of the entire scope. Buffering streams manually works well for us so far. |
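A minimal sketch of that kind of manual buffering, under assumed names (sourceStream, serializer, MyObject are not from the comment): the IO is awaited into memory first, and the existing synchronous deserializer then runs over the in-memory copy.
// Sketch (assumed names): buffer the payload asynchronously, then deserialize synchronously.
// Requires Newtonsoft.Json and System.IO.
using (var buffered = new MemoryStream())
{
    await sourceStream.CopyToAsync(buffered);   // the only IO-bound wait happens here
    buffered.Position = 0;
    using (var streamReader = new StreamReader(buffered))
    using (var jsonReader = new JsonTextReader(streamReader))
    {
        return serializer.Deserialize<MyObject>(jsonReader);   // purely in-memory, never blocks on IO
    }
}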
@bruno-garcia how did you work around it? Is |
@michael-hawker I've been using sync methods so far. Still unaware of another way. The ASP.NET team is working on a new JSON lib to use in ASP.NET Core. Considering that the announcement's main driver for the new lib is performance, from a scalability point of view an async deserializer (assuming it's possible) could also be considered. James works there, so perhaps he'd be able to provide some more info on their plans?! :) |
Bump, this would be super helpful for my use case. Also willing to help if someone who has contributed before wouldn't mind mentoring a bit. Looks like @JonHanna is on long-term hiatus. |
Does this issue track …? I'm currently serializing a lot of objects into strings which I then push into a GZipStream (and on to an Azure CloudBlobStream); I would like to serialize directly into this stream as the objects can be very large. |
Json.NET has supported streams for a long time. There are … The problem is they aren't … |
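For reference, a sketch of the existing synchronous stream support being described (outputStream and largeObject are illustrative names): the serializer writes incrementally into the stream, it just blocks on every write.
// Sketch (illustrative names): Json.NET's long-standing synchronous stream support.
// Requires Newtonsoft.Json and System.IO.
var serializer = new JsonSerializer();
using (var streamWriter = new StreamWriter(outputStream))   // e.g. wrapping a GZipStream
using (var jsonWriter = new JsonTextWriter(streamWriter))
{
    serializer.Serialize(jsonWriter, largeObject);   // incremental, but every write is blocking
}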
@manigandham thanks, sure, but blocking on the network is also a concern. Presumably this issue here would make all of JsonSerializer async? Would we really do Deserialize only? |
@ishepherd If you are looking for strongly-typed then I guess this is the issue to wait for. If you can use |
Having an async API would eliminate the need to work around the disallowing of synchronous operations that was added in ASP.NET Core 3:
System.InvalidOperationException: Synchronous operations are disallowed. Call ReadAsync or set AllowSynchronousIO to true instead.
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.StreamReader.ReadBuffer(Span`1 userBuffer, Boolean& readToUserBuffer)
at System.IO.StreamReader.ReadSpan(Span`1 buffer)
at System.IO.StreamReader.Read(Char[] buffer, Int32 index, Int32 count)
at Newtonsoft.Json.JsonTextReader.ReadData(Boolean append, Int32 charsRequired)
at Newtonsoft.Json.JsonTextReader.ParseValue()
at Newtonsoft.Json.JsonReader.ReadAndMoveToContent()
at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)
at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)
at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)
at Newtonsoft.Json.JsonSerializer.Deserialize(JsonReader reader, Type objectType)
at Newtonsoft.Json.JsonSerializer.Deserialize[T](JsonReader reader)
Currently my team is working around this by doing the following.
using (FileBufferingReadStream buffer = new FileBufferingReadStream(sourceStream, 1024 * 30))
{
    await buffer.DrainAsync(CancellationToken.None);   // read the whole request body asynchronously
    buffer.Seek(0L, SeekOrigin.Begin);
    using (var streamReader = new StreamReader(buffer))
    using (var jsonReader = new JsonTextReader(streamReader))
    {
        return _serializer.Deserialize<MyObject>(jsonReader);   // no further network IO, so the sync read is allowed
    }
}
It would be nice to get rid of the FileBufferingReadStream. The FileBufferingReadStream has some performance implications that are painful for us in production. Moving away from Newtonsoft is not possible at this time because of third-party dependencies we have that also consume Newtonsoft. We are working on creating pull requests for each of the projects we depend on to abstract away this hard dependency, but it's a slow process getting buy-in from each project as to why removing this hard dependency is preferred. So exposing an async API gives us another avenue to take with some of these third parties. |
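An alternative workaround, sketched here as an assumption rather than something taken from this thread, is to opt a single request back into synchronous IO via IHttpBodyControlFeature (the option the exception message itself points at), trading the buffering cost for a blocking read on that request.
// Sketch (assumption, not from the thread): re-enable synchronous IO for this request only,
// then call the existing synchronous Deserialize as before.
// Requires Microsoft.AspNetCore.Http.Features.
var syncIoFeature = httpContext.Features.Get<IHttpBodyControlFeature>();
if (syncIoFeature != null)
{
    syncIoFeature.AllowSynchronousIO = true;
}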
Asynchronous APIs (like almost all other requested features in this repo) will, with 99.99% probability, never appear. This project is in a frozen state and is gradually dying out. We just need to come to terms with this and move to |
I also hit a case where sync IO is disabled and there is no suitable method in Json.NET where I can pass my settings/converters. What is also interesting is that if I use the serializer with BsonDataReader then it works even with sync IO disabled. (And the rejected BSON support is one more blocker to migrating to System.Text.Json.) |
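For comparison, a sketch (with assumed names: requestStream, MyObject, and a hypothetical MyCustomConverter) of the fully asynchronous entry point in System.Text.Json, where settings and converters travel in a JsonSerializerOptions instance:
// Sketch (assumed names): async deserialization in System.Text.Json, shown only for comparison.
// MyCustomConverter and requestStream are hypothetical. Requires System.Text.Json.
var options = new JsonSerializerOptions();
options.Converters.Add(new MyCustomConverter());
MyObject result = await JsonSerializer.DeserializeAsync<MyObject>(requestStream, options, cancellationToken);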
You know, those async reader methods want a user-friendly entry point 😉