What's new in .NET 6 Preview 5 #6099
Parsing of BigIntegers from both decimal and hexadecimal strings was improved: dotnet/runtime#47842. We saw improvements of up to 89%: DrewScoggins/performance-2#5765. Provided by a community contributor.
Inlining of certain methods involving SIMD or HWIntrinsics should now have improved codegen and performance. We saw improvements of up to 95%: DrewScoggins/performance-2#5581
This should make it simpler to write SIMD code involving pointers or platform-dependent lengths.
Windows Forms has received a new API.
.NET Cryptography on Linux supports using OpenSSL 3 as the native provider. The any-distro "portable" builds will try OpenSSL 3 before the previous versions.
The ChaCha20/Poly1305 algorithm now has representation in System.Security.Cryptography via the new ChaCha20Poly1305 class. OS support is required for this type; check the static IsSupported property before use: dotnet/runtime#52030 (Windows). Linux support was enabled by a community contributor.
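As a hedged illustration (not from the announcement), a minimal encrypt/decrypt round trip with the new class might look like the following; the key, nonce, and tag sizes follow RFC 8439 (32, 12, and 16 bytes):

```csharp
using System;
using System.Security.Cryptography;

if (ChaCha20Poly1305.IsSupported)
{
    byte[] key = RandomNumberGenerator.GetBytes(32);
    byte[] nonce = RandomNumberGenerator.GetBytes(12);
    byte[] plaintext = System.Text.Encoding.UTF8.GetBytes("hello");
    byte[] ciphertext = new byte[plaintext.Length];
    byte[] tag = new byte[16];

    using var chacha = new ChaCha20Poly1305(key);
    chacha.Encrypt(nonce, plaintext, ciphertext, tag);

    byte[] decrypted = new byte[ciphertext.Length];
    chacha.Decrypt(nonce, ciphertext, tag, decrypted);
    // decrypted now holds the original plaintext bytes
}
```

Checking the static IsSupported guard first avoids a runtime error on operating systems without the required OS-level support.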
Interop: Objective-C interoperability support
API: dotnet/runtime#44659
The .NET interop team worked with the Xamarin owners to enable the Xamarin.MacOS libraries to run on top of CoreCLR. Along with providing an explicit API, we also added a feature flag for enabling the support implicitly.
.NET SDK Optional Workload improvements
Building on the work released in Preview 4 for optional workloads, we've added two additional features. The update feature will query NuGet.org for a new workload manifest for all workloads, update the manifests, download new versions of the installed workloads, and then remove all old versions of each workload.
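As a sketch of the update flow described above (the command name is assumed from the .NET SDK workload feature; the exact verbs shipped in this preview may differ):

```
# Query NuGet.org for updated workload manifests, update them, install new
# versions of the installed workloads, and remove superseded versions.
dotnet workload update
```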
Networking: WebSocket Compression
User Story: dotnet/runtime#20004
This is an implementation of per-message compression for WebSockets. Most importantly, this is a big, complex, and high-quality community contribution by @zlatanov. It was a long journey (3 months of work) and we faced several problems along the way.

First, we realized that using compression together with encryption may lead to security implications, like CRIME/BREACH attacks. It means that a secret cannot be sent together with user-generated data in a single compression context, otherwise that secret could be extracted. To bring users' attention to these implications and help them weigh the risks, we renamed our API to DangerousDeflateOptions.

Second, we stumbled upon a bug in the zlib-intel implementation that only manifested itself if the window used was not the maximal one.

There was also a follow-up by @zlatanov that reduced the memory footprint of the WebSocket when compression is disabled by about 27%.

Enabling the compression from the client side is easy; see the example below. However, please bear in mind that the server can negotiate the settings, e.g. request a smaller window or deny compression completely.

```csharp
var cws = new ClientWebSocket();
cws.Options.DangerousDeflateOptions = new WebSocketDeflateOptions()
{
    ClientMaxWindowBits = 10,
    ServerMaxWindowBits = 10
};
```

WebSocket compression support was also recently added to ASP.NET Core (dotnet/aspnetcore#2715), but it will only be part of the upcoming previews.
The [RequiresPreviewFeatures] attribute was created in S.P.CoreLib. User Story: https://github.com/dotnet/designs/blob/main/accepted/2021/preview-features/preview-features.md. While the story is not complete yet, work is underway to implement the analyzer and SDK work needed to fully flesh out the feature. This brings .NET one step closer to allowing developers to ship preview features in their apps.
System.Diagnostics Metrics support
User Story: dotnet/runtime#44445
The newly exposed APIs implement the OpenTelemetry Metrics API specification. The Metrics APIs are designed explicitly for processing raw measurements, generally with the intent to produce continuous summaries of those measurements, efficiently and simultaneously. The APIs include the Meter and MeterListener classes shown in the examples below.

Library measurement recording example:

```csharp
Meter meter = new Meter("io.opentelemetry.contrib.mongodb", "v1.0");
Counter<int> counter = meter.CreateCounter<int>("Requests");
counter.Add(1);
counter.Add(1, KeyValuePair.Create<string, object>("request", "read"));
```

Listening example:

```csharp
MeterListener listener = new MeterListener();
listener.InstrumentPublished = (instrument, meterListener) =>
{
    if (instrument.Name == "Requests" && instrument.Meter.Name == "io.opentelemetry.contrib.mongodb")
    {
        meterListener.EnableMeasurementEvents(instrument, null);
    }
};
listener.SetMeasurementEventCallback<int>((instrument, measurement, tags, state) =>
{
    Console.WriteLine($"Instrument: {instrument.Name} has recorded the measurement {measurement}");
});
listener.Start();
```

The OpenTelemetry .NET implementation will depend on these exposed APIs to support end-to-end Metrics observability scenarios.
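Counters are only one instrument type. As a hedged sketch (the meter and instrument names here are hypothetical, and the histogram and observable-gauge factory methods are assumed from the same Meter API), recording a distribution and a lazily sampled value might look like:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;

// Hypothetical meter/instrument names, for illustration only.
Meter meter = new Meter("io.example.store", "v1.0");

// A histogram aggregates a distribution of values, e.g. request latencies.
Histogram<double> latency = meter.CreateHistogram<double>("request-latency", unit: "ms");
latency.Record(12.5);
latency.Record(7.3, KeyValuePair.Create<string, object>("route", "/items"));

// An observable gauge is sampled lazily, only when a listener collects.
ObservableGauge<long> queueLength = meter.CreateObservableGauge<long>(
    "queue-length", () => 42L);
```

A MeterListener collects observable instruments on demand via its RecordObservableInstruments method, rather than on every measurement.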
CodeGen
Community contributions (@SingleAccretion):
- Dynamic PGO: dotnet/runtime#43618
- JIT Loop Optimizations: dotnet/runtime#43549
- LSRA: dotnet/runtime#43318
- Keep Structs in Register: dotnet/runtime#43867
- Optimizations & Debugging experience
Package Validation
User Story: #5700
Package Validation tooling will allow library developers to validate that their packages are consistent and well-formed. This involves validating that there are no breaking changes across versions and that the package has the same set of public APIs for all of its runtime-specific implementations. It will also help developers catch any applicability holes.
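The announcement doesn't show how the tooling is configured. As an assumption-labeled sketch based on MSBuild conventions, enabling validation in a project file might look like this (property names may differ in this preview):

```xml
<PropertyGroup>
  <!-- Assumed property: run package validation when packing. -->
  <EnablePackageValidation>true</EnablePackageValidation>
  <!-- Assumed property: compare against a previously shipped version
       to detect breaking changes across versions. -->
  <PackageValidationBaselineVersion>1.0.0</PackageValidationBaselineVersion>
</PropertyGroup>
```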
Diagnostics (EventPipe/DiagnosticsServer) - MonoVM
A lot of diagnostics features have been added to MonoVM since the beginning of .NET 6, enabling features like managed EventSource/EventListener, EventPipe and DiagnosticsServer, and opening up capabilities to use standard diagnostics tooling. The following things are currently included/done so far; this is far from an exhaustive list, but it should highlight the bigger items. We will continue to include more features going forward, primarily focusing on SDK integration and adapting more native runtime events.

[Figure: part of an iOS startup CPU sampling session viewed in SpeedScope]
[Figure: Android CPU sampling viewed in PerfView (main thread in infinite sleep)]

What's next (P6/P7):
Networking: SOCKS proxy support
Issue: dotnet/runtime#17740
Adding support for SOCKS proxies (SOCKS4, SOCKS4a, SOCKS5) has been a long-standing feature request from the community (the issue is 5 years old). Among other things, it enables users to test external connections via SSH or connect to the Tor network from .NET directly. The implementation came from a community contributor, @huoyaoyuan! It does not impact the public API, but WebProxy now accepts socks schemes:

```csharp
var handler = new HttpClientHandler
{
    Proxy = new WebProxy("socks5://127.0.0.1", 9050)
};
var httpClient = new HttpClient(handler);
```
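As a hedged variation on the example above (not shown in the announcement), the process-wide default proxy can also point at a SOCKS endpoint, so every HttpClient created without an explicit handler uses it:

```csharp
using System.Net;
using System.Net.Http;

// Route all default HttpClient traffic through a local SOCKS5 proxy.
// 127.0.0.1:9050 is Tor's conventional SOCKS port; adjust for your setup.
HttpClient.DefaultProxy = new WebProxy("socks5://127.0.0.1", 9050);

var client = new HttpClient(); // picks up DefaultProxy automatically
```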
Microsoft.Extensions.Hosting

Hosting - ConfigureHostOptions API
Prior to Preview 5, configuring the host options (e.g. the shutdown timeout) looked like this:

```csharp
using var host = new HostBuilder()
    .ConfigureServices(services =>
    {
        services.Configure<HostOptions>(o =>
        {
            o.ShutdownTimeout = TimeSpan.FromMinutes(10);
        });
    })
    .Build();
host.Run();
```

To make this setup less convoluted, in Preview 5 we added a new ConfigureHostOptions API on IHostBuilder:

```csharp
using var host = new HostBuilder()
    .ConfigureHostOptions(o =>
    {
        o.ShutdownTimeout = TimeSpan.FromMinutes(10);
    })
    .Build();
host.Run();
```

Microsoft.Extensions.DependencyInjection

Dependency Injection - CreateAsyncScope APIs
You might have noticed that disposing a service scope wrapped in a using statement throws an InvalidOperationException when it happens to contain a service that only implements IAsyncDisposable:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;

await using var provider = new ServiceCollection()
    .AddScoped<Foo>()
    .BuildServiceProvider();

// Throws InvalidOperationException on Dispose: Foo only implements IAsyncDisposable.
using (var scope = provider.CreateScope())
{
    var foo = scope.ServiceProvider.GetRequiredService<Foo>();
}

class Foo : IAsyncDisposable
{
    public ValueTask DisposeAsync() => default;
}
```

We work around this today by casting the returned scope to IAsyncDisposable:

```csharp
var scope = provider.CreateScope();
var foo = scope.ServiceProvider.GetRequiredService<Foo>();
await ((IAsyncDisposable)scope).DisposeAsync();
```

To mitigate such issues, the Preview 5 release adds new CreateAsyncScope APIs that simplify this snippet to:

```csharp
await using (var scope = provider.CreateAsyncScope())
{
    var foo = scope.ServiceProvider.GetRequiredService<Foo>();
}
```
Compile-time source generation for System.Text.Json
User stories: dotnet/runtime#1568, dotnet/runtime#45441
Dedicated preview 5 blog post: https://aka.ms/JsonSourceGenPreview5

Background
Source generators allow developers to generate C# source files that can be added to an assembly during the course of compilation. Generating source code at compile time can provide many benefits to .NET applications, including improved performance. In .NET 6, we are releasing a new source generator as part of System.Text.Json. The JSON source generator works in conjunction with JsonSerializer.

Generating optimized serialization logic
We have defined a set of attributes to configure the source generator. Given a simple type:

```csharp
namespace Test
{
    internal class JsonMessage
    {
        public string Message { get; set; }
    }
}
```

We can configure the source generator to generate serialization logic for instances of this type, given some pre-defined serializer options:

```csharp
using System.Text.Json.Serialization;

namespace Test
{
    [JsonSerializerOptions(
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingDefault,
        IgnoreRuntimeCustomConverters = true,
        NamingPolicy = JsonKnownNamingPolicy.BuiltInCamelCase)]
    [JsonSerializable(typeof(JsonMessage), GenerationMode = JsonSourceGenerationMode.Serialization)]
    internal partial class JsonContext : JsonSerializerContext
    {
    }
}
```

The source generator would then augment the context class with the following shape:

```csharp
internal partial class JsonContext : JsonSerializerContext
{
    public static JsonContext Default { get; }
    public JsonTypeInfo<JsonMessage> JsonMessage { get; }
    public JsonContext(JsonSerializerOptions options) { }
    public override JsonTypeInfo GetTypeInfo(Type type) => throw null;
}
```

To serialize instances of this type and get the best possible performance, one could invoke the generated code as follows:

```csharp
using MemoryStream ms = new();
using Utf8JsonWriter writer = new(ms);

JsonContext.Default.JsonMessage.Serialize(writer, new JsonMessage { Message = "Hello, world!" });
writer.Flush();

// Writer contains:
// {"message":"Hello, world!"}
```

The generated code can also be passed to JsonSerializer:

```csharp
JsonSerializer.Serialize(jsonMessage, JsonContext.Default.JsonMessage);
JsonSerializer.Serialize(jsonMessage, typeof(JsonMessage), JsonContext.Default);
```

This source generation mode is only available for serialization, not deserialization. A mode that generates optimized deserialization logic may be provided in the future.

Generating type-metadata initialization logic
In some scenarios, we may want the benefits of JSON source generation, but our serialization feature requirements might not be compatible with what can be honored in the mode that generates serialization logic. For instance, reference handling and async serialization are two features for which the source generator does not provide optimized serialization logic. The generator provides a different mode where, instead of generating serialization logic, we can generate type-metadata initialization logic. This mode can provide all the aforementioned benefits of source generation, with the exception of improved serialization throughput. It can also provide benefits when deserializing JSON payloads.

In previous versions of System.Text.Json, serialization metadata could only be computed at runtime, during the first serialization or deserialization routine of every type in any object graph passed to the serializer. At a high level, this metadata includes delegates to constructors, property setters and getters, along with user options indicated at both runtime and design time (e.g. whether to ignore a property value when it is null). After this metadata is generated, the serializer performs the actual serialization and deserialization. The generation phase is based on reflection and is computationally expensive in terms of both throughput and allocations. We can refer to this phase as the serializer's "warm-up" phase. With the type-metadata initialization mode, we shift this runtime metadata generation to compile time, substantially reducing the cost of the first serialization or deserialization procedures. The metadata is generated into the compiling assembly, where it can be initialized and passed directly to the serializer.

We configure this mode for the source generator in a similar way to what's shown above, except that we don't specify features ahead of time, and we change the generation mode:

```csharp
using System.Text.Json.Serialization;

namespace Test
{
    [JsonSerializable(typeof(JsonMessage), GenerationMode = JsonSourceGenerationMode.Metadata)]
    internal partial class JsonContext : JsonSerializerContext
    {
    }
}
```

The generator augments the partial context class with the same shape as above. We can then perform serialization using advanced features like reference handling, and still measure and observe performance improvements:

```csharp
JsonSerializerOptions options = new()
{
    ReferenceHandler = ReferenceHandler.Preserve,
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
};
JsonContext context = new(options);

string json = JsonSerializer.Serialize(jsonMessage, context.JsonMessage);
// {"$id":"1","message":"Hello, world!"}
```

Generating both serialization logic and metadata initialization logic
In some cases, we want the generator to generate both serialization and metadata-initialization logic. For instance, we might only need features compatible with the serialization-logic mode on serialization, but also wish to see some perf improvements on deserialization, including decreased app size after ILLinker trimming. We'd configure the generator as follows:

```csharp
using System.Text.Json.Serialization;

namespace Test
{
    [JsonSerializerOptions(
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingDefault,
        IgnoreRuntimeCustomConverters = true,
        NamingPolicy = JsonKnownNamingPolicy.BuiltInCamelCase)]
    [JsonSerializable(typeof(JsonMessage), GenerationMode = JsonSourceGenerationMode.MetadataAndSerialization)]
    internal partial class JsonContext : JsonSerializerContext
    {
    }
}
```

Again, the generated API shape remains the same. We can use the generated source as follows:

```csharp
// Serializer invokes pre-generated serialization logic for increased throughput and other benefits.
string json = JsonSerializer.Serialize(jsonMessage, JsonContext.Default.JsonMessage);

// Serializer uses pre-generated type metadata and avoids the warm-up stage for deserialization, alongside other benefits.
JsonMessage message = JsonSerializer.Deserialize(json, JsonContext.Default.JsonMessage);
```

How to consume the JSON source generator
You can try out the source generator in Preview 5 by using the latest preview bits of the System.Text.Json NuGet package. We are working on a proposal for shipping source generators in-box, starting in .NET 6.0. The implementation will be ready by the time we ship .NET 6.0.
Add platform guard attributes to allow custom guards in the Platform Compatibility Analyzer
User story: dotnet/runtime#44922
The CA1416 Platform Compatibility analyzer already recognizes platform guards using the methods in OperatingSystem/RuntimeInformation, such as OperatingSystem.IsWindows() and OperatingSystem.IsLinux(). To allow custom guard possibilities, we added the new attributes SupportedOSPlatformGuard and UnsupportedOSPlatformGuard.

Usage examples:

```csharp
[UnsupportedOSPlatformGuard("browser")] // The platform guard attribute
#if TARGET_BROWSER
internal bool IsSupported => false;
#else
internal bool IsSupported => true;
#endif

[UnsupportedOSPlatform("browser")]
void ApiNotSupportedOnBrowser() { }

void M1()
{
    ApiNotSupportedOnBrowser();  // Warns: This call site is reachable on all platforms. 'ApiNotSupportedOnBrowser()' is unsupported on: 'browser'

    if (IsSupported)
    {
        ApiNotSupportedOnBrowser();  // Does not warn
    }
}

[SupportedOSPlatform("Windows")]
[SupportedOSPlatform("Linux")]
void ApiOnlyWorkOnWindowsLinux() { }

[SupportedOSPlatformGuard("Linux")]
[SupportedOSPlatformGuard("Windows")]
private readonly bool _isWindowOrLinux = OperatingSystem.IsLinux() || OperatingSystem.IsWindows();

void M2()
{
    ApiOnlyWorkOnWindowsLinux();  // Warns: This call site is reachable on all platforms. 'ApiOnlyWorkOnWindowsLinux()' is only supported on: 'Linux', 'Windows'.

    if (_isWindowOrLinux)
    {
        ApiOnlyWorkOnWindowsLinux();  // Does not warn
    }
}
```
Is it released?
No. It was supposed to release today. We're hoping for Thursday.
I'm confused about how to do diagnostics in .NET any more.
And now there is a new Metrics API; what should be used for libraries?
@John0King there are docs at https://docs.microsoft.com/en-us/dotnet/core/diagnostics/ which are helpful for understanding which class to use in which scenario.
Where are you seeing this?
@tarekgh The library or framework must write code to support those "check points", and now there are 3 choices:
The problem is that you are not asking the library author to use them all, right? And this is what confuses me! For example:

```csharp
// just an example, don't take it seriously
public async Task LibraryDoSomething()
{
    using (Tracing.Begin("MyLibaray.Key"))
    {
        await doTheWorkAsync();
    }
}
```

there is the information:
and it seems now that I use
What's new in .NET 6 Preview 5
This issue is for teams to highlight work for the community that will release .NET 6 Preview 5.
To add content, use a new conversation entry. The entry should include the team name and feature title as the first line as shown in the template below.
Preview 1: #5853
Preview 2: #5889
Preview 3: #5890
Preview 4: #6098
Preview 5: #6099