
Add IAsyncDisposable support, drop obsolete inheritance-based APIs #56

Merged · 5 commits · Sep 12, 2022

Conversation

@nblumhardt (Member) commented Sep 12, 2022

Enables asynchronous flushing of queued batches when disposed via IAsyncDisposable.DisposeAsync(). Batched sinks will now also be disposed via DisposeAsync() if they support it.

See also: serilog/serilog#1750

Dropping subclassing support

The interactions of Dispose(), Dispose(bool), DisposeAsync(), and DisposeAsyncCore() make retrofitting support in a base class like PeriodicBatchingSink error-prone, opening the door to skipped disposal or double-disposal in base classes not explicitly written for async disposal.
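For context, retrofitting async disposal onto a subclassable type means layering the documented DisposeAsyncCore() pattern on top of the classic Dispose(bool) pattern; a subclass that overrides one virtual method but not the other can easily skip or double-run cleanup. A minimal sketch of the combined pattern, following the standard .NET guidance (ExampleSink is a hypothetical class, not part of this PR):

```csharp
using System;
using System.Threading.Tasks;

public class ExampleSink : IDisposable, IAsyncDisposable
{
    public void Dispose()
    {
        // Classic pattern: synchronous cleanup of managed + unmanaged state.
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }

    public async ValueTask DisposeAsync()
    {
        // Async pattern: asynchronous cleanup first, then release any
        // unmanaged state synchronously without re-disposing managed state.
        await DisposeAsyncCore().ConfigureAwait(false);
        Dispose(disposing: false);
        GC.SuppressFinalize(this);
    }

    // Subclasses must coordinate overrides of BOTH of these correctly —
    // the interaction that makes retrofitting a base class error-prone.
    protected virtual void Dispose(bool disposing) { }
    protected virtual ValueTask DisposeAsyncCore() => default;
}
```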

Since subclassing of PeriodicBatchingSink was already obsoleted, this PR skips the whole unfruitful mess and simply takes the next step by removing subclassing support altogether. Major version bumped to 3.0.0.

Migrating existing code to the new API is easy: instead of deriving from PeriodicBatchingSink, consumers implement IBatchedLogEventSink and wrap an instance of the batched sink in a new PeriodicBatchingSink(batchedSink, options).
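The migration might look like the following sketch (MyBatchedSink is a hypothetical consumer type; the IBatchedLogEventSink members shown match the shape discussed in this PR, but check the released 3.0.0 API):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Serilog.Events;
using Serilog.Sinks.PeriodicBatching;

// Implement IBatchedLogEventSink instead of deriving from PeriodicBatchingSink.
class MyBatchedSink : IBatchedLogEventSink
{
    public Task EmitBatchAsync(IEnumerable<LogEvent> batch)
    {
        // Send the batch to the remote target here.
        return Task.CompletedTask;
    }

    public Task OnEmptyBatchAsync() => Task.CompletedTask;
}

// Wrap the batched sink to get the periodic batching behavior:
var options = new PeriodicBatchingSinkOptions { BatchSizeLimit = 100 };
var sink = new PeriodicBatchingSink(new MyBatchedSink(), options);
```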

@skomis-mm (Contributor) left a comment:

Great! GitHub doesn't allow putting comments on unmodified blocks 🤦.

1. .ConfigureAwait(false) should now be placed within the OnTick body:

   await _batchedLogEventSink.OnEmptyBatchAsync();
   await _batchedLogEventSink.EmitBatchAsync(_waitingBatch);

   since it is called from DisposeAsync.

2. Bump AssemblyVersion to 3.0.0?
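Applied inside OnTick, the first point might look like this (a sketch; _batchedLogEventSink and _waitingBatch are the fields quoted above):

```csharp
// Inside PeriodicBatchingSink.OnTick(): because OnTick is also awaited from
// DisposeAsync(), avoid capturing the caller's SynchronizationContext so a
// sync-over-async downstream implementation can't deadlock on it.
await _batchedLogEventSink.OnEmptyBatchAsync().ConfigureAwait(false);
// ...
await _batchedLogEventSink.EmitBatchAsync(_waitingBatch).ConfigureAwait(false);
```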

{
}
// This is the place where SynchronizationContext.Current is unknown and can be != null
// so we prevent possible deadlocks here for sync-over-async downstream implementations.
Contributor:

This comment refers to the synchronous Dispose and can be removed.

// ReSharper disable once MethodHasAsyncOverload
EmitBatch(events);
if (_batchedLogEventSink is IAsyncDisposable asyncDisposable)
await asyncDisposable.DisposeAsync();
Contributor:
.ConfigureAwait(false)
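Applied to the quoted call, the suggestion is simply:

```csharp
// Don't capture the SynchronizationContext when disposing the inner sink.
if (_batchedLogEventSink is IAsyncDisposable asyncDisposable)
    await asyncDisposable.DisposeAsync().ConfigureAwait(false);
```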

</PropertyGroup>

<ItemGroup>
<PackageReference Include="Serilog" Version="2.0.0" />
<PackageReference Include="Serilog" Version="2.12.0-dev-*" />
Contributor:

Technically this is not needed and can be released independently from the core package. End users would need to update their code anyway by upgrading to the new Serilog version.
But I am ok with this change.

nblumhardt (Member, Author):

Thanks for the great review 👍

I rolled this change back; my initial thinking was that downstream sinks relying on async disposal would have a better chance of getting support via the app if the version of Serilog itself supported async disposal, but then I realised that most of the time this will need to come via the ASP.NET (and other) hosting APIs, so pulling in an updated Serilog isn't really important here.

@nblumhardt (Member, Author):

Think I've got everything; I'll merge this and queue up a release since we'll want it to go out in sync with Serilog 2.12.0.
