
High memory usage when using many streams over extended time #232

Closed
Espen-Kalhagen-Element-Logic opened this issue Nov 30, 2022 · 3 comments


Espen-Kalhagen-Element-Logic commented Nov 30, 2022

Describe the bug
We have a project that uses a high number of short-lived streams over an extended period of time. This causes memory usage to grow so much that we have to restart the service once a day to keep it in check. The growth seems to stem from the fact that we register a single EventStoreClient as a singleton in DI, and the client does not appear to completely clean up after each stream has finished.
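
Roughly, the registration looks like this (a minimal sketch; the connection string and host setup below are placeholders, assuming a standard .NET generic host):

```csharp
// Program.cs - minimal sketch of the singleton registration described above.
// The connection string is a placeholder.
using EventStore.Client;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // A single client instance for the whole process, as the docs recommend.
        var settings = EventStoreClientSettings.Create("esdb://localhost:2113?tls=false");
        services.AddSingleton(new EventStoreClient(settings));
    });

await builder.Build().RunAsync();
```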

We can see that the memory growth comes from the underlying gRPC client, where a HashSet of ActiveCalls keeps growing. These calls are supposed to be cleaned up when the gRPC client is disposed, but as far as I can tell the EventStore client never does that. For more information, see Config/Logs/Screenshots below. We have tried several other lifetimes for the EventStoreClient, but those cause other problems.

We can see in the EventStore database logs that the subscriptions are being disposed on that end so we know we are disconnecting from them.

To Reproduce
Steps to reproduce the behavior:

  1. Register EventStoreClient as a singleton, as recommended in the documentation.
  2. Subscribe to a very high number of streams over an extended time.
  3. Cancel the CancellationToken passed to the stream subscription and let the subscription be garbage collected.
  4. Watch memory usage of service grow.

I have created a minimal example that reproduces the behavior. It is meant to be run as two instances, one master and one slave. It creates a large number of streams and plays ping pong between the services.

The master creates a stream by publishing a ping event and then subscribes to that stream. The slave subscribes to the stream, waits for the ping event, and publishes a pong event. When the master receives the pong, it cancels the subscription and starts the process again.

This can be found here: Test project reproducing problem. Specifically, look at Orchestrator.cs for the behavior.
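
For illustration, here is a simplified sketch of the master's loop (this is not the actual Orchestrator.cs; the stream names, event types, and exact overloads are assumptions based on the 22.x client):

```csharp
// Simplified sketch of the master side: append a ping, subscribe to the
// stream, and cancel once a pong arrives. Repeated for many stream names,
// this is the pattern that shows the memory growth.
using System.Text;
using EventStore.Client;

async Task PingPongOnceAsync(EventStoreClient client, string streamName)
{
    using var cts = new CancellationTokenSource();
    var pongReceived = new TaskCompletionSource();

    // Publish the ping event that creates the stream.
    var ping = new EventData(Uuid.NewUuid(), "ping", Encoding.UTF8.GetBytes("{}"));
    await client.AppendToStreamAsync(streamName, StreamState.Any, new[] { ping });

    // Subscribe and wait for the slave's pong.
    var subscription = await client.SubscribeToStreamAsync(
        streamName,
        FromStream.Start,
        (sub, resolvedEvent, ct) =>
        {
            if (resolvedEvent.Event.EventType == "pong")
                pongReceived.TrySetResult();
            return Task.CompletedTask;
        },
        cancellationToken: cts.Token);

    await pongReceived.Task;

    // Cancel the token and let the subscription be garbage collected,
    // matching the reproduction steps above (the subscription is never disposed).
    cts.Cancel();
}
```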

Expected behavior
The memory usage does not grow over time even when subscribing to many new streams.

Actual behavior
Memory usage grows over time as subscriptions are not cleaned up completely.

Config/Logs/Screenshots

Here is a screenshot from dotMemory showing high memory usage because a lot of old ActiveCalls are still in memory.
[dotMemory snapshot: high memory usage from retained ActiveCalls]

A plot of how memory usage increases with the number of streams, collected with dotnet-counters:
[Plot: memory usage vs. number of streams]

dotnet-dump files that showcase the high memory usage of the minimal example can be found here; in particular, the dumps at 97 000 streams show high memory usage.

EventStore details

  • EventStore server version: 21.10.0

  • Operating system: Windows 10 / Ubuntu 18.04

  • EventStore client version (if applicable): grpc client 22.0.0

Additional context
If there is another way to use the client that does not lead to memory growth like in the example above, any pointers or other help would be highly appreciated.


ylorph commented Dec 6, 2022

Thanks for the report & reproduction @Espen-Kalhagen-Element-Logic.
We'll be looking into it.

ylorph commented Dec 6, 2022


timothycoleman commented Dec 9, 2022

Added more information and a workaround to the Stack Overflow question: https://stackoverflow.com/questions/74685529/how-do-i-use-event-store-db-client-without-continued-memory-usage-growth
The root cause is here: #219
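
For reference, a minimal sketch of one mitigation along those lines: using a short-lived client for a batch of work and disposing it, assuming that disposing the client releases its underlying gRPC channel and accumulated calls. This illustrates the general idea only, not necessarily the exact workaround from the linked answer.

```csharp
// Sketch of a mitigation: create a client per batch of subscriptions and
// dispose it afterwards, so the calls it accumulated are released with the
// client instead of living for the process lifetime.
using EventStore.Client;

async Task RunBatchAsync(EventStoreClientSettings settings, IEnumerable<string> streams)
{
    using var client = new EventStoreClient(settings);

    foreach (var stream in streams)
    {
        // ... subscribe, process, and cancel as in the reproduction above ...
    }

    // Disposing the client here cleans up the gRPC client it wraps.
}
```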
