
Document how to size the direct memory under expected event size #476

Open · wants to merge 3 commits into main from fix/describe_how_to_size_direct_memory

Conversation

@andsel (Contributor) commented Jul 25, 2023

Release notes

[rn:skip]

What does this PR do?

Improve the documentation of how this plugin allocates direct memory, so that users can do more accurate capacity planning.

Why is it important/What is the impact to the user?

Given some statistical measures of the incoming traffic, the user can check whether the direct memory is sized correctly and so avoid potential out-of-memory errors in the direct memory.

Checklist

  • My code follows the style guidelines of this project
  • [ ] I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • [ ] I have made corresponding changes to the default configuration files (and/or docker env variables)
  • [ ] I have added tests that prove my fix is effective or that my feature works

@andsel andsel self-assigned this Jul 25, 2023
@andsel andsel added the docs label Jul 27, 2023
@andsel andsel marked this pull request as ready for review July 27, 2023 08:06
@andsel andsel requested a review from robbavey July 27, 2023 08:06
@andsel andsel marked this pull request as draft September 22, 2023 14:12
@andsel andsel force-pushed the fix/describe_how_to_size_direct_memory branch from ab84f1a to ecaf36a on September 25, 2023 09:52
…ine how many batches are in flight, but the number of connected channels.

To correctly size the direct memory to sustain the flow of incoming Beats connections, the average size of the transmitted
log lines and the batch size used by Beats (which defaults to 2048) have to be known. For each connected client, a batch of events
is read and, due to the way the decompressing and decoding steps work, two copies of the batch are kept in memory.
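
To make this arithmetic concrete, the sketch below estimates the required direct memory from the quantities described above. The class and method names and the example values (500 connected clients, 1 KiB average event size) are illustrative assumptions, not part of the plugin; only the default batch size of 2048 and the factor of two for the in-memory copies come from the text above.

```java
// Rough estimate of the direct memory needed by the Beats input, following the
// description above: each connected client has one batch in flight, and the
// decompress/decode path keeps two copies of that batch in memory.
public final class BeatsDirectMemoryEstimate {

    // Illustrative helper (not part of the plugin's API).
    static long estimateDirectMemoryBytes(int connectedClients,
                                          int batchSize,          // Beats default: 2048 events
                                          long avgEventSizeBytes) {
        long bytesPerBatch = (long) batchSize * avgEventSizeBytes;
        // Two copies of each in-flight batch are held while decompressing/decoding.
        return (long) connectedClients * bytesPerBatch * 2;
    }

    public static void main(String[] args) {
        // Assumed figures: 500 connected Beats clients, default batch of 2048 events,
        // average event size of 1 KiB.
        long bytes = estimateDirectMemoryBytes(500, 2048, 1024);
        System.out.printf("Estimated direct memory: %.2f GiB%n",
                bytes / (1024.0 * 1024 * 1024));
    }
}
```

With these assumed figures the estimate comes out to roughly 2 GiB, which can then be compared against the JVM's direct memory limit (for example, the value passed via -XX:MaxDirectMemorySize).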

Note for reviewer:
the Beats decoding part keeps two copies of the buffer it is processing in memory:
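
As a rough illustration (figures assumed, not taken from the PR): with the default batch of 2048 events averaging 1 KiB each, those two copies amount to about 2048 × 1 KiB × 2 ≈ 4 MiB of direct memory per client while its batch is being decoded.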

@andsel andsel marked this pull request as ready for review September 25, 2023 10:47