Prevent excessive memory allocation while decoding #30
Conversation
Enforce reasonable maximum values for record and header length while decoding sflow packets. Fixes #29.
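The same bound-before-allocate pattern applies to both caps mentioned in the description. Below is a minimal sketch of how a record-length check might look before allocating the record body; MaximumRecordLength, decodeRecordData, and the value 4096 are hypothetical names and numbers chosen for illustration, not taken from this repository.

package sflow

import (
	"encoding/binary"
	"fmt"
	"io"
)

// MaximumRecordLength is an illustrative cap for a length-prefixed record;
// the name and the value are assumptions, not taken from this repository.
const MaximumRecordLength = 4096

// decodeRecordData reads one length-prefixed record body and refuses lengths
// above the cap, so a malformed packet cannot trigger a huge allocation.
func decodeRecordData(r io.Reader) ([]byte, error) {
	var length uint32
	if err := binary.Read(r, binary.BigEndian, &length); err != nil {
		return nil, err
	}
	if length > MaximumRecordLength {
		return nil, fmt.Errorf("record length %d exceeds maximum %d", length, MaximumRecordLength)
	}
	buf := make([]byte, length)
	if _, err := io.ReadFull(r, buf); err != nil {
		return nil, err
	}
	return buf, nil
}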
Use maximum header size from sflow reference implementation
// This maximum prevents excessive memory allocation while decoding.
// The value is derived from INM_MAX_HEADER_SIZE (256) in the sflow reference implementation:
// https://github.com/sflow/sflowtool/blob/bd3df6e11bdf8261a42734c619abfe8b46e1202f/src/sflowtool.h#L28
MaximumHeaderLength = 256
I'm not sure about this one. I typically sample more than 256 bytes, and Brocade allows you to capture up to 1300 (see Specifying the maximum flow sample size).
I think 1300 is a better choice.
I could not find any information about a maximum in the sflow specification. My initial setting was 1500, and I corrected it to 256 in the second commit based on the sflowtool source code. I am fine with 1300 or 1500.
The main issue is that the header length field in the sflow packet is a uint32, so the maximum would be 0xFFFFFFFF, i.e. 4,294,967,295 bytes, which is far too much for the make([]byte, f.HeaderSize+padding) allocation.
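To make the danger concrete, here is a minimal sketch of the guard being discussed, assuming a simplified decoder: MaximumHeaderLength is the constant from this PR (shown with the post-revert value of 1500), and f.HeaderSize plus the padded make() call follow the snippet quoted above, while the struct name, the Header field, and decodeHeaderBytes are hypothetical.

package sflow

import (
	"fmt"
	"io"
)

// MaximumHeaderLength is the cap discussed in this PR; the conversation
// eventually settles on 1500 after the 256 value is reverted.
const MaximumHeaderLength = 1500

// rawPacketHeader mirrors the fields referenced above (HeaderSize and the
// padded byte slice); the struct name and Header field are assumptions.
type rawPacketHeader struct {
	HeaderSize uint32
	Header     []byte
}

// decodeHeaderBytes rejects oversized values before allocating. HeaderSize is
// a uint32 read from the wire, so without the guard a crafted packet could
// request a ~4 GiB allocation via make([]byte, f.HeaderSize+padding).
func (f *rawPacketHeader) decodeHeaderBytes(r io.Reader) error {
	if f.HeaderSize > MaximumHeaderLength {
		return fmt.Errorf("header size %d exceeds maximum %d", f.HeaderSize, MaximumHeaderLength)
	}
	// XDR pads opaque data to a 4-byte boundary.
	padding := (4 - f.HeaderSize%4) % 4
	buf := make([]byte, f.HeaderSize+padding)
	if _, err := io.ReadFull(r, buf); err != nil {
		return err
	}
	f.Header = buf[:f.HeaderSize]
	return nil
}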
I agree, let's not make it too high :). I'll let you choose either 1300 or 1500.
LGTM besides what I described in the comment. Thanks for your help! 👍
This reverts commit 0776769.
I just reverted my last commit, so we are back to 1500 for MaximumHeaderLength.