
Prevent excessive memory allocation while decoding #30

Merged · 3 commits into Cistern:master · Dec 10, 2015

Conversation

@breml (Contributor) commented Dec 10, 2015

Enforce reasonable maximum values for record and header length while decoding sflow packets.

Fixes #29

Commits:
- Enforce reasonable maximum values for record and header length while decoding sflow packets.
- Use maximum header size from sflow reference implementation
// MaximumHeaderLength prevents excessive memory allocation while decoding.
// The value is derived from INM_MAX_HEADER_SIZE (256) in the sFlow reference
// implementation: https://github.com/sflow/sflowtool/blob/bd3df6e11bdf8261a42734c619abfe8b46e1202f/src/sflowtool.h#L28
MaximumHeaderLength = 256
Member commented:
I'm not sure about this one. I typically sample more than 256 bytes, and Brocade allows you to capture up to 1300 bytes (see Specifying the maximum flow sample size).

I think 1300 is a better choice.

breml (Contributor, author) replied:

I could not find any information about a maximum in the sFlow specification. My initial value was 1500, and I corrected it to 256 in the second commit based on the sflowtool source code. I am fine with either 1300 or 1500.
The main issue is that the header length field in an sFlow packet is a uint32, so the maximum value would be 0xFFFFFFFF, i.e. 4,294,967,295 bytes. That is far too much to pass to make([]byte, f.HeaderSize+padding).
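The defense being discussed can be sketched as a bounds check on the attacker-controlled length field before allocating. The following is a minimal, self-contained Go sketch with hypothetical names (`decodeHeader`, `ErrHeaderTooLong`); the library's actual decoder API differs:

```go
package main

import (
	"errors"
	"fmt"
)

// MaximumHeaderLength caps the sampled-header size the decoder will accept.
// 1500 matches a full Ethernet MTU; the sFlow spec itself imposes no limit.
const MaximumHeaderLength = 1500

var ErrHeaderTooLong = errors.New("sflow: header length exceeds maximum")

// decodeHeader validates the uint32 length taken from the wire before
// allocating, instead of trusting it blindly.
func decodeHeader(headerSize uint32, payload []byte) ([]byte, error) {
	if headerSize > MaximumHeaderLength {
		// Reject rather than calling make() with an unbounded size,
		// which could allocate up to ~4 GiB per record.
		return nil, ErrHeaderTooLong
	}
	padding := (4 - headerSize%4) % 4 // sFlow fields are 4-byte aligned
	buf := make([]byte, headerSize+padding)
	copy(buf, payload)
	return buf, nil
}

func main() {
	_, err := decodeHeader(0xFFFFFFFF, nil)
	fmt.Println(err) // sflow: header length exceeds maximum
	buf, _ := decodeHeader(6, []byte{1, 2, 3, 4, 5, 6})
	fmt.Println(len(buf)) // 8 (6 bytes + 2 bytes padding)
}
```

Because `headerSize` is bounded before the arithmetic, the `headerSize+padding` sum cannot overflow either.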

Member replied:

I agree, let's not make it too high :). I'll let you choose either 1300 or 1500.

@Preetam (Member) commented Dec 10, 2015

LGTM besides what I described in the comment. Thanks for your help! 👍

@breml (Contributor, author) commented Dec 10, 2015

I just reverted my last commit, so we are back to 1500 for MaximumHeaderLength.

Preetam added a commit that referenced this pull request Dec 10, 2015
Prevent excessive memory allocation while decoding
@Preetam Preetam merged commit 23065cb into Cistern:master Dec 10, 2015
@breml breml deleted the go-fuzz-excessive-memory-allocation branch December 10, 2015 15:18