encoding/base32: Decoder with NoPadding doesn't fill the passed buffer #65166
Comments
Something similar seems to also be happening in package base64; that one at least does not vary depending on the decoded size. I'll wait for this issue to be triaged before creating a new issue and investigating base64.
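(For reference, a minimal probe of the base64 behavior described above might look like the following. This is my own sketch, not code from the comment; `RawStdEncoding` stands in for an unpadded base64 encoding, and the payload string is arbitrary.)

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

func main() {
	enc := base64.RawStdEncoding // base64 without padding
	encoded := enc.EncodeToString([]byte("some example payload"))

	// Buffer sized to DecodedLen, the documented maximum decoded size.
	buf := make([]byte, enc.DecodedLen(len(encoded)))

	dec := base64.NewDecoder(enc, strings.NewReader(encoded))
	n, err := dec.Read(buf)
	fmt.Printf("single Read filled %d of %d bytes (err=%v)\n", n, len(buf), err)
}
```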
This has been failing as far back as Go 1.11, so it is not a regression (relevant for the upcoming Go 1.22 release).
cc @rsc
If unpadded content was passed, on some occasions content was omitted, because the division result was floored. Ceiling it makes sure all content is always read. Fixes golang#65166 Signed-off-by: Niklas Ott <[email protected]>
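In code terms, the fix the commit message describes amounts to something like the following (a sketch of the idea, not the literal CL; `chunkLen` is a hypothetical helper name):

```go
// chunkLen computes how many encoded bytes to request when decoding into a
// buffer of pLen bytes (base32: 8 encoded bytes yield 5 decoded bytes).
// Hypothetical helper illustrating the floor-vs-ceiling change.
func chunkLen(pLen int) int {
	// Old behavior (floored) could under-request for unpadded input:
	//   return pLen / 5 * 8
	// Ceiling the division guarantees enough input to fill the buffer:
	return (pLen + 4) / 5 * 8
}
```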
Change https://go.dev/cl/581415 mentions this issue:
Change https://go.dev/cl/581416 mentions this issue:
Go version

go 1.21.6

Output of `go env` in your module/workspace:

What did you do?
Here's a proof of concept: playground
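(The playground code isn't inlined above; a minimal sketch of the kind of reproduction described, with my own test string rather than the playground's exact code, would be:)

```go
package main

import (
	"encoding/base32"
	"fmt"
	"strings"
)

func main() {
	enc := base32.StdEncoding.WithPadding(base32.NoPadding)
	encoded := enc.EncodeToString([]byte("this is a test message"))

	// Destination buffer sized exactly to DecodedLen, as documented.
	buf := make([]byte, enc.DecodedLen(len(encoded)))

	dec := base32.NewDecoder(enc, strings.NewReader(encoded))
	n, err := dec.Read(buf)
	fmt.Printf("Read filled %d of %d bytes (err=%v)\n", n, len(buf), err)
}
```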
Here's a second example, which shows how the success of the tests changes based on whether we add extra length to the destination buffer: https://go.dev/play/p/A28R-myUHjT
What did you see happen?
First playground link output
Second playground link output
This specifically happens when using the decoder with a `NoPadding` encoding. It appears that package `base32` underestimates the amount of data it can write to the buffer when using `NoPadding`.

When using Decoder and trying to read a chunk of data into a buffer appropriately sized to what is described as "the maximum length in bytes of the decoded data", the decoder doesn't attempt to fill the entire buffer.
This seems to come from the following piece of code:
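(The referenced snippet isn't inlined above. In the Go 1.21 sources, the chunk-size computation in the decoder's Read looks approximately like this; I'm quoting from memory, so treat the exact expression as indicative rather than authoritative:)

```go
// Read a chunk.
nn := len(p) / 5 * 8 // flooring: derived from the padded EncodedLen formula
if nn < 8 {
	nn = 8
}
if nn > len(d.buf) {
	nn = len(d.buf)
}
```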
`nn` is the number of bytes the decoder will attempt to read later, upper bounded by the decoder's internal buffer (`d.buf`). It seems that the number of bytes that should be read from the encoded input is calculated based on a copy of the `EncodedLen` code, which fails to take into account the difference for `NoPadding`.
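To make that difference concrete, here is a quick check (my own snippet, using only the documented API) comparing the padded and unpadded `EncodedLen` results for a few input sizes; whenever the input length is not a multiple of 5, the unpadded encoding is shorter:

```go
package main

import (
	"encoding/base32"
	"fmt"
)

func main() {
	unpadded := base32.StdEncoding.WithPadding(base32.NoPadding)
	for _, n := range []int{5, 6, 11, 22} {
		fmt.Printf("n=%2d  padded EncodedLen=%2d  unpadded EncodedLen=%2d\n",
			n, base32.StdEncoding.EncodedLen(n), unpadded.EncodedLen(n))
	}
}
```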
What did you expect to see?
I expected that decoding data into a buffer via Decoder would not depend on the size of the destination buffer when it is already of the size provided by `DecodedLen`, unless of course that size is upper bounded by the decoder's internal buffer.

This has not created a production issue for me, but it seems to be based on a reasonable expectation (albeit in a specific use case, where `Decode` with a `DecodedLen`-sized buffer would probably be the better option).
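(As a concrete form of that workaround, this is my sketch rather than code from the report: read the whole input first, then call `Encoding.Decode` with a `DecodedLen`-sized destination.)

```go
package main

import (
	"encoding/base32"
	"fmt"
	"io"
	"strings"
)

func main() {
	enc := base32.StdEncoding.WithPadding(base32.NoPadding)
	src := strings.NewReader("NBSWY3DPEB3W64TMMQ") // "hello world", unpadded

	// Read the entire encoded input, then decode in one call.
	encoded, err := io.ReadAll(src)
	if err != nil {
		panic(err)
	}
	dst := make([]byte, enc.DecodedLen(len(encoded)))
	n, err := enc.Decode(dst, encoded)
	if err != nil {
		panic(err)
	}
	fmt.Printf("decoded %d bytes: %q\n", n, dst[:n])
}
```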