src: fix limit calculation
Coverity reported that using sizeof together with pointer
arithmetic was likely an error, since pointer arithmetic
already accounts for the size of what the pointer
points to.

Looking at the code, that seemed right, but removing the
extra sizeof caused tests to fail.

Looking more closely, it seems we were not allocating
a big enough buffer, but the extra sizeof allowed the
conversion to succeed even though it may have been
corrupting memory.

Signed-off-by: Michael Dawson <[email protected]>

PR-URL: #41026
Reviewed-By: Antoine du Hamel <[email protected]>
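
As context for the Coverity finding above, here is a minimal standalone
sketch (not part of the commit; the buffer size and names are illustrative)
showing why adding limit * sizeof(UChar) to a UChar* overshoots: pointer
arithmetic already advances in whole UChar elements, so the extra factor
doubles the offset.

#include <unicode/umachine.h>  // UChar, typically a 2-byte char16_t
#include <cstddef>
#include <cstdio>

int main() {
  size_t limit = 8;      // intended capacity, in UChars
  UChar buf[16] = {};    // oversized here only so the arithmetic below stays valid
  const UChar* end_ok = buf + limit;                   // the intended one-past-the-end
  const UChar* end_bad = buf + limit * sizeof(UChar);  // 8 * 2 = 16 UChars past buf
  std::printf("ok:  %td UChars past buf\n", end_ok - buf);   // prints 8
  std::printf("bad: %td UChars past buf\n", end_bad - buf);  // prints 16
  return 0;
}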
mhdawson authored Dec 10, 2021
1 parent ef7a686 commit 98ec909
Showing 1 changed file with 4 additions and 3 deletions.
7 changes: 4 additions & 3 deletions src/node_i18n.cc
@@ -447,8 +447,9 @@ void ConverterObject::Decode(const FunctionCallbackInfo<Value>& args) {

   // When flushing the final chunk, the limit is the maximum
   // of either the input buffer length or the number of pending
-  // characters times the min char size.
-  size_t limit = converter->min_char_size() *
+  // characters times the min char size, multiplied by 2 as unicode may
+  // take up to 2 UChars to encode a character
+  size_t limit = 2 * converter->min_char_size() *
       (!flush ?
           input.length() :
           std::max(
@@ -474,7 +475,7 @@ void ConverterObject::Decode(const FunctionCallbackInfo<Value>& args) {
   UChar* target = *result;
   ucnv_toUnicode(converter->conv(),
                  &target,
-                 target + (limit * sizeof(UChar)),
+                 target + limit,
                  &source,
                  source + source_length,
                  nullptr,
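
To make the target-limit convention concrete, here is a minimal standalone
ICU sketch (illustrative only, not taken from node_i18n.cc; the converter
name, input bytes, and buffer sizing are assumptions) where the buffer and
the limit pointer are both expressed in UChars, matching the corrected call
above:

#include <unicode/ucnv.h>
#include <unicode/utypes.h>
#include <cstdio>
#include <vector>

int main() {
  UErrorCode status = U_ZERO_ERROR;
  UConverter* conv = ucnv_open("utf-8", &status);  // illustrative encoding choice
  if (U_FAILURE(status)) return 1;

  const char* source = "\xF0\x9F\x98\x80";  // U+1F600, which decodes to 2 UChars (a surrogate pair)
  const char* source_limit = source + 4;

  // Size the target in UChars: up to 2 UChars per decoded character,
  // mirroring the "multiplied by 2" in the fixed limit calculation.
  std::vector<UChar> buf(2 * 4);
  UChar* target = buf.data();

  // The target limit is target + <number of UChars>, with no extra
  // sizeof(UChar) factor -- the same convention as the patched call.
  ucnv_toUnicode(conv, &target, buf.data() + buf.size(),
                 &source, source_limit,
                 nullptr, /*flush=*/true, &status);

  std::printf("wrote %d UChars, status=%s\n",
              static_cast<int>(target - buf.data()), u_errorName(status));
  ucnv_close(conv);
  return U_SUCCESS(status) ? 0 : 1;
}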
