Segmentation fault in destructor in case of large inputs #1835
Labels: kind: bug · release item: 🐛 bug fix · solution: proposed fix (a fix for the issue has been proposed and waits for confirmation)
Project bad_json_parsers tested how JSON parser libraries react to deeply nested inputs. It turns out that this library segfaults at a certain nesting depth. I analyzed the result, and the segmentation fault does not occur in the parsing/construction part, but rather during destruction. When destruction is avoided (e.g., by adjusting the program along the lines of the sketch below), the library can process much longer inputs (46875000 and greater).
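For reference, a minimal sketch of what "avoiding destruction" could look like, assuming a test program that reads one line of JSON from stdin; the leak-on-purpose pattern here is illustrative, not necessarily the exact program used by bad_json_parsers:

```cpp
#include <iostream>
#include <string>
#include <nlohmann/json.hpp>

int main() {
    std::string line;
    std::getline(std::cin, line);

    // Allocate the parsed value on the heap and never delete it, so the
    // recursive ~basic_json() destructor is never invoked.
    auto* j = new nlohmann::json(nlohmann::json::parse(line));
    std::cout << "parsed, size = " << j->size() << '\n';
    return 0;  // process teardown reclaims the memory without running the destructor
}
```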
Either use bad_json_parsers, or generate a deeply nested file (like input.json.zip), parse it, and let the destructor be called.
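If you want to generate such a file yourself, a rough sketch could look like the following; the depth value is an arbitrary assumption, not the one used in input.json.zip:

```cpp
#include <fstream>
#include <string>

int main() {
    // Arbitrary nesting depth for illustration; increase it until the crash appears.
    const std::size_t depth = 1000000;
    std::ofstream out("input.json");
    out << std::string(depth, '[') << std::string(depth, ']');
    return 0;
}
```

Parsing the resulting file and letting the parsed object go out of scope should then trigger the recursive destruction described above.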
The expected behavior is no segmentation fault, but rather an out-of-memory error during parsing of overly large inputs.
On my machine, I get a segmentation fault and a stack trace of about 94000 frames.
macOS version 10.14.6 (18G1007), Xcode version 11.0 (11A420a); this is a supported compiler.
The issue occurs with both the develop branch and released version 3.7.1. Yes, everything compiles and runs fine.