Big file parsing #1393
Comments
The DOM parser builds the complete JSON object in memory. If you just want to do a syntax check or only want to perform a few specific operations, you should use the SAX parser. What is your use case?
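For reference, a minimal sketch of a pure syntax check with nlohmann/json; `json::accept` runs the parser without building a DOM, so memory stays flat (the file name `big.json` is a placeholder):

```cpp
#include <fstream>
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    // "big.json" stands in for the large input file.
    std::ifstream input("big.json");

    // accept() only checks syntax; no DOM is constructed.
    const bool valid = json::accept(input);

    std::cout << (valid ? "valid JSON" : "invalid JSON") << '\n';
}
```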
I need to iterate over the JSON elements, so I need the entire DOM. It seems the parser increases the document size by more than 10x; in my case the original minified JSON is about 40 MB.
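For context, iterating the DOM after a full parse looks roughly like this (a sketch assuming a top-level JSON object; `items()` yields key/value pairs, and nested containers can be visited recursively the same way):

```cpp
#include <fstream>
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    std::ifstream input("big.json");  // placeholder file name
    const json doc = json::parse(input);

    // Walk the top-level object and print each key and value type.
    for (const auto& item : doc.items())
    {
        std::cout << item.key() << " -> " << item.value().type_name() << '\n';
    }
}
```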
A third option could be the callback function, with which you can work on the data and still decide whether you want to store it for later use.
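The callback variant refers to the `parser_callback_t` overload of `json::parse`; returning `false` from the callback discards the current value instead of storing it in the DOM. A sketch that drops every value under a hypothetical key `"payload"`:

```cpp
#include <fstream>
#include <iostream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    std::ifstream input("big.json");  // placeholder file name

    // The callback is invoked for every parse event (keys, values,
    // object/array boundaries). Returning false discards the element.
    json::parser_callback_t filter =
        [](int depth, json::parse_event_t event, json& parsed)
        {
            // "payload" is a hypothetical key whose values we skip.
            if (event == json::parse_event_t::key && parsed == json("payload"))
            {
                return false;
            }
            return true;
        };

    const json doc = json::parse(input, filter);
    std::cout << doc.size() << " top-level elements kept\n";
}
```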
I mean, a regular Node.js JSON.parse takes about 2 seconds for the same file.
I would be interested in the file to understand where we spend so much time.
Sure, here's a small repo to reproduce the flow.
Sorry, it was my fault; that was a test run with memory corruption checks enabled. The real execution time is:
I have a prettified JSON file over 110 MB, which is relatively big but not huge. I read it into a string in less than 1 second and then call json::parse, which takes 225 seconds and over 500 MB of memory. Am I doing something wrong? Why is it that slow, and why does it consume so much memory?
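For reference, a sketch of the flow described above with rough timing around each step (the file name is a placeholder; parsing directly from the stream is shown as an alternative to the intermediate string):

```cpp
#include <chrono>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main()
{
    const auto t0 = std::chrono::steady_clock::now();

    // Step 1: read the whole file into a string, as described above.
    std::ifstream file("big.json");  // placeholder file name
    std::stringstream buffer;
    buffer << file.rdbuf();
    const std::string text = buffer.str();

    const auto t1 = std::chrono::steady_clock::now();

    // Step 2: build the DOM from the string.
    const json doc = json::parse(text);

    const auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::milliseconds;
    std::cout << "read:  " << std::chrono::duration_cast<ms>(t1 - t0).count() << " ms\n"
              << "parse: " << std::chrono::duration_cast<ms>(t2 - t1).count() << " ms\n";

    // Alternative: parse straight from the stream and skip the string copy.
    // std::ifstream file2("big.json");
    // const json doc2 = json::parse(file2);
}
```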