I don't know if this is the right place to put this, and I hope I don't come off as negative.
I spent a lot of time trying to apply yajl to stream-based JSON parsing with node bindings, as did @vibornoff in node-yajl.
What I found was that the built-in JSON parsing in V8 is crazy fast at constructing JSON objects, and I was not able to touch it in terms of performance or memory usage. The one place where yajl was awesome (2x faster) was stream validation of JSON, without ever mapping the JSON data into the node VM. Another benefit yajl had was that it could parse directly out of UTF-8 buffers without first converting the data to a different encoding.
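To make the "validation without materializing objects" point concrete, here's a hypothetical sketch (not yajl's actual API): it walks a UTF-8 buffer and checks structural balance by tracking brace/bracket depth while skipping string contents. It deliberately ignores literals and numbers, so it is not a full JSON validator, just the shape of the idea: no JavaScript objects are ever built.

```javascript
// Sketch: structural check over a raw UTF-8 buffer, no object construction.
// All structural characters are ASCII, so indexing raw bytes is UTF-8 safe.
function structurallyBalanced(buf) {
  let depth = 0;
  let inString = false;
  let escaped = false;
  for (let i = 0; i < buf.length; i++) {
    const c = buf[i];
    if (inString) {
      if (escaped) { escaped = false; }
      else if (c === 0x5c) { escaped = true; }      // backslash
      else if (c === 0x22) { inString = false; }    // closing quote
    } else if (c === 0x22) { inString = true; }     // opening quote
    else if (c === 0x7b || c === 0x5b) { depth++; } // { or [
    else if (c === 0x7d || c === 0x5d) {            // } or ]
      depth--;
      if (depth < 0) return false;                  // close with no matching open
    }
  }
  return depth === 0 && !inString;
}

structurallyBalanced(Buffer.from('{"a":[1,2,{"b":"x}"}]}')); // true
structurallyBalanced(Buffer.from('{"a":1'));                 // false
```

Because nothing crosses into the VM as an object, a scan like this can run at memory-bandwidth-ish speeds, which matches the 2x observation above.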
So my question here is basically: what are the performance goals of the parser? When will you advise people that stream parsing is better than parsing an entire document?
I ultimately got a bit discouraged when I found out just how efficient the built-in V8 JSON parser is, primarily because it employs tricks for constructing JavaScript objects that are not available to extensions.
Don't get me wrong, I think there are opportunities here, especially when you start talking about filtering (JSONSelect or JSONPath) on the server on behalf of the client. I just wanted to share my experience and figure out where you think the biggest wins will be.
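As a toy illustration of that server-side-filtering idea (the path syntax and `select` helper below are invented for this sketch, not JSONPath or JSONSelect proper): apply a selector to the parsed document on the server and ship only the selected slice to the client.

```javascript
// Hypothetical helper: tiny dotted-path selector in the spirit of
// JSONPath/JSONSelect. Dot-separated keys; numeric segments index arrays.
function select(value, path) {
  let current = value;
  for (const segment of path.split('.')) {
    if (current == null) return undefined;
    current = Array.isArray(current) ? current[Number(segment)] : current[segment];
  }
  return current;
}

// e.g. filtering a parsed document before responding:
const doc = { users: [{ name: 'ada' }, { name: 'grace' }] };
select(doc, 'users.1.name'); // 'grace'
select(doc, 'users.5.name'); // undefined (missing index)
```

The interesting win would be pushing a selector like this down into the stream parser itself, so unselected subtrees are never materialized at all.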
I hear you. I'm not focused on raw performance yet. I'm trying to make something that solves some use cases. The most obvious one is not buffering very large JSON documents before parsing them. I believe this alone can make things faster, as I/O operations are usually the bottleneck. Once we have finer-grained parsing events, it will also be possible to process data before the whole object has been received. In some cases, like streaming JSON, the full object may never arrive at all.
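To illustrate the "process data before the whole document arrives" idea, here's a sketch using line-delimited JSON (names invented, and not this project's actual API): each value is emitted as soon as its line completes, instead of buffering the entire stream. A real stream parser's events would be finer-grained (per key/value), but the buffering pattern is the same.

```javascript
// Illustrative sketch: incremental handling of newline-delimited JSON.
// Incomplete trailing data is kept until the next chunk completes it.
function makeLineParser(onValue) {
  let remainder = '';
  return function write(chunk) {
    remainder += chunk;
    const lines = remainder.split('\n');
    remainder = lines.pop(); // last piece may be incomplete; keep it
    for (const line of lines) {
      if (line.trim() !== '') onValue(JSON.parse(line));
    }
  };
}

// Values arrive across arbitrary chunk boundaries:
const seen = [];
const write = makeLineParser((v) => seen.push(v));
write('{"a":1}\n{"b"');
write(':2}\n');
// seen is now [ { a: 1 }, { b: 2 } ]
```

Note that the first value is delivered during the first `write`, before the second value has even finished arriving, which is exactly the property an infinite stream needs.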
Another reason I haven't run performance tests is that I'm really not an expert at this. If you think you can write a relevant benchmark against node-json-streams, I'd be happy to include it in the project.
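For anyone wanting to start on that, a minimal benchmark shape might look like the following (a sketch, not a rigorous harness; the streaming side would plug in once its API settles, with `JSON.parse` as the baseline to compare against):

```javascript
// Minimal timing harness: run a parse function many times over the same
// document and report ops/sec. Warm-up, GC pressure, and variance are all
// ignored here; a real benchmark should account for them.
function bench(name, fn, iterations = 1000) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) fn();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${name}: ${(iterations / (elapsedMs / 1000)).toFixed(0)} ops/sec`);
  return elapsedMs;
}

const doc = JSON.stringify({ list: Array.from({ length: 100 }, (_, i) => ({ i })) });
bench('JSON.parse (baseline)', () => JSON.parse(doc));
```

It would also be worth benchmarking peak memory (e.g. `process.memoryUsage().heapUsed`) and not just throughput, since the buffering difference is where streaming should shine.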
So, to summarize: I'd like to focus on features and usability before raw performance. I'm not denying its importance, but it's not a priority.