This repository has been archived by the owner on Jul 27, 2022. It is now read-only.

Large File Silent Failure #298

Closed
natduca opened this issue Sep 22, 2014 · 17 comments

@natduca (Contributor) commented Sep 22, 2014

From [email protected] on July 10, 2013 14:04:35

What steps will reproduce the problem?
1. Have a large (>=100MB) JSON trace file.
2. Using either the svn source or a current Google Chrome build, load the tracing data.
3. The data silently fails to load or render, and no download occurs.

What is the expected output? What do you see instead?
Expected: tracing output, rendering. Instead: the file is ignored and not loaded.

What version of the product are you using? On what operating system?
Mac OS X 10.8.4, Google Chrome Version 28.0.1500.71.

Please provide any additional information below.
I have a tool that filters the file down to about ~30MB. That file takes a very long time to render, often prompting the browser to ask whether to kill the page, but it will eventually render, often using roughly 10X the file size in RAM.

Original issue: http://code.google.com/p/trace-viewer/issues/detail?id=292

@natduca (Contributor, issue author) commented Sep 22, 2014

From [email protected] on July 11, 2013 18:39:17

If you zip the file, will the issue tracker let you attach it? Otherwise, can you send me the file ([email protected]) and I can take a look?

I've loaded large (150MB) files in the past and they worked, although they were very slow. This is something I've been working on, so any examples that cause extreme slowness would be appreciated.

@natduca (Contributor, issue author) commented Sep 22, 2014

From [email protected] on July 24, 2013 17:49:13

Labels: -Tool-All

@dj2 dj2 added Bug and removed imported labels May 30, 2015
@egonelbre (Contributor) commented:

Josh Bleecher Snyder reported an issue regarding a large trace output from Go.

Essentially, https://dl.dropboxusercontent.com/u/4300994/Go/trace-1.5M.zip has ~1.5M events (~130MB) and crashes the Chrome tab with an "Aw Snap". I tried making it smaller, and the limit currently seems to be around 1M events. However, there is no significant slowness when showing 1M events (https://dl.dropboxusercontent.com/u/4300994/Go/trace-1M.zip), so I'm guessing some memory limit is being triggered.

@egonelbre (Contributor) commented:

After a bit more debugging and inspection, I found that the 1.5M-events file works in Chrome Canary (45.0.2448.0).

@natduca (Contributor, issue author) commented Jul 4, 2015

We tweaked Canary recently to be better with memory, so that may be what it is.

Maybe we should warn folks when they exceed 1.5M events? The viewer itself really isn't meant to cope with much more, but I don't see much harm in warning folks up front...

@egonelbre (Contributor) commented:

I also wrote a quick alternate trace-viewer to see how much memory was minimally required, and during some tests I used up 2GB of memory while loading, so the failure doesn't actually seem to be caused by memory alone. I'm not sure how to see what was causing Chrome to fail with the Aw Snap message. I'm now starting to think the bug may still be there, but that the memory adjustments made it less likely or hid it.

The highest I was able to load was 1.07M events; based on that I would put the limit at 1M. But I'm concerned about how that number translates to other trace files, i.e. will a trace recorded in Chrome have the same limit, or break earlier or later? I guess if the message is non-intrusive and shown below the importing dialog, then there isn't a big drawback to it.

@egonelbre (Contributor) commented:

It seems it isn't as easy as putting a message below the importer delay. It needs a task that runs just after creating the importers (at that point we know the number of events) and before importing (that's where the crash happens).

I couldn't find a clean way to handle a message asynchronously inside importing. The easiest solution was to use:

lastTask = lastTask.after(function(task) {
  // Sum the per-importer event estimates.
  var total = 0;
  for (var i = 0; i < importers.length; i++) {
    var importer = importers[i];
    total += importer.eventCountEstimate || 0;
  }

  if (total > 1e6) {
    window.alert('Whoa, you seem to be loading lots of data.\n' +
                 'This may cause problems with trace-viewer and the browser.');
  }
}, this);

Just before the "Run the import." task. Ideally the window.alert would be timed, closing after 10s. What are your feelings about window.alert (I generally don't like it, but in this case it's a really easy solution for a corner case), or is there something I missed? I can imagine building such a dialog from scratch, but I'm not quite sure what the best approach would be.
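For reference, the threshold check itself can be factored into a small pure helper so it is testable outside the task pipeline. This is a sketch: totalEventEstimate and shouldWarn are hypothetical names, and the 1e6 cutoff follows the snippet above.

```javascript
// Sum per-importer event-count estimates; importers without an
// eventCountEstimate field contribute 0 (matching the snippet above).
function totalEventEstimate(importers) {
  var total = 0;
  for (var i = 0; i < importers.length; i++)
    total += importers[i].eventCountEstimate || 0;
  return total;
}

// Decide whether to warn, defaulting to the 1e6-event cutoff.
function shouldWarn(importers, limit) {
  return totalEventEstimate(importers) > (limit || 1e6);
}

// Example with fake importer objects:
var fake = [{eventCountEstimate: 700000}, {eventCountEstimate: 500000}, {}];
console.log(totalEventEstimate(fake)); // 1200000
console.log(shouldWarn(fake));         // true
```

Keeping the estimate separate from the alert would also make it easy to swap window.alert for a less intrusive in-page banner later.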

@natduca (Contributor, issue author) commented Jul 7, 2015

Hmm, you're right. I'm genuinely not sure how to handle this gracefully. If you just do a JSON.parse (manually, in the trace2html bootstrap code), how many events can you handle? I'm half wondering if this should be a warning at trace2html file creation time... The reason for my JSON question is to see if we can even parse the big dump to the point that we can show a warning.

@egonelbre (Contributor) commented:

We don't actually even need JSON.parse to handle this; we can write a simple scanner that just counts the number of objects in the dataset. That count should correlate very well with the total number of events. Even simpler would be to estimate based on the data size, or to combine both: if the data is below 50MB, assume it's fine; if it's above 50MB, count the number of objects, and if the object count divided by 3 (or whatever the objects-per-event ratio is) exceeds 1M, show a warning.

I.e. as long as we can load the data into memory, we can show a warning.
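The combined heuristic above can be sketched as follows. Assumptions are labeled: the 50MB threshold and the objects-per-event ratio of 3 come from the comment, the constant and function names are mine, and this counter is deliberately naive (it does not skip string literals; a later comment in this thread shows a more careful scanner).

```javascript
var SIZE_LIMIT = 50 * 1024 * 1024; // assume anything under 50MB is fine
var EVENT_LIMIT = 1e6;             // viewer struggles past ~1M events
var OBJECTS_PER_EVENT = 3;         // guessed ratio from the comment above

// Naive count of JSON containers; a real version should skip string
// literals so braces inside strings aren't counted.
function countObjects(json) {
  var count = 0;
  for (var i = 0; i < json.length; i++) {
    if (json[i] == '{' || json[i] == '[') count++;
  }
  return count;
}

// Cheap check first (size), expensive check (full scan) only when needed.
function probablyTooLarge(json) {
  if (json.length < SIZE_LIMIT) return false;
  return countObjects(json) / OBJECTS_PER_EVENT > EVENT_LIMIT;
}

console.log(countObjects('[{},{}]')); // 3
console.log(probablyTooLarge('{}'));  // false
```

The size pre-check keeps the common case (small traces) free of any extra scanning cost.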

@natduca (Contributor, issue author) commented Jul 8, 2015

How do you count the number of objects, though? We start with a string that is a gzipped, stringified JSON object...


@egonelbre (Contributor) commented:

Hmm, so the importer takes gzipped input? I thought it got regular JSON and the XHR unpacked it.

As long as the JSON is well-formed, then this (ignoring any bugs) should work:

var data = '{"x": { "b{}": ["c", {}, {}]}}';
var count = 0;
for (var i = 0; i < data.length; i++) {
  if (data[i] == '"') {
    // Skip the string literal so braces inside strings aren't counted.
    // Note: the skip must start one past the opening quote, otherwise
    // the inner loop breaks immediately on the quote that got us here.
    for (i++; i < data.length; i++) {
      if (data[i] == '"') {
        break;
      } else if (data[i] == '\\') {
        i++;
        if (data[i] == 'u') {
          i += 4; // \u0000
        }
      }
    }
  } else if (data[i] == '{' || data[i] == '[') {
    count++;
  }
}
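Wrapped as a function (the name countContainers is mine), the scanner can be sanity-checked on inputs where the expected container count is known:

```javascript
// Count '{' and '[' occurrences outside of string literals,
// honoring backslash escapes and \uXXXX sequences.
function countContainers(data) {
  var count = 0;
  for (var i = 0; i < data.length; i++) {
    if (data[i] == '"') {
      for (i++; i < data.length; i++) {  // skip the string literal
        if (data[i] == '"') break;
        if (data[i] == '\\') {
          i++;
          if (data[i] == 'u') i += 4;    // \u0000-style escapes
        }
      }
    } else if (data[i] == '{' || data[i] == '[') {
      count++;
    }
  }
  return count;
}

console.log(countContainers('{"a{": [1, {}, []]}'));   // 4
console.log(countContainers('{"s": "[not a list]"}')); // 1
```

The second example is the case the string-skipping exists for: brackets inside a string value must not inflate the count.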

@dj2 (Contributor) commented Jul 8, 2015

We accept both regular JSON and gzipped JSON. Chrome now always sends gzipped JSON to chrome://tracing.

Why do we need to precheck the size? We could just do the import and, if the number of events exceeds 1.5 million, show a warning dialog saying things may be sluggish.

@egonelbre (Contributor) commented:

@dj2 The problem is that Chrome may "crash" (the "Aw Snap" screen) or show a "this page is not responding" dialog during importing, so showing the warning after importing is too late. I think it should always be able to handle JSON.parse, but we can also estimate the number of events without properly parsing; this means we could show the warning immediately after loading (and unzipping) the JSON data.

@catapult-bot commented:

Migrated to catapult-project/catapult#298

@zaccharieramzi commented:

Hi all,

I currently have a large trace file which is silently breaking the Chrome trace viewer. I see that the issue has been migrated, but to a repository which is now archived.

Do you know if this issue has a fix?

@paulirish (Member) commented:

The Chrome DevTools Performance tab can also render traces.

And https://ui.perfetto.dev/#!/viewer is the successor to trace-viewer and may have more luck rendering a very large (100MB+) trace.

@zaccharieramzi commented:

Thank you so much for answering: I didn't know about Perfetto; it looks like a great tool.

Just so you know, I first ran into an error (screenshot not preserved), and then into:

/tmp/trace_processor_shell-linux-a3ce2cbf4cbe4f86cc10b02957db727cecfafae8: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by /tmp/trace_processor_shell-linux-a3ce2cbf4cbe4f86cc10b02957db727cecfafae8)
/tmp/trace_processor_shell-linux-a3ce2cbf4cbe4f86cc10b02957db727cecfafae8: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.28' not found (required by /tmp/trace_processor_shell-linux-a3ce2cbf4cbe4f86cc10b02957db727cecfafae8)

I posted a related issue on perfetto's GitHub.
