This repository has been archived by the owner on Nov 20, 2018. It is now read-only.

session feature with huge number of files incredibly slow on IE10 and IE11 #1466

Closed
Korijn opened this issue Sep 10, 2015 · 35 comments

@Korijn
Contributor

Korijn commented Sep 10, 2015

It frequently occurs in our case that FineUploader has to load ~1000 initial files through the session feature upon start. Adding these to the uploader and UI takes about 20-30 seconds on IE10 and IE11, and the page doesn't respond at all during this time. Anything we can do about this?

As usual, I'm open to writing a PR to fix this if you can give me a hint on how to approach this.

@rnicholus rnicholus changed the title session feature incredibly slow on IE10 and IE11 session feature with huge number of files incredibly slow on IE10 and IE11 Sep 10, 2015
@rnicholus
Member

The first step to fixing this would be identifying where the perf problem lies. Your case is very much on the outer edge of normal use, so it may be a little while before I am able to look at this further. There are a number of other questions/requests in the queue at the moment that I must attend to first.

@Korijn
Contributor Author

Korijn commented Sep 10, 2015

That's OK, I might have a look at it myself before then; otherwise, I'll hear about it later. Thanks for the (incredibly) quick response!

@Korijn
Contributor Author

Korijn commented Sep 11, 2015

The files are added using a callback, options.addFileRecord, in fine-uploader/client/js/session.js.

That callback is filled in by _addCannedFile in fine-uploader/client/js/uploader.basic.api.js. This is the source of that method:

        _addCannedFile: function(sessionData) {
            var id = this._uploadData.addFile({
                uuid: sessionData.uuid,
                name: sessionData.name,
                size: sessionData.size,
                status: qq.status.UPLOAD_SUCCESSFUL
            });

            sessionData.deleteFileEndpoint && this.setDeleteFileEndpoint(sessionData.deleteFileEndpoint, id);
            sessionData.deleteFileParams && this.setDeleteFileParams(sessionData.deleteFileParams, id);

            if (sessionData.thumbnailUrl) {
                this._thumbnailUrls[id] = sessionData.thumbnailUrl;
            }

            this._netUploaded++;
            this._netUploadedOrQueued++;

            return id;
        },

So three methods are called here: the first is _uploadData.addFile, and the other two just set the delete endpoint and params.

        addFile: function(spec) {
            var status = spec.status || qq.status.SUBMITTING,
                id = data.push({
                    name: spec.name,
                    originalName: spec.name,
                    uuid: spec.uuid,
                    size: spec.size == null ? -1 : spec.size,
                    status: status
                }) - 1;

            if (spec.batchId) {
                data[id].batchId = spec.batchId;

                if (byBatchId[spec.batchId] === undefined) {
                    byBatchId[spec.batchId] = [];
                }
                byBatchId[spec.batchId].push(id);
            }

            if (spec.proxyGroupId) {
                data[id].proxyGroupId = spec.proxyGroupId;

                if (byProxyGroupId[spec.proxyGroupId] === undefined) {
                    byProxyGroupId[spec.proxyGroupId] = [];
                }
                byProxyGroupId[spec.proxyGroupId].push(id);
            }

            data[id].id = id;
            byUuid[spec.uuid] = id;

            if (byStatus[status] === undefined) {
                byStatus[status] = [];
            }
            byStatus[status].push(id);

            uploaderProxy.onStatusChange(id, null, status);

            return id;
        },

The code doesn't look inefficient (constant running time per file), except for perhaps the call to uploaderProxy.onStatusChange:

                onStatusChange: function(id, oldStatus, newStatus) {
                    self._onUploadStatusChange(id, oldStatus, newStatus);
                    self._options.callbacks.onStatusChange(id, oldStatus, newStatus);
                    self._maybeAllComplete(id, newStatus);

                    if (self._totalProgress) {
                        setTimeout(function() {
                            self._totalProgress.onStatusChange(id, oldStatus, newStatus);
                        }, 0);
                    }
                }

My guess is that the setTimeout is the culprit; it should only be scheduled once.

Since we process all n files in a loop, the setTimeout callback gets scheduled n times, but those callbacks only run after all the files have been added. So I think a single call should suffice, and it would give the UI thread more time to update.
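For illustration, the coalescing idea can be sketched generically. makeCoalescedNotifier below is a hypothetical helper, not FineUploader code: it queues status changes and schedules at most one deferred flush, no matter how many files are added synchronously in a loop.

```javascript
// Hypothetical sketch: queue status changes, schedule at most one timer.
function makeCoalescedNotifier(onFlush) {
    var pending = [];
    var timerId = null;

    return function notify(id, oldStatus, newStatus) {
        pending.push({ id: id, oldStatus: oldStatus, newStatus: newStatus });

        // Only the first call in a synchronous burst schedules the flush.
        if (timerId === null) {
            timerId = setTimeout(function() {
                var batch = pending;
                pending = [];
                timerId = null;
                onFlush(batch);   // e.g. recompute total progress once
            }, 0);
        }
    };
}
```

Any scheme like this also has to carry the queued per-file arguments into the deferred flush, since they would otherwise be lost.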

@Korijn
Contributor Author

Korijn commented Sep 11, 2015

I tried solving that with this little adjustment:

                    if (self._totalProgress && self._totalProgressTimeoutId === null) {
                        self._totalProgressTimeoutId = setTimeout(function() {
                            self._totalProgress.onStatusChange(id, oldStatus, newStatus);
                            self._totalProgressTimeoutId = null;
                        }, 0);
                    }

But that doesn't help at all.

So, I decided to run the profiler in IE10, and it turns out 99% of the time is spent inserting all the items into the DOM (specifically appendChild(), insertBefore(), and setting textContent). The trick to speeding this up big time is to apply all of the HTML updates in one go, so that the browser only has to perform one reflow & repaint.
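As a generic sketch of that trick (illustrative markup and a hypothetical escapeHtml helper, not FineUploader's actual template code): build one HTML string for the whole batch, then write it with a single innerHTML assignment.

```javascript
// Hypothetical helper: minimal HTML escaping for the file names.
function escapeHtml(text) {
    return String(text).replace(/[&<>"]/g, function(ch) {
        return { '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;' }[ch];
    });
}

// Build the markup for the entire batch as one string; the <li> markup
// here is illustrative only.
function buildFileListHtml(sessionFiles) {
    var parts = [];
    for (var i = 0; i < sessionFiles.length; i++) {
        parts.push('<li class="qq-file-item">' + escapeHtml(sessionFiles[i].name) + '</li>');
    }
    return parts.join('');
}

// In the browser, one write renders the list in a single reflow:
// fileListEl.innerHTML = buildFileListHtml(sessionFiles);
```

Note that innerHTML replaces the element's existing children, so this shape fits the initial session render rather than incremental adds.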

I must admit, I don't know how to tackle that one. Any advice?

@rnicholus
Member

Good detective work! Yes, this perhaps can be improved by batching all UI updates to ensure they happen all at once instead of staggered. I suspect this will be a bit tricky though, and this likely does not just affect stored files. Instead, you will probably see the same issue if you select/drop thousands of files in IE10/11.

@Korijn
Contributor Author

Korijn commented Sep 14, 2015

Just tested that; you are correct. It does look like it is going to be tough. :/

@morgunder

Can you remove the main block from the DOM while you're doing the manipulation, and then put it back at the end? Sometimes that's the easiest way to speed up DOM-heavy work.


@Korijn
Contributor Author

Korijn commented Sep 14, 2015

I'm not sure about the design of the templating module FineUploader uses, so I can't predict whether it can handle a detached DOM node (for example, it might query the DOM in an intermediate step). @rnicholus can you shed some light on this (valuable) suggestion?

@rnicholus
Member

What, specifically, are you looking to temporarily remove from the DOM? Fine Uploader looks for elements given a reference element. That reference element is usually the file list container element - .qq-upload-list-selector. It does not use document as a reference element after the template has been initially rendered. So, that seems like it would work without issue.

@morgunder

I had a function that recalculated things and reapplied classes on a couple of hundred elements, and it was slow. Then I found a few articles saying I should detach the DOM element, manipulate it, then reattach it once I was done. This sped up that function considerably; I forget the exact amount, but at least 2x and maybe 5x.

Some other articles on that basic concept:

http://desalasworks.com/article/javascript-performance-techniques/
https://learn.jquery.com/performance/detach-elements-before-work-with-them/
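The pattern those articles describe, as a rough sketch (withDetached is a made-up helper name): remember the element's position, take it out of the document, do the expensive work, then put it back.

```javascript
// Detach/mutate/reattach: while the element is out of the document,
// DOM writes to it cannot trigger page reflows.
function withDetached(el, work) {
    var parent = el.parentNode;
    var next = el.nextSibling;   // remember position so we can restore it

    parent.removeChild(el);
    try {
        work(el);                // the expensive DOM manipulation happens here
    } finally {
        parent.insertBefore(el, next);  // reattach, even if work() throws
    }
}
```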


@rnicholus
Member

Typical of the jQuery docs, the exact reason why you should perform many operations on a set of orphaned elements is missing. But yes, I believe this will likely solve the issue, as it should drastically limit the number of reflows/repaints.


@rnicholus
Member

Caution: we probably do not want to remove the entire file list from the DOM whenever new files are added. That would be a jarring experience for users. Instead, perhaps the new set of file elements can be built up in a detached state, and then added to the DOM once they are all in place. I don't think this will be trivial, though.

@Korijn
Contributor Author

Korijn commented Sep 14, 2015

I didn't think so either. A hacky way would be to add the files in small batches using a recursive setTimeout strategy, but that doesn't really solve the underlying problem.
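For the record, that batching strategy might look something like this (addOne stands in for whatever adds a single file to the uploader):

```javascript
// Hacky workaround sketch: process the array in small chunks, yielding to
// the UI thread between chunks so the page can repaint and stay responsive.
function addInChunks(items, addOne, chunkSize, done) {
    var index = 0;

    function nextChunk() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            addOne(items[index]);
        }
        if (index < items.length) {
            setTimeout(nextChunk, 0);  // yield before the next batch
        } else if (done) {
            done();
        }
    }

    nextChunk();
}
```

This keeps the page responsive but does the same total amount of DOM work, which is why it doesn't address the root cause.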

@rmckeel

rmckeel commented Sep 14, 2015

Hello, incidentally I am investigating this same issue. The client may have 3-4K files uploaded, but that number could be even larger. When the session endpoint is pulled, it takes 30 seconds even on a late-2013 MacBook Pro running Chrome's V8 engine.

My profiling attempts in Chrome show that the hangup is primarily in the getByClass -> querySelectorAll call made by handleFileItems. Perhaps that will help someone as they dive into the root cause. (I have a slightly edited custom version of FineUploader, so line numbers will not match up perfectly.)

[screenshot: Chrome CPU profile]

Hope that helps ~

@Korijn
Contributor Author

Korijn commented Sep 14, 2015

Maybe we can cache the selector results in some places? We could also switch to IDs for unique elements.

@rnicholus
Member

@rmckeel Do you know where the bulk of these calls to getByClass are coming from? I'm interested in the stack for that particular call.

@Korijn IDs are, in most cases, to be avoided. They are an anti-pattern. There are a lot of ways to shoot yourself in the foot with IDs. Determining what is "unique" in a document that you do not have full control over is tough. I would also suggest not caching any of these results, as we could potentially run into memory issues. Optimizing these selectors is probably the best course.

@rmckeel

rmckeel commented Sep 14, 2015

@rnicholus Ray, where is a good place to send a file to you privately? I can share with you my .cpuprofile (and custom fineuploader JS file) that you could load into Chrome. I can't upload anything but images here.

@rnicholus
Member

@rmckeel Thanks for your response. Even with your custom JS/profile data, I'll still want to reproduce locally so I can be sure this is properly fixed. I've updated this case and I'll begin looking into this perf issue ahead of everything else once I have a bigger break in my time this week.

@rnicholus rnicholus added this to the 5.3.2 milestone Sep 14, 2015
@rmckeel

rmckeel commented Sep 14, 2015

Thank you, Ray. Excellent support.

@rnicholus
Member

I spent an hour yesterday investigating further. I was able to reproduce the issue. I'm using a MacBook Pro (late-2014 model) with 16 GB RAM, an i7, etc. I set up my endpoint to return an initial file list of 3,000 entries, consisting solely of name and uuid properties. Initial load of the list on Chrome 45 took about 10-15 seconds. During this time, I was able to scroll, but the browser did not respond to any other actions.

Using Chrome's profiler, I identified a huge number of calls to qq.getByClass. These calls seemed to account for the bulk of the work. I ultimately identified 4 issues here:

Too many DOM elements

Each file entry in the DOM is made up of at least 20 elements. If the initial file list consists of 3,000 entries, this means 60,000 extra DOM elements! That is a huge number of elements, and this will likely have a noticeable perf impact on most selectors. Putting performance aside, you can't expect your users will want to scroll through 3000 files. IMHO, your initial files endpoint should never return anything close to 3000 files. Instead, consider providing a small subset of these files. If you really expect your users to page through all of the files, you should return them in pages. I realize this is not very easy with the current API, but this could be much easier if I added an API method that allowed you to add more initial files to the list. You could then pull the first set on load via Fine Uploader's built-in code, and then request subsequent pages yourself on-demand, passing these into the new API method for Fine Uploader to render.

Too many element selector calls

For each file added to the list, there are perhaps a dozen calls to the document to select various parts of the file or file list. For a "normal" amount of files in a batch, this is likely not an issue. But at scale, this has noticeable perf consequences. The best solution to this is probably to determine when we are handling a batch of files, and maintain a cache of file IDs to file HTML elements. The internal templating module selectors can grab from this cache instead of hitting the DOM. This cache will be cleared after we are done processing the batch.
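A minimal sketch of such a cache (the API shape is illustrative, not the actual templating module): lookups fall back to a real DOM query only on a cache miss, and the whole cache is dropped once the batch completes.

```javascript
// Hypothetical batch-scoped cache mapping file IDs to their root elements.
function makeFileElementCache() {
    var cache = {};

    return {
        put: function(fileId, el) { cache[fileId] = el; },
        get: function(fileId, findInDom) {
            // Hit the DOM only when the element isn't cached yet.
            if (!(fileId in cache)) {
                cache[fileId] = findInDom(fileId);
            }
            return cache[fileId];
        },
        clear: function() { cache = {}; }  // call after the batch is processed
    };
}
```

Clearing after the batch is what keeps this from growing into the memory problem Ray warns about above.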

Inefficient element selectors

There were quite a few calls like this: fileListEl.querySelectorAll('.someClass')[0]. All we want is the first match, but instead we are querying for all matches and throwing all but the first away. A simple fix would be to use fileListEl.querySelector('.someClass') instead.

Too many page reflows/repaints

In a batch of 3000 files, each file is added to the DOM separately. This results in a huge number of reflows/repaints, which is a perf issue. Instead, perhaps the code can be adjusted to build up this list of file elements detached from the document, and then add the entire batch at once when we are done processing the batch of files.
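One standard way to build the batch detached is a DocumentFragment: nodes appended to it stay outside the live document, so only the final insertion triggers a reflow. A sketch with illustrative markup (not FineUploader's real template):

```javascript
// Build all file items while detached, then attach them in one operation.
function appendFilesInOneReflow(fileListEl, sessionFiles) {
    var fragment = document.createDocumentFragment();

    sessionFiles.forEach(function(file) {
        var item = document.createElement('li');
        item.className = 'qq-file-item';
        item.textContent = file.name;   // textContent avoids HTML-escaping issues
        fragment.appendChild(item);     // no reflow: the fragment is detached
    });

    // One insertion into the live DOM, one reflow, regardless of batch size.
    fileListEl.appendChild(fragment);
}
```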

I am working to make the changes described above now. This will be a bit of work, but progress will be reported here as it happens. After all of the changes are complete, I'll make a new pre-release available for testing.

@morgunder

Excellent news. While I agree that this is the extreme case, I think the end result will be a much more polished feel even for the single-file case. It's rare that you have a chance to streamline the DOM for your layout. You might end up dropping a few elements and possibly finding how to do some of the JS logic in pure CSS for even more wins. As I was looking at some of your samples, I wondered about making a version with Font Awesome icons vs. the images you are currently using. Switching to fa fa-trash may make it possible for all your buttons to be one element instead of two, plus save on the img download entirely. I didn't try to prove it would be much faster, but maybe someone can shed some light on this or other performance tweaks.


@rnicholus
Member

you might end up dropping a few elements and possible finding how to do some of the js logic in pure css for even more wins

I have no plans to do this at the moment. These types of changes could be considered breaking changes, and I intend to make this a hotfix release.

i wondered if making a version with font awesome icons vs the images you are currently using

No plans to tie Fine Uploader to any specific 3rd-party dependency. If you wish to use different icons, you can certainly do this via CSS overrides.

@rnicholus
Member

I just pushed some changes up to the hotfix/5.3.2 branch. Without these changes, a 5,000 item initial file list pretty much crashes Chrome/Firefox - spinners + constant long-running-script warnings, etc for quite a while. With my changes, the file list loads in a few seconds. That's a pretty substantial improvement. Profiling suggests that there are no obvious JS-specific or DOM-specific bottlenecks that can be further improved.

The last piece of this solution is to simply not load 3,000+ files for your users to sort through. And if you must do this, do so in pages. The work remaining in this case:

  • Provide API endpoint to pass in additional "canned"/initial files
  • Update docs
  • Consider using perf enhancements for dropped/selected file batches too.

@Korijn
Contributor Author

Korijn commented Sep 16, 2015

In my case, I also use the list of files to filter duplicates/prevent files from being uploaded twice, using md5 hashes. So I do need the whole list. But I could just not add them to the UI, as you say. Excellent changes, by the way. Looking forward to the release!

@rnicholus
Member

Is there any interest in an API method that allows you to pass in additional "initial"/canned files after Fine Uploader has loaded? This will add a bit more complexity, but has already been requested in #1191.

@Korijn
Contributor Author

Korijn commented Sep 16, 2015

Besides the situation you described earlier, such a method could also be used to push files through WebSockets, so that multiple users (or perhaps not even FineUploader but some other process) can upload files into the same container and see each other's uploads as they come in.

@Korijn
Contributor Author

Korijn commented Sep 16, 2015

To answer your question, I would probably not use it in the near future, if your current work already fixes the current issue at hand. Perhaps the requirement will come up later though.

@rmckeel

rmckeel commented Sep 17, 2015

@rnicholus Ray, my vote (on behalf of several client sites that use FineUploader) is no interest in passing in lists of files after load. There is no change that would warrant it within a single session. I suppose group uploading would be a good reason to add that feature, I just wouldn't expect group uploading to be that common! :) Thanks ~

@rnicholus
Member

Hi everyone. Just a reminder that this new proposed API method isn't meant to pass ALL of your initial files. Its purpose is to allow you to "page" them. For example, your initial set on component load returns 100 files, and you include a "more results" button. When this is pressed, you request 100 more and then pass the results into this new API method, appending them to the existing list.
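Sketching that flow (addInitialFiles is the proposed method from #1191 and does not exist yet; fetchPage is a hypothetical helper that asks your server for one page of session records):

```javascript
// Hypothetical paging glue: each "more results" click fetches the next page
// and appends it to the uploader via the proposed addInitialFiles method.
function loadMoreHandler(uploader, fetchPage, pageSize) {
    var nextOffset = pageSize;  // page 1 came from the built-in session request

    return function onMoreResultsClicked() {
        fetchPage(nextOffset, pageSize, function(records) {
            uploader.addInitialFiles(records);  // append to the existing list
            nextOffset += records.length;
        });
    };
}
```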

@Korijn
Contributor Author

Korijn commented Sep 17, 2015

That was clear IMO.

@rnicholus
Member

In order to get this out sooner, I've narrowed the TODO list to:

  • Update docs
  • Consider using perf enhancements for dropped/selected file batches too.

I may drop the last item as well if it proves to be too complicated or risky relative to the benefit. If the last item does need to be addressed and it is not addressed in this hotfix, I'll open up a new case.

@Korijn
Contributor Author

Korijn commented Sep 20, 2015

Sounds great!

@rnicholus
Member

I don't think any change to user-submitted files is in order at this time. And since I plan to leave the work mentioned in #1191 for another time, I don't think there is much else to do here. Has anyone interested in this fix tested it out? If not, do you need more time? I plan to release my changes in a 5.3.2 version tomorrow otherwise.

@Korijn
Contributor Author

Korijn commented Oct 8, 2015

Just wanted to let you know that we've confirmed this issue as fixed on our testing servers, and we'll go live with it soon. Thanks for the excellent support!

@rnicholus
Member

great, thanks for the confirmation
