session feature with huge number of files incredibly slow on IE10 and IE11 #1466
The first step to fixing this would be identifying where the perf problem lies. Your case is very much on the outer edge of normal use, so it may be a little while before I am able to look at this further. There are a number of other questions/requests in the queue at the moment that I must attend to first. |
That's OK, I might have a look at it myself before then, otherwise, I'll hear about it later. Thanks for the (incredibly) quick response! |
The files are added using a callback. That callback is filled in by `_addCannedFile`:

```javascript
_addCannedFile: function(sessionData) {
    var id = this._uploadData.addFile({
        uuid: sessionData.uuid,
        name: sessionData.name,
        size: sessionData.size,
        status: qq.status.UPLOAD_SUCCESSFUL
    });

    sessionData.deleteFileEndpoint && this.setDeleteFileEndpoint(sessionData.deleteFileEndpoint, id);
    sessionData.deleteFileParams && this.setDeleteFileParams(sessionData.deleteFileParams, id);

    if (sessionData.thumbnailUrl) {
        this._thumbnailUrls[id] = sessionData.thumbnailUrl;
    }

    this._netUploaded++;
    this._netUploadedOrQueued++;

    return id;
},
```

So there are three methods being called here. The first is `addFile`:

```javascript
addFile: function(spec) {
    var status = spec.status || qq.status.SUBMITTING,
        id = data.push({
            name: spec.name,
            originalName: spec.name,
            uuid: spec.uuid,
            size: spec.size == null ? -1 : spec.size,
            status: status
        }) - 1;

    if (spec.batchId) {
        data[id].batchId = spec.batchId;

        if (byBatchId[spec.batchId] === undefined) {
            byBatchId[spec.batchId] = [];
        }
        byBatchId[spec.batchId].push(id);
    }

    if (spec.proxyGroupId) {
        data[id].proxyGroupId = spec.proxyGroupId;

        if (byProxyGroupId[spec.proxyGroupId] === undefined) {
            byProxyGroupId[spec.proxyGroupId] = [];
        }
        byProxyGroupId[spec.proxyGroupId].push(id);
    }

    data[id].id = id;
    byUuid[spec.uuid] = id;

    if (byStatus[status] === undefined) {
        byStatus[status] = [];
    }
    byStatus[status].push(id);

    uploaderProxy.onStatusChange(id, null, status);

    return id;
},
```

The code doesn't look inefficient (constant running time), except for perhaps the call to `onStatusChange`:

```javascript
onStatusChange: function(id, oldStatus, newStatus) {
    self._onUploadStatusChange(id, oldStatus, newStatus);
    self._options.callbacks.onStatusChange(id, oldStatus, newStatus);
    self._maybeAllComplete(id, newStatus);

    if (self._totalProgress) {
        setTimeout(function() {
            self._totalProgress.onStatusChange(id, oldStatus, newStatus);
        }, 0);
    }
}
```

I guess the … Since we process all … |
I tried solving that with this little adjustment:

```javascript
if (self._totalProgress && self._totalProgressTimeoutId === null) {
    self._totalProgressTimeoutId = setTimeout(function() {
        self._totalProgress.onStatusChange(id, oldStatus, newStatus);
        self._totalProgressTimeoutId = null;
    }, 0);
}
```

But that doesn't help at all. So, I decided to run the profiler in IE10, and it turns out 99% of the time is spent inserting all the items into the DOM (specifically … I must admit, I don't know how to tackle that one. Any advice? |
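For what it's worth, a coalesced version of the total-progress notification could queue the status changes and flush them all in a single deferred pass, instead of one `setTimeout` per file. Note that the adjustment above also drops every status change after the first one in a tick, since only the first call's arguments are forwarded. This is only a sketch, not Fine Uploader code: `makeStatusBatcher` and `onFlush` are made-up names standing in for the real `_totalProgress.onStatusChange` wiring, and `flush` is exposed purely so the logic can be exercised without waiting on real timers.

```javascript
// Coalesce many per-file status notifications into one deferred flush.
// onFlush: callback invoked once per queued change when the batch flushes.
function makeStatusBatcher(onFlush) {
    var pending = [];
    var timeoutId = null;

    function flush() {
        if (timeoutId !== null) {
            clearTimeout(timeoutId);
            timeoutId = null;
        }
        var batch = pending;
        pending = [];
        batch.forEach(function(change) {
            onFlush(change.id, change.oldStatus, change.newStatus);
        });
    }

    return {
        // Queue a change; arm a single timer for the whole batch.
        schedule: function(id, oldStatus, newStatus) {
            pending.push({ id: id, oldStatus: oldStatus, newStatus: newStatus });
            if (timeoutId === null) {
                timeoutId = setTimeout(flush, 0); // one timer, not one per file
            }
        },
        flush: flush
    };
}
```

Unlike the attempted fix, no change is lost here; they are merely delivered together on the next tick.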
Good detective work! Yes, this perhaps can be improved by batching all UI updates to ensure they happen all at once instead of staggered. I suspect this will be a bit tricky though, and this likely does not just affect stored files. Instead, you will probably see the same issue if you select/drop thousands of files in IE10/11. |
Just tested that; you are correct. It does look like it is going to be tough. :/ |
can you remove the main block from the dom while you’re doing the manipulation and then put it back at the end? sometimes that’s the easiest way to speed up dom math.
|
I'm not sure about the design of the templating module FineUploader uses, so I can't predict if it can handle the detached DOM node (for example it might query the DOM in an intermediate step). @rnicholus can you shine some light on this (valuable) suggestion? |
What, specifically, are you looking to temporarily remove from the DOM? Fine Uploader looks for elements given a reference element. That reference element is usually the file list container element - |
i had a function that recalculated things and reapplied classes on a couple of hundred elements, and it was slow. then i found a few articles that said i should detach the dom element, manipulate it, then reattach it once i was done. this sped up that function considerably. i forget the exact amount, but at least 2x and maybe 5x. an article on that basic concept: http://desalasworks.com/article/javascript-performance-techniques/
|
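The detach-manipulate-reattach trick described above can be sketched as a small helper. This is illustrative only; `withDetached` is a made-up name, not part of Fine Uploader or jQuery. It assumes a browser DOM, and it remembers the container's `nextSibling` so reattachment preserves the original position.

```javascript
// Detach a container, run all the expensive DOM work on it, then reattach.
// While detached, child mutations cannot trigger page reflows.
function withDetached(container, mutate) {
    var parent = container.parentNode;
    var next = container.nextSibling;         // remember original position
    parent.removeChild(container);            // detach from the live document
    try {
        mutate(container);                    // batch all mutations here
    } finally {
        parent.insertBefore(container, next); // reattach (next === null appends)
    }
}
```

In a page you would call it as, e.g., `withDetached(fileListEl, function(el) { /* add items */ });`.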
Typical of the jQuery docs, the exact reason why you should perform many … |
|
Caution: we probably do not want to remove the entire file list from the DOM whenever new files are added. This will be a jarring experience for users. Instead, perhaps the new set of file elements can be built-up in a detached state, and then added to the DOM once they are all in place. Don't think this will be trivial though. |
I didn't think so either. A hacky way would be to add the files in small batches using a recursive setTimeout strategy but that doesn't really solve the underlying problem. |
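The "small batches using a recursive setTimeout strategy" mentioned above could look roughly like this. `addInBatches` and `renderBatch` are hypothetical names, not Fine Uploader API; the point is just that each chunk is handed to the UI on its own timer tick, so the browser can repaint and handle input between batches instead of freezing for the whole list.

```javascript
// Split a list into fixed-size batches.
function chunk(items, size) {
    var batches = [];
    for (var i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size));
    }
    return batches;
}

// Render batches one per event-loop tick via a recursive setTimeout.
// renderBatch: callback that adds one batch of files to the UI.
function addInBatches(files, renderBatch, batchSize) {
    var batches = chunk(files, batchSize || 50);
    (function next(i) {
        if (i >= batches.length) {
            return;                            // all batches rendered
        }
        renderBatch(batches[i]);
        setTimeout(function() { next(i + 1); }, 0); // yield before the next batch
    })(0);
}
```

As noted, this keeps the page responsive but does not reduce the total DOM work; it only spreads it out.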
Hello, incidentally I am investigating this same issue. The client may have 3-4K files uploaded, but that could be even larger. When the session/endpoint is pulled, it takes 30 seconds even on a MacBook Pro late 2013 model on Chrome's V8 engine. My profile attempts through Chrome show that the hangup is primarily in the getByClass -> querySelectorAll function called by handleFileItems. Perhaps that will help someone as they dive into the root cause. (I have a slightly edited custom version of FineUploader, so line numbers will not match up perfectly). Hope that helps ~ |
Maybe we can cache the selector results in some places? We could also switch to id's for unique elements. |
@rmckeel Do you know where the bulk of these calls to … are coming from?

@Korijn IDs are, in most cases, to be avoided. They are an anti-pattern. There are a lot of ways to shoot yourself in the foot with IDs. Determining what is "unique" in a document that you do not have full control over is tough. I would also suggest not caching any of these results, as we could potentially run into memory issues. Optimizing these selectors is probably the best course. |
@rnicholus Ray, where is a good place to send a file to you privately? I can share with you my .cpuprofile (and custom fineuploader JS file) that you could load into Chrome. I can't upload anything but images here. |
@rmckeel Thanks for your response. Even with your custom JS/profile data, I'll still want to reproduce locally so I can be sure this is properly fixed. I've updated this case and I'll begin looking into this perf issue ahead of everything else once I have a bigger break in my time this week. |
Thank you, Ray. Excellent support. |
I spent an hour yesterday investigating further. I was able to reproduce the issue. I'm using a MacBook Pro late-2014 model w/ 16 GB RAM, i7, etc. I set up my endpoint to return an initial file list of 3,000 entries, consisting solely of … Using Chrome's profiler, I identified a huge number of calls to …

**Too many DOM elements**

Each file entry in the DOM is made up of at least 20 elements. If the initial file list consists of 3,000 entries, this means 60,000 extra DOM elements! That is a huge number of elements, and it will likely have a noticeable perf impact on most selectors. Putting performance aside, you can't expect your users to want to scroll through 3,000 files. IMHO, your initial files endpoint should never return anything close to 3,000 files. Instead, consider providing a small subset of these files. If you really expect your users to page through all of the files, you should return them in pages. I realize this is not very easy with the current API, but it could be much easier if I added an API method that allowed you to add more initial files to the list. You could then pull the first set on load via Fine Uploader's built-in code, and then request subsequent pages yourself on demand, passing these into the new API method for Fine Uploader to render.

**Too many element selector calls**

For each file added to the list, there are perhaps a dozen calls to the document to select various parts of the file or file list. For a "normal" number of files in a batch, this is likely not an issue, but at scale it has noticeable perf consequences. The best solution is probably to detect when we are handling a batch of files and maintain a cache of file IDs to file HTML elements. The internal templating module selectors can grab from this cache instead of hitting the DOM. The cache will be cleared after we are done processing the batch.

**Inefficient element selectors**

There were quite a few calls like this: …

**Too many page reflows/repaints**

In a batch of 3,000 files, each file is added to the DOM separately. This results in a huge number of reflows/repaints, which is a perf issue. Instead, perhaps the code can be adjusted to build up the list of file elements detached from the document, and then add the entire batch at once when we are done processing the batch of files.

I am working to make the changes described above now. This will be a bit of work, but progress will be reported here as it happens. After all of the changes are complete, I'll make a new pre-release available for testing. |
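The "build detached, attach once" idea from the last point can be sketched with a `DocumentFragment`. `renderFileList` is a made-up helper, not Fine Uploader's actual template code, and the item markup is reduced to a bare `li` for brevity; the `doc` parameter is injected only so the sketch can be exercised outside a browser (in a page, you would simply pass `document`).

```javascript
// Build every file item inside a detached DocumentFragment, then append the
// fragment to the live list in a single operation, so the page reflows once
// instead of once per file.
function renderFileList(files, listEl, doc) {
    var fragment = doc.createDocumentFragment();
    files.forEach(function(file) {
        var item = doc.createElement("li");
        item.textContent = file.name;  // real code would build the full file template
        fragment.appendChild(item);    // appended while detached: no reflow
    });
    listEl.appendChild(fragment);      // one insertion, one reflow
}
```

Appending a fragment moves its children into the target, so the cost of the live-document insertion is paid exactly once regardless of the batch size.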
excellent news. while i agree that this is the extreme case, i think the end result will be a much more polished feel even for the single-file case. it's rare that you have a chance to streamline the DOM for your layout. you might end up dropping a few elements and possibly finding ways to do some of the js logic in pure css for even more wins. as i was looking at some of your samples, i wondered about making a version with font awesome icons vs the images you are currently using. a switch to fa fa-trash may make it possible to make all your buttons one element instead of two, plus save on the img download entirely. i didn't try to prove it would be much faster, but maybe someone can shed some light on this or other performance tweaks.
|
I have no plans to do this at the moment. These types of changes could be considered breaking changes, and I intend to make this a hotfix release.
No plans to tie Fine Uploader to any specific 3rd-party dependency. If you wish to use different icons, you can certainly do this via CSS overrides. |
I just pushed some changes up to the hotfix/5.3.2 branch. Without these changes, a 5,000-item initial file list pretty much crashes Chrome/Firefox: spinners, constant long-running-script warnings, etc., for quite a while. With my changes, the file list loads in a few seconds. That's a pretty substantial improvement. Profiling suggests that there are no obvious JS-specific or DOM-specific bottlenecks that can be further improved. The last piece of this solution is to simply not load 3,000+ files for your users to sort through. And if you must do this, do so in pages. The work remaining in this case:
|
In my case, I also use the list of files to filter duplicates/prevent files from being uploaded twice, using MD5 hashes. So I do need the whole list, but I could just not add them all to the UI, as you say. Excellent changes, by the way. Looking forward to the release! |
Is there any interest in an API method that allows you to pass in additional "initial"/canned files after Fine Uploader has loaded? This will add a bit more complexity, but has already been requested in #1191. |
Besides the situation you described earlier, such a method could also be used to push files through WebSockets, so multiple users (perhaps not even FineUploader but some other process) could upload files into the same container and see each other's uploads as they come in. |
To answer your question, I would probably not use it in the near future, if your current work already fixes the current issue at hand. Perhaps the requirement will come up later though. |
@rnicholus Ray, my vote (on behalf of several client sites that use FineUploader) is no interest in passing in lists of files after load. There is no change that would warrant it within a single session. I suppose group uploading would be a good reason to add that feature, I just wouldn't expect group uploading to be that common! :) Thanks ~ |
Hi everyone. Just a reminder that this new proposed api method isn't meant …
|
That was clear IMO. |
In order to get this out sooner, I've narrowed the TODO list to:
I may drop the last item as well if it proves to be too complicated or risky relative to the benefit. If the last item does need to be addressed and it is not addressed in this hotfix, I'll open up a new case. |
Sounds great! |
I don't think any change to user-submitted files is in order at this time. And since I plan to leave the work mentioned in #1191 for another time, I don't think there is much else to do here. Has anyone interested in this fix tested it out? If not, do you need more time? I plan to release my changes in a 5.3.2 version tomorrow otherwise. |
Just wanted to let you know that we've confirmed this issue as fixed on our testing servers, and we'll go live with it soon. Thanks for the excellent support! |
great, thanks for the confirmation |
It frequently occurs in our case that FineUploader has to load ~1000 initial files through the session feature upon start. Adding these to the uploader and UI takes about 20-30 seconds on IE10 and IE11, and the page doesn't respond at all during this time. Anything we can do about this?

As usual, I'm open to writing a PR to fix this if you can give me a hint on how to approach this.