Generating huge files on client #394
Screw-FileReader is a cross-browser solution for turning a blob into a web stream. Another solution is to turn that blob into a stream using `var stream = new Response(blob).body`. The advantage of web streams is that you don't have to include the whole Node stream + Node buffer libs in the browser bundle.
Either way, this might also be interesting to you: #343
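A minimal sketch of the `Response` trick mentioned above, assuming a browser with the Fetch API and WHATWG streams; the blob contents here are just an illustration:

```js
// Wrapping a Blob in a Response exposes its contents as a WHATWG
// ReadableStream, with no Node stream/buffer shims in the bundle.
const blob = new Blob(["hello ", "world"], { type: "text/plain" });
const stream = new Response(blob).body; // ReadableStream<Uint8Array>

// Read it chunk by chunk.
const reader = stream.getReader();
reader.read().then(function step({ done, value }) {
  if (done) return;
  console.log("got a chunk of", value.byteLength, "bytes");
  return reader.read().then(step);
});
```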
This code:
controls the output, but what are the methods to control the input? My input isn't this:
It is this:
So I managed to cut the huge file up into small pieces using a loop. However, how do I insert those small pieces into the zip?
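For illustration, a hedged sketch of the kind of slicing loop being described; `file` is assumed to be a `File`/`Blob` from an input or drag-and-drop, and the chunk size is an arbitrary example value:

```js
// Blob.slice() is cheap: it creates views into the file without
// actually reading any data into memory yet.
const chunkSize = 1024 * 1024; // 1 MiB per piece (arbitrary)
const pieces = [];
for (let start = 0; start < file.size; start += chunkSize) {
  pieces.push(file.slice(start, start + chunkSize));
}
```

Note that, per the JSZip docs, `zip.file(name, data)` accepts a `Blob` directly in the browser, so the open question is less how to hand the pieces over and more whether generation itself will stream them, which is what the rest of this thread is about.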
Hi @Pacerier and @jimmywarting,

Goal: allow a user to upload multiple files/folders of any size, via drag-and-drop, from their local file system into their browser.

Proposed solution: create a zip file of the desired files/folders and chunk it up into multiple POSTs to the server. The server then assembles these chunks into a single zip file that can be extracted.

I have gotten this to work, and it works very well with a small enough set of items, but...

Problem: zip.file() ends up reading all of the file data into an ArrayBuffer in memory as it prepares each file (https://github.com/Stuk/jszip/blob/master/dist/jszip.js#L3471). This balloons the browser's memory use. A single file over 4GB will break things completely, and a large set of smaller items will use memory equal to the total size of the files (8GB of files = 8GB of memory use, for example).

Is there a way to avoid this? The ability of generateInternalStream to stream with streamFiles: true is great, but it doesn't seem very useful when the files added to the zip are all loaded into memory beforehand.
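For context, a hedged sketch of the output-side streaming referred to here. The `/upload` endpoint and the sequence counter are hypothetical illustration details, and note this only streams the generated output; it does not address the input-side buffering this comment identifies as the real problem:

```js
// Stream the generated zip and POST each emitted chunk to the server.
// "/upload" and the seq counter are hypothetical, for illustration only.
const zip = new JSZip();
zip.file("big.bin", someBlob); // someBlob: a File/Blob from drag-and-drop

let seq = 0;
zip.generateInternalStream({ type: "uint8array", streamFiles: true })
  .on("data", (chunk, metadata) => {
    fetch("/upload?seq=" + seq++, { method: "POST", body: chunk });
  })
  .on("error", (err) => console.error(err))
  .on("end", () => console.log("zip fully generated"))
  .resume(); // the StreamHelper starts paused; resume() kicks it off
```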
Thanks @jimmywarting! This looks like exactly what I figured should be happening. They should definitely merge this in.
How do you pipe huge files on the client?
I'm cutting files using a loop of:
but where is the place to input the output string? How do we pipe the output string to the `.pipe` of:

Aside from that, I'm getting the error `nodestream is not supported by this platform`.
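On the `nodestream is not supported by this platform` error: as far as I can tell, `generateNodeStream` requires Node's `stream` module, so in the browser you would reach for `generateAsync` or `generateInternalStream` instead. A hedged browser-side sketch:

```js
// generateAsync resolves with a single Blob once the zip is complete;
// streamFiles: true streams file data internally during generation.
zip.generateAsync({ type: "blob", streamFiles: true }).then((blob) => {
  const url = URL.createObjectURL(blob);
  // e.g. point an <a download="archive.zip"> element at this URL
});
```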