When I want to deliver a zip that contains a lot of files:

```
# find . | wc -l
2192
```
I get this error:

```
2015/03/21 14:04:11 [crit] 2312#0: *2213 open() "/home/XXX" failed (24: Too many open files), client: XXX.XXX.XXX.XXX, server: ~^XXX$, request: "GET /XXX HTTP/1.1", subrequest: "/XXX", host: "XXX"
```
I am able to download the whole zip if I resume the download; with a fast connection I can download more bytes per retry than with a slow one.
Issue #16 notwithstanding, I am also seeing this behavior with nginx 1.8.1 and mod_zip 1.1.6, with no CRC checksums in the manifest.
As a workaround, I used two copies of nginx. One runs mod_zip; instead of reading each file directly in a `location` block, it proxies the subrequest to a second nginx instance, which sends the file contents back. This seems to get around the open-files limit: at least I could reliably make the same download package fail one way and succeed the other.
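A minimal sketch of that two-instance setup, assuming placeholder paths and ports (the backend address `127.0.0.1:8081` and the `/files/` prefix are illustrative, not from the original report):

```nginx
# Instance A: runs mod_zip. File subrequests are proxied instead of
# being opened from disk by this process, so the descriptors are held
# by instance B, which closes them when each proxied request finishes.
location /files/ {
    proxy_pass http://127.0.0.1:8081;   # hypothetical backend address
}

# Instance B: a separate nginx process that serves the actual files.
server {
    listen 127.0.0.1:8081;
    location /files/ {
        root /home;   # placeholder path
    }
}
```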
We could increase the open-files limit, but with our usage there may be many simultaneous downloads, each with many files, so how well that would work in the long run is questionable.
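For reference, raising the limit would look something like this (the value 65535 is an arbitrary example; the OS-level limit, e.g. `ulimit -n` or a systemd `LimitNOFILE` setting, may also need raising):

```nginx
# Raise the per-worker cap on open file descriptors.
worker_rlimit_nofile 65535;

events {
    worker_connections 4096;
}
```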
It seems nginx doesn't close the file descriptors opened by subrequests as long as the main request is active. I didn't dig too deeply into the nginx architecture, but it should be possible to close each file without breaking the iteration over the subrequest cycle.
evanmiller changed the title from "mod_zip opens excessively many files for zip files with many files" to "File descriptors are not closed until all subrequests have finished" on Oct 7, 2017.