When uploading a large file (I used an Ubuntu 24.04 Desktop ISO, 6 GB in size, for testing) with the NGINX bouncer enabled, about 2 GB of data is uploaded and then the upload fails, making it impossible to upload large files. Making the situation worse, each time the upload fails, the chunks in /var/lib/nginx/body/ are not deleted, eventually filling the server's disk; I had to clear the directory manually to free up disk space.
The only thing interesting I found in my logs was this:
[lua] memory allocation error
I'm not using AppSec, so the request body shouldn't be inspected, but this behavior implies that it is being inspected anyway.
When I was working on getting the required information for this issue, I wasn't able to reproduce it again for some reason. It must happen under a certain condition or configuration. I will update this issue if I can reproduce it again and do so consistently.
Server config:
NGINX version: 1.24.0
Server OS: Ubuntu 24.04
Bouncer version: 1.0.8
Server is a reverse proxy.
The only time outside of AppSec that we attempt to read the body is when the user has a captcha remediation and is in a "pending" state waiting to be verified, e.g. we are waiting for the response to the captcha form.
We should realistically check the body size before attempting to read it, since a response to a captcha form should generally be a couple of MB (or even just KBs) at most; anything above that we should ignore. If it turns out that this is what happened, a case where it could occur is when you have a local client that sends a request but doesn't handle the captcha response, so it sends a POST while ignoring the captcha form.
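The guard described above could look roughly like this in the bouncer's captcha-verification path. This is a minimal sketch, not the bouncer's actual code: the 1 MB cap and the placement of the check are assumptions, though `ngx.var.content_length`, `ngx.req.read_body()`, and `ngx.req.get_post_args()` are standard lua-nginx-module APIs.

```lua
-- Hypothetical guard before reading the body of a "pending" captcha request.
-- A real captcha form reply is tiny, so skip body parsing for anything large
-- or anything that is not a form submission.
local MAX_CAPTCHA_BODY = 1024 * 1024  -- illustrative 1 MB cap (assumption)

local content_length = tonumber(ngx.var.content_length) or 0
local content_type = ngx.var.content_type or ""

if content_length > MAX_CAPTCHA_BODY
   or not content_type:find("application/x-www-form-urlencoded", 1, true) then
    -- Too large or not a form post: leave the user in the pending state
    -- instead of buffering the whole body (the apparent cause of the
    -- "[lua] memory allocation error" on multi-GB uploads).
    return
end

ngx.req.read_body()
local args = ngx.req.get_post_args()
-- ...validate the captcha response found in `args`...
```

With a check like this, a 6 GB PUT from a WebDAV client would fall through the early return and never be buffered by the Lua layer.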
@LaurenceJJones I forgot to mention I was also told this in the Discord, but as I said, the behavior implies that the body is being parsed. The user didn't have a pending captcha remediation, but I still got the out-of-memory error 🤷
I have a hunch it's because Nextcloud isn't setting the content type, but I haven't tested that yet. Or maybe it's an issue with WebDAV methods like PUT?
Might be related to: #71