
[lua] memory allocation error when uploading large files (6GB) to Nextcloud #89

Open
GNU-Plus-Windows-User opened this issue Dec 8, 2024 · 2 comments

Comments

@GNU-Plus-Windows-User

When uploading a large file (I used an Ubuntu 24.04 Desktop ISO for testing, 6GB in size) with the NGINX bouncer enabled, about 2GB of data is uploaded and then the upload fails, making it impossible to upload large files. Making the situation worse, each time the upload failed, the chunks in /var/lib/nginx/body/ were not deleted, which eventually filled the server's disk; I had to clear them manually to free up disk space.

The only interesting thing I found in my logs was this:

[lua] memory allocation error

I'm not using AppSec, so the request body shouldn't be inspected, but this behavior somehow implies that it is being inspected.

While gathering the required information for this issue, I wasn't able to reproduce it again for some reason. It must happen only under a certain condition or configuration. I will update this issue if I can reproduce it again consistently.

Server config:
NGINX version: 1.24.0
Server OS: Ubuntu 24.04
Bouncer version: 1.0.8
Server is a reverse proxy.

Might be related to: #71

@LaurenceJJones
Contributor

Hey 👋🏻

The only time outside of AppSec that we attempt to read the body is when the user has a captcha remediation and is in a "pending" state waiting to be verified, e.g. waiting for the response to the captcha form.

Realistically, we should check the body size before attempting to read it: a response to a captcha form should generally be a couple of MB (even just KBs) at most, and anything above that we should ignore. If this is indeed what happened, one way it could occur is a local client that sends requests but doesn't handle the captcha response, so it sends a POST while ignoring the captcha form. A rough sketch of such a size check is below.
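For illustration, a minimal sketch of that guard against the OpenResty/lua-nginx-module API (the size limit and function name here are illustrative, not the bouncer's actual code):

```lua
-- Minimal sketch: cap how large a body we are willing to buffer when
-- verifying a pending captcha. The limit and names are illustrative.
local MAX_CAPTCHA_BODY_BYTES = 16 * 1024 -- a captcha response is KBs at most

local function read_captcha_body()
  local content_length = tonumber(ngx.var.http_content_length) or 0
  if content_length > MAX_CAPTCHA_BODY_BYTES then
    -- Far too large to be a captcha form response (e.g. a 6GB upload);
    -- skip the read entirely instead of buffering it in memory.
    return nil
  end
  ngx.req.read_body()            -- buffers the request body
  return ngx.req.get_body_data() -- nil if nginx spooled the body to disk
end
```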

@GNU-Plus-Windows-User
Author

@LaurenceJJones I forgot to mention that I was also told this in the Discord, but as I said, the behavior implies that the body is being parsed. The user didn't have a pending captcha remediation, but I still got the out-of-memory error 🤷

I have a hunch it's because Nextcloud isn't setting the content type, but I haven't tested that yet. Or maybe it's an issue with WebDAV methods like PUT?
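If that hunch holds, a follow-up guard along the same lines could be to only treat POSTs as potential captcha responses (again a hedged sketch against the lua-nginx-module API, not the bouncer's actual logic):

```lua
-- Hypothetical sketch: a captcha form response arrives as a POST, so
-- WebDAV upload methods such as PUT or MKCOL could be skipped outright.
if ngx.req.get_method() ~= "POST" then
  return nil -- not a captcha submission; leave the body untouched
end
```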
