When calling the Lambda for hundreds of files at once, I was seeing failures from invocations running out of the available 640 MB of memory. I raised the memory limit to 1024 MB and still hit out-of-memory failures. I eventually discovered the cause: files from previous runs were being left in the /tmp directory of a Lambda container that AWS was reusing. I've temporarily worked around this by emptying /tmp at the end of lambda_function.py, just before the response is returned.
import shutil
shutil.rmtree('/tmp', ignore_errors=True)
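One caveat with calling `rmtree` directly on `/tmp` is that it also attempts to remove the directory itself, relying on `ignore_errors=True` to swallow the failure. A minimal sketch of a slightly safer variant (the helper name `clear_tmp` is my own, not from the original workaround) deletes only the entries inside the directory, leaving `/tmp` in place for the next invocation of a warm container:

```python
import os
import shutil

def clear_tmp(path="/tmp"):
    """Remove the contents of `path` without deleting the directory itself.

    Intended to run at the end of a Lambda handler so leftover files from
    this invocation don't accumulate in a reused (warm) container.
    """
    for entry in os.listdir(path):
        entry_path = os.path.join(path, entry)
        try:
            if os.path.isdir(entry_path) and not os.path.islink(entry_path):
                # Recursively delete subdirectories left by earlier runs.
                shutil.rmtree(entry_path, ignore_errors=True)
            else:
                # Plain files and symlinks are unlinked individually.
                os.remove(entry_path)
        except OSError:
            # An entry may disappear between listdir() and removal; skip it.
            pass
```

In the handler this would be called just before returning the response, in the same spot the one-line `rmtree` workaround sits now.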