Memory issues with long crawls #258
Unanswered
claytongray asked this question in Q&A
Allowed memory size of 1073741824 bytes exhausted (tried to allocate 836767744 bytes)

I currently run Roach to crawl a very large data set inside a queue worker. Sometimes I get memory allocation errors when a page has a very large set of data and many sub-pages. Is there a better way to handle crawls that can get this large? I've increased the allowed memory to 1.5 GB, but it still errors out.
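For crawls that large, the knobs that usually matter are how many responses are in flight at once and whether sub-pages are yielded as follow-up requests rather than collected in one pass. Below is a minimal sketch, assuming the `$concurrency` and `$requestDelay` properties and the `request()`/`item()` helpers documented for Roach's `BasicSpider`; the spider class name, start URL, and CSS selectors are placeholders, not anything from this thread.

```php
<?php

namespace App\Spiders;

use Generator;
use RoachPHP\Http\Response;
use RoachPHP\Spider\BasicSpider;

class LargeSiteSpider extends BasicSpider
{
    public array $startUrls = [
        'https://example.com/catalog', // placeholder start URL
    ];

    // Fewer concurrent requests means fewer in-flight responses
    // held in memory at the same time during a long crawl.
    public int $concurrency = 1;

    // A per-request delay (in seconds) keeps the downloader from
    // building up a large backlog of pending responses.
    public int $requestDelay = 1;

    public function parse(Response $response): Generator
    {
        // Yield follow-up requests one at a time instead of
        // collecting every sub-page into an array first.
        foreach ($response->filter('a.sub-page')->links() as $link) {
            yield $this->request('GET', $link->getUri(), 'parseSubPage');
        }
    }

    public function parseSubPage(Response $response): Generator
    {
        // Yield one small item per page rather than aggregating
        // the whole data set inside the spider.
        yield $this->item([
            'title' => $response->filter('h1')->text(),
        ]);
    }
}
```

Worth noting on the Laravel side: `php artisan queue:work --memory=1536` only tells the worker to restart itself after finishing a job once it is above that threshold; the fatal "Allowed memory size exhausted" error comes from PHP's own `memory_limit`, so that is the setting that has to cover the largest single crawl.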
Replies: 1 comment

Can probably close this. It seemed to be an issue with error handling in my Laravel jobs. All is good now.
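The thread doesn't say what the error-handling problem actually was, so this is only a guess at the shape of the fix: a queued crawl job that doesn't auto-retry (re-running a huge crawl repeats all of the allocation that just failed) and reports failures from `failed()` instead of letting them disappear. The job and spider class names are hypothetical.

```php
<?php

namespace App\Jobs;

use App\Spiders\LargeSiteSpider; // hypothetical spider from the sketch above
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Log;
use RoachPHP\Roach;
use Throwable;

class RunLargeCrawl implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // Don't retry a huge crawl automatically; fail once and
    // surface the error instead.
    public int $tries = 1;

    public function handle(): void
    {
        Roach::startSpider(LargeSiteSpider::class);
    }

    // Laravel calls this when the job throws or exhausts its attempts.
    public function failed(Throwable $exception): void
    {
        Log::error('Crawl job failed', ['message' => $exception->getMessage()]);
    }
}
```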