Which package is this bug report for? If unsure which one to select, leave blank
@crawlee/core
Issue description
While using a router from `createCheerioRouter`, after scraping a few hundred pages from a website, the error `Failed to prolong lock for cached request` occurred a couple of times. I ran the code with Node 18 and crawlee@3.11.2, then with Node 20 and crawlee@3.11.2, and got the same error at different times. I have used the same code previously without this error; the last time I ran it was in 2024-07, but I'm not sure whether I was using a crawlee version <= 3.10. When setting `experiments: { requestLocking: false }`, there was no error.
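The code-sample field below is empty, so here is a minimal sketch of the setup described above, assuming a standard CheerioCrawler wired to a router; the handler body, selector-free `enqueueLinks()` call, and start URL are illustrative, and the `experiments: { requestLocking: false }` line is the workaround mentioned in the description:

```ts
import { CheerioCrawler, createCheerioRouter } from 'crawlee';

const router = createCheerioRouter();

// Default handler: log the page and enqueue further links (placeholder logic).
router.addDefaultHandler(async ({ request, enqueueLinks, log }) => {
    log.info(`Processing url: ${request.url}`);
    await enqueueLinks();
});

const crawler = new CheerioCrawler({
    requestHandler: router,
    // Workaround from the description: disabling the request-locking
    // experiment makes the "Failed to prolong lock" errors go away.
    experiments: { requestLocking: false },
});

await crawler.run(['https://cl.puma.com/']);
```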
WARN RequestQueue2(c4ff936c-6e71-40dc-8022-9b3bcf3175d4, default): Failed to prolong lock for cached request ZBjmM6IptmoMC54, either lost the lock or the request was already handled
{"err":{"name":"Error","message":"ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json'","stack":"Error: ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json'\n at async open (node:internal/fs/promises:639:25)\n at async readFile (node:internal/fs/promises:1242:14)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:62:40\n at async lockAndCallback (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/background-handler/fs-utils.js:56:16)\n at async RequestQueueFileSystemEntry.get (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:61:20)\n at async RequestQueueClient.prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/resource-clients/request-queue.js:237:33)\n at async RequestQueue._prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:259:25)\n at async RequestQueue.getOrHydrateRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:236:27)\n at async RequestQueue.fetchNextRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:106:25)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/basic/internals/basic-crawler.js:808:23","errno":-2,"code":"ENOENT","syscall":"open","path":"/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json"}}
WARN RequestQueue2(c4ff936c-6e71-40dc-8022-9b3bcf3175d4, default): Failed to prolong lock for cached request AUr4h6wyQk65X2g, either lost the lock or the request was already handled
{"err":{"name":"Error","message":"ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/AUr4h6wyQk65X2g.json'","stack":"Error: ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/AUr4h6wyQk65X2g.json'\n at async open (node:internal/fs/promises:639:25)\n at async readFile (node:internal/fs/promises:1242:14)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:62:40\n at async lockAndCallback (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/background-handler/fs-utils.js:56:16)\n at async RequestQueueFileSystemEntry.get (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:61:20)\n at async RequestQueueClient.prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/resource-clients/request-queue.js:237:33)\n at async RequestQueue._prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:259:25)\n at async RequestQueue.getOrHydrateRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:236:27)\n at async RequestQueue.fetchNextRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:106:25)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/basic/internals/basic-crawler.js:808:23","errno":-2,"code":"ENOENT","syscall":"open","path":"/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/AUr4h6wyQk65X2g.json"}}
Dataset item Count: 845
Processing url: https://cl.puma.com/zapatillas-de-training-pwrframe-tr-3-para-mujer-379560-02.html
Price didn't change, but the product still exists https://cl.puma.com/zapatillas-de-training-pwrframe-tr-3-para-mujer-379560-02.html
Dataset item Count: 846
WARN RequestQueue2(c4ff936c-6e71-40dc-8022-9b3bcf3175d4, default): Failed to prolong lock for cached request 9ptafWkK75u4Dfx, either lost the lock or the request was already handled
{"err":{"name":"Error","message":"ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/9ptafWkK75u4Dfx.json'","stack":"Error: ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/9ptafWkK75u4Dfx.json'\n at async open (node:internal/fs/promises:639:25)\n at async readFile (node:internal/fs/promises:1242:14)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:62:40\n at async lockAndCallback (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/background-handler/fs-utils.js:56:16)\n at async RequestQueueFileSystemEntry.get (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:61:20)\n at async RequestQueueClient.prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/resource-clients/request-queue.js:237:33)\n at async RequestQueue._prolongRequestLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:259:25)\n at async RequestQueue.getOrHydrateRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:236:27)\n at async RequestQueue.fetchNextRequest (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:106:25)\n at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/basic/internals/basic-crawler.js:808:23","errno":-2,"code":"ENOENT","syscall":"open","path":"/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/9ptafWkK75u4Dfx.json"}}
ERROR CheerioCrawler:AutoscaledPool: isTaskReadyFunction failed
Error: ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json'
at async open (node:internal/fs/promises:639:25)
at async readFile (node:internal/fs/promises:1242:14)
at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:62:40
at async lockAndCallback (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/background-handler/fs-utils.js:56:16)
at async RequestQueueFileSystemEntry.get (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:61:20)
at async RequestQueueClient.listAndLockHead (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/resource-clients/request-queue.js:207:33)
at async RequestQueue._listHeadAndLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:163:26)
at async RequestQueue.ensureHeadIsNonEmpty (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:160:9)
at async RequestQueue.isEmpty (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_provider.js:467:9)
at async CheerioCrawler._isTaskReadyFunction (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/basic/internals/basic-crawler.js:944:38)
node:internal/modules/run_main:129
triggerUncaughtException(
^
Error: ENOENT: no such file or directory, open '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json'
at async open (node:internal/fs/promises:639:25)
at async readFile (node:internal/fs/promises:1242:14)
at async /home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:62:40
at async lockAndCallback (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/background-handler/fs-utils.js:56:16)
at async RequestQueueFileSystemEntry.get (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/fs/request-queue/fs.js:61:20)
at async RequestQueueClient.listAndLockHead (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/memory-storage/resource-clients/request-queue.js:207:33)
at async RequestQueue._listHeadAndLock (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:163:26)
at async RequestQueue.ensureHeadIsNonEmpty (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_queue_v2.js:160:9)
at async RequestQueue.isEmpty (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/core/storages/request_provider.js:467:9)
at async CheerioCrawler._isTaskReadyFunction (/home/mibu/Work/Web_projects/scrappers/node_modules/@crawlee/basic/internals/basic-crawler.js:944:38) {
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '/home/mibu/Work/Web_projects/scrappers/storage/request_queues/default/ZBjmM6IptmoMC54.json'
}
Node.js v20.17.0
Code sample
No response
Package version
3.11.2
Node.js version
20.17.0
Operating system
Manjaro
Apify platform
Tick me if you encountered this issue on the Apify platform
I have tested this on the next release
No response
Other context
No response