Multiple crawlers in different folders #1380
Answered by dadoonet

asafeclemente asked this question in Q&A
I have several folders, and each one needs to go into its own Elasticsearch index. To do that I need to instantiate several FSCrawler services, but because of how many there are, I can't run them all at the same time. Is there any way for a crawler to shut down automatically when it finishes indexing documents? Or is there a smarter way to handle this process?
Answered by dadoonet on Feb 8, 2022
Yes. You can use the `--loop 1` option. It will run once and exit. See https://fscrawler.readthedocs.io/en/fscrawler-2.9/user/options.html
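Not part of the accepted answer, but as a minimal sketch of how the sequential run could be driven with that option: a small Python script that launches each FSCrawler job with `--loop 1` and waits for it to exit before starting the next one. The job names are hypothetical placeholders, and it assumes the `fscrawler` binary is on the PATH.

```python
import subprocess
import sys

# Hypothetical FSCrawler job names, one per folder/index.
jobs = ["folder_a", "folder_b", "folder_c"]

for job in jobs:
    print(f"Running FSCrawler job: {job}")
    # --loop 1 tells FSCrawler to perform a single scan loop and then exit,
    # so each crawler finishes before the next one starts.
    result = subprocess.run(["fscrawler", job, "--loop", "1"])
    if result.returncode != 0:
        print(f"Job {job} exited with code {result.returncode}", file=sys.stderr)
```

Because `subprocess.run` blocks until the process exits, the jobs run strictly one at a time, which avoids having all crawlers up simultaneously.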
Answer selected by dadoonet