I have a file named sites.txt containing a list of URLs, one per line (in the example below, https://example.com and https://google.com). Using katana I want to scan them, but the default -l flag won't work for me because I need to set a crawl duration for each URL with the -ct flag. I tried the following:
cat sites.txt | while read URL; do katana -u $URL -ct 10s; done
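To be explicit about what I expect each part to do (this is just my understanding of the flags, happy to be corrected):

# read sites.txt line by line and crawl each URL for at most 10 seconds, one at a time
cat sites.txt | while read -r URL; do
  katana -u "$URL" -ct 10s   # -u: target URL, -ct: maximum crawl duration
done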
Output:
$ cat sites.txt | while read URL; do katana -u $URL -ct 10s; done
__ __
/ /_____ _/ /____ ____ ___ _
/ '_/ _ / __/ _ / _ \/ _ /
/_/\_\\_,_/\__/\_,_/_//_/\_,_/
projectdiscovery.io
[INF] Current katana version v1.1.0 (latest)
[INF] Started standard crawling for => https://example.com
[INF] Started standard crawling for => https://google.com
As you can see from the output above, katana starts crawling both sites immediately instead of waiting for the first one to finish or applying any limit. This can cause problems, for example:
If the file contains many sites, say 1000, my system will be overwhelmed and crash almost immediately, since katana starts crawling all of them at once with no limit.
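My guess (and it is only a guess) is that katana may also be picking up the remaining URLs from the pipe's stdin, since projectdiscovery tools usually accept stdin input. If that is the case, would redirecting katana's stdin be the right way to force one URL per run? A sketch of what I mean, untested:

# read the list from a file redirect instead of a pipe, and give katana an empty stdin
# so it can only see the single URL passed via -u (assuming it otherwise reads stdin)
while read -r URL; do
  katana -u "$URL" -ct 10s < /dev/null
done < sites.txt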
I also checked whether this approach actually gives the desired result, i.e. "crawl each site for a maximum of 10s", but it didn't. See the output below:
$ time cat sites.txt | while read URL; do katana -u $URL -ct 10s; done
__ __
/ /_____ _/ /____ ____ ___ _
/ '_/ _ / __/ _ / _ \/ _ /
/_/\_\\_,_/\__/\_,_/_//_/\_,_/
projectdiscovery.io
[INF] Started standard crawling for => https://google.com
[INF] Started standard crawling for => https://example.com
https://google.com
https://example.com
real 0m15.171s
user 0m0.089s
sys 0m0.027s
As you can see, the command took only about 15s in total, but it should have taken more than 20s if the sites had been crawled one after the other (10s + 10s). So this approach didn't work.
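Alternatively, if running strictly one site at a time is not the right approach, I would also be fine with just capping concurrency for the bigger list. Is something like this a sane way to do it? (Untested sketch, assuming GNU xargs is available.)

# run at most 2 katana processes at a time, each with a 10s crawl budget,
# with stdin redirected from /dev/null so katana cannot read extra URLs from it
xargs -a sites.txt -P 2 -I {} katana -u {} -ct 10s < /dev/null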
I would be grateful if anyone could point me to a fix for this. Thanks, and sorry: I am a complete newbie with Go, so I was unable to work out a solution by myself.