Simple web crawler that crawls a website and organises URLs by page extension.

Links-Crawler

Simple web crawler for finding endpoints.

Features:

  1. All crawled URLs are organised by page extension.
  2. All parameters seen for the same URL are grouped and displayed together.
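The two features above can be sketched with the Python standard library. This is a minimal illustration of the organising idea, not the tool's actual code; the `organise` helper and the sample URLs are hypothetical.

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl
import posixpath

def organise(urls):
    """Hypothetical helper: group crawled URLs by page extension and
    merge the query-parameter names seen for each base URL."""
    # extension -> {base URL without query -> set of parameter names}
    by_ext = defaultdict(dict)
    for url in urls:
        parsed = urlparse(url)
        # Take the extension from the path; pages without one go in "(none)"
        ext = posixpath.splitext(parsed.path)[1] or "(none)"
        base = f"{parsed.scheme}://{parsed.netloc}{parsed.path}"
        params = by_ext[ext].setdefault(base, set())
        params.update(name for name, _ in parse_qsl(parsed.query))
    return by_ext

links = organise([
    "https://example.com/index.php?id=1",
    "https://example.com/index.php?page=2",
    "https://example.com/app.js",
])
# Both index.php hits collapse to one entry listing the params id and page,
# filed under ".php"; app.js is filed under ".js" with no parameters.
```

This is why the output groups `?id=` and `?page=` under a single `index.php` entry instead of listing the URL twice.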

# Running from terminal

(screenshot: running the crawler from a terminal)

# Accessing crawled links

(screenshots: crawled links organised by extension, with grouped parameters)

# Installation

pip3 install nyawc

git clone https://github.com/rakeshmane/Links-Crawler.git

cd Links-Crawler

python3 Links_Crawler.py
