Basic overview • Key features • Technology used • How to use • Project overview • You may also like
The application's sole purpose is to track the prices of products available on Tunisian e-commerce websites and present the user with the lowest one. An automated scraping script supplies the product data.
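That core behaviour can be sketched in a few lines; the data shape and helper below are illustrative assumptions, not the project's actual code:

```python
# Illustrative sketch: given price listings scraped from several sites,
# return the cheapest offer for a product. Field names are assumptions.
def lowest_price(offers):
    """offers: list of dicts like {"site": str, "price": float}."""
    return min(offers, key=lambda o: o["price"])

offers = [
    {"site": "site-a.tn", "price": 1250.0},
    {"site": "site-b.tn", "price": 1199.0},
    {"site": "site-c.tn", "price": 1320.0},
]
print(lowest_price(offers))  # the site-b.tn offer
```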
- Scraping popular Tunisian e-commerce websites with the Scrapy web-crawling framework
- Hosting scrapyd (a service that runs Scrapy spiders) on the Heroku cloud
- Using cron jobs to automate the daily scraping process
- Ranking relevant search results with MongoDB's Atlas Search service, using React on the frontend and Flask as the backend server
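The daily automation described above (a cron job triggering the hosted spiders) can go through scrapyd's HTTP JSON API, whose `schedule.json` endpoint queues a spider run. A minimal sketch, where the host, project, and spider names are assumptions:

```python
from urllib.parse import urlencode

# scrapyd's schedule.json endpoint queues a spider run when POSTed a
# form-encoded body. The host below is an assumption for illustration.
SCRAPYD_URL = "http://localhost:6800/schedule.json"

def schedule_body(project, spider):
    """Build the application/x-www-form-urlencoded body schedule.json expects."""
    return urlencode({"project": project, "spider": spider})

# A daily cron job could then POST it, e.g.:
#   urllib.request.urlopen(SCRAPYD_URL,
#                          data=schedule_body("scraper", "products").encode())
```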
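On the Flask side, an Atlas Search query is expressed as a `$search` stage in a MongoDB aggregation pipeline. A sketch of what the ranking query could look like; the index name (`default`) and the `name` field are assumptions about the schema:

```python
# Illustrative $search stage for ranking products by text relevance.
# Index name ("default") and the searched "name" field are assumptions.
def search_pipeline(query):
    return [
        {"$search": {
            "index": "default",
            "text": {"query": query, "path": "name"},
        }},
        {"$limit": 20},
    ]

# Passed to a pymongo collection, e.g.:
#   db.products.aggregate(search_pipeline("laptop"))
```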
To clone and run this application, you'll need Git and Node.js (which comes with npm) installed on your computer, and the development environment set up. From your command line:
# Clone this repository
$ git clone https://github.com/Hassene66/Flask_React_Project
# Go into the repository
$ cd Flask_React_Project
# Install dependencies
$ npm install
# Run the app
$ npm run dev