Available at api.coderx.io
- /SPL
Filter by `set_id`, `labeler`, `package_ndc`, `product_ndc`, `product_name`, `inactive_ingredient_name`, `inactive_ingredient_unii`, or `schedule`
Example filter by schedule: http://api.coderx.io/spl/?schedule=CIII
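Filters are ordinary query-string parameters, so they are easy to script against. A minimal curl sketch follows; stacking a second filter in one request and the example product name are assumptions for illustration, not documented behavior:

```sh
# Filter SPL records by DEA schedule; the second parameter is a
# hypothetical example of combining an additional filter.
curl "http://api.coderx.io/spl/?schedule=CIII&product_name=testosterone"
```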
The following method is intended for internal testing only. It has not been secured for external access.
- Download SPL zip files
`python3 get_zips.py`
Example arguments to download SPL zip 4 and unpack 100 SPL files:
`python3 get_zips.py --select 4 --unzip 100`
For further assistance, run:
`python3 get_zips.py -h`
- Create docker container
Run `docker-compose up -d` to bring up the Django API.
- Optional: load the database
`docker-compose exec -d api sh -c "cd /dailymed-api/scraper/ && scrapy crawl json_extract"`
An alternative command is:
`docker exec -d -it -w /dailymed-api/scraper dailymed-api scrapy crawl json_extract`
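Once the containers are up, a quick smoke test from the host confirms the API is answering. The port below is an assumption; use whatever port the development `docker-compose.yml` actually publishes:

```sh
# Hypothetical smoke test -- assumes the Django API is published
# on localhost:8000 by the development compose file.
curl -s "http://localhost:8000/spl/?schedule=CIII"
```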
The following method is for production deployments using `docker-compose.prod.yml`.
- Update the secret in Django `settings.py` (one way to generate a new value is sketched below)
- Disable debug mode in Django `settings.py`
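For the secret, Django ships a helper that prints a suitable random value; paste its output into `settings.py`. This only assumes Django is importable in the environment where the command runs:

```sh
# Print a fresh value for SECRET_KEY using Django's bundled helper;
# while editing settings.py, also set DEBUG = False.
python3 -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
```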
- Install & configure Nginx to serve the static folder and proxy Gunicorn (a hedged example follows)
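The shape of that Nginx site config is roughly the following. It assumes a Debian-style layout with `sites-available`/`sites-enabled`, and every server name, path, and port is a placeholder to adapt; in particular the Gunicorn bind address and the static root are assumptions, not values taken from this repository:

```sh
# Hypothetical Nginx site config -- all names, paths, and ports
# below are placeholders.
sudo tee /etc/nginx/sites-available/dailymed-api > /dev/null <<'EOF'
server {
    listen 80;
    server_name example.com;

    # Serve the collected static files directly.
    location /static/ {
        alias /opt/dailymed/dailymed-api/static/;
    }

    # Proxy everything else to Gunicorn.
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
sudo ln -s /etc/nginx/sites-available/dailymed-api /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```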
- Download SPL zip files
`sudo -u www-data python3 get_zips.py`
- Create directory
`mkdir /opt/dailymed`
- Change owner
`chown www-data:www-data /opt/dailymed`
- Change directory
`cd /opt/dailymed`
- Clone repo
`sudo -u www-data git clone https://github.com/coderxio/dailymed-api`
An alternative command is:
`git clone https://github.com/coderxio/dailymed-api && chown -R www-data:www-data /opt/dailymed`
- Change directory
`cd dailymed-api`
- Create docker container
`docker-compose -f docker-compose.prod.yml up --build -d`
- Optional: load the database
`docker-compose exec -d api sh -c "cd /dailymed-api/scraper/ && scrapy crawl json_extract"`
An alternative command is:
`docker exec -d -it -w /dailymed-api/scraper dailymed-api scrapy crawl json_extract`
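The crawl runs detached, so it prints nothing in the calling shell. Tailing the container logs is one way to confirm it is working; the service name `api` matches the exec commands above, but the log content itself will vary:

```sh
# Check that the production stack is up, then follow the scraper's logs.
docker-compose -f docker-compose.prod.yml ps
docker-compose -f docker-compose.prod.yml logs -f api
```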