Service for handling backend calls in LITE.
- Clone the repository:

  ```
  git clone https://github.com/uktrade/lite-api.git
  cd lite-api
  ```
- Running the service with Docker is highly recommended.
- If running locally on native hardware (using your local `pipenv`), make sure to change `DATABASE_URL` to use the port exposed by docker-compose, which is `5462` (double-check by viewing the `docker-compose` file), and also change `ELASTICSEARCH_HOST`; see the example below.
- More information is available in the docs at `without_docker.md`.
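For example, a `DATABASE_URL` pointing at the docker-compose database might look like the following; the username, password, and database name are placeholders, so take the real values from your own `.env`:

```
# Placeholder credentials - only the port (5462) comes from docker-compose
DATABASE_URL=postgresql://user:password@localhost:5462/lite-api
```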
- First time setup
  - Set up your local config file:
    - `cp local.env .env` - you will want to set this up with valid values; ask another developer or get them from Vault. If you want to run in Docker, uncomment the appropriate line in `.env` referring to `DATABASE_URL`.
    - In `.env`, also fill in the email field for `INTERNAL_USERS` and `EXPORTER_USERS` with valid values.
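      For illustration, the expected shape mirrors the seeding examples later in this README; the addresses below are placeholders, not real accounts:

      ```
      # Placeholder values - replace with valid email addresses
      INTERNAL_USERS='[{"email"=>"[email protected]"}]'
      EXPORTER_USERS='[{"email"=>"[email protected]"}]'
      ```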
  - HAWK authentication is enabled on the API/exporter/caseworker services by default. If you get issues making any API calls (e.g. missing authentication / key mismatch):
    - Check that the `LITE_INTERNAL_HAWK_KEY` env values match on the API and caseworker
    - Check that the `LITE_EXPORTER_HAWK_KEY` env values match on the API and exporter
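    As a sketch: the keys are shared secrets, so the same value must appear in each service's `.env`. The values below are placeholders:

    ```
    # Placeholder secrets - each pair of services must share the same value
    LITE_INTERNAL_HAWK_KEY=shared-internal-secret   # API + caseworker
    LITE_EXPORTER_HAWK_KEY=shared-exporter-secret   # API + exporter
    ```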
  - Initialise submodules:
    - `git submodule init`
    - `git submodule update`
  - Ensure Docker Desktop (the Docker daemon) is running
  - Build and start Docker images:
    - `docker network create lite` - shared network to allow the API and frontend to communicate
    - `docker-compose build` - build the container image
    - `docker-compose up -d db` - bring up the database so that the migration can succeed
  - Once you have an empty database, you will need to run migrations.
  - Option 1:
    - `docker exec -it api /bin/bash` - open a shell session in the api container
    - `pipenv shell` - activate your pipenv environment
    - `./manage.py migrate` - perform the Django migrations (see the `makefile` for convenience versions of this command); these steps can also be combined into one command, as shown after this list
    - (Known issue: Elasticsearch is enabled by default in the `.env` file, which can show a connection error at the end of the migration script. This can be safely ignored, as the migration succeeding is not dependent on Elasticsearch running. Alternatively, you can temporarily disable Elasticsearch in the `.env` file.)
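    As mentioned above, the three steps can be combined into a single command, following the `docker exec ... pipenv run` pattern used elsewhere in this README:

    ```
    docker exec -it api pipenv run ./manage.py migrate
    ```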
  - Option 2:
    - Run the `make doc-migrate` command, which does all of the above in one step
  - Option 3:
    - Run `docker-compose up` to start the API's Django server
    - Run the `make first-run` command, which will run the migrations, run `seedall`, and populate the database with test data
- Starting the service for the first time
  - `docker-compose up` - start the API's Django server
  - Go to the caseworker home page (e.g. `http://localhost:8200`)
  - If your database is empty (i.e. you just ran the migrations), you might want to seed your database with the static data:
    - `docker exec -it api pipenv run ./manage.py seedall` - running with Docker
    - `pipenv run ./manage.py seedall` - without Docker
  - You'll also want to populate the db with some test values:
    - `docker exec -it api pipenv run ./manage.py create_test_data 10` - running with Docker
    - `pipenv run ./manage.py create_test_data 10` - without Docker
- Starting the service
  - In general you can use `docker-compose up --build` if you want to make sure new changes are included in the build
See `troubleshooting.md` in the docs for a list of current known issues.
- Install pre-commit (see instructions here: https://pre-commit.com/#install)
- Run `pre-commit install` to activate pre-commit locally
- Run the following to scan all files for issues: `pre-commit run --all-files`
- After this initial setup, pre-commit should run automatically whenever you run `git commit` locally.
- All developers must use the pre-commit hooks for the project. This is to make routine tasks easier (e.g. linting, style checking) and to help ensure secrets and personally identifiable information (PII) are not leaked.
- You should be able to use the project python environment to run pre-commit, but if the project python does not work for you, you should find a workaround for your dev environment (e.g. running a different, higher python version just for pre-commit) or raise it with other developers. Ignoring pre-commit errors is not an option.
Run the following commands from your api pipenv shell to initially add users (this adds the users defined in `INTERNAL_USERS` and `EXPORTER_USERS` in `.env`):

```
./manage.py seedrolepermissions
./manage.py seedinternalusers
./manage.py seedexporterusers
```

There is also the `make seed` command, which does the same thing as the above. See the section "Makefile commands" below.
To add subsequent new users:

```
INTERNAL_USERS='[{"email"=>"[email protected]"}]' ./manage.py seedinternalusers
EXPORTER_USERS='[{"email"=>"[email protected]"}]' ./manage.py seedexporterusers
```
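If you want to confirm that the seed commands created your users, a quick check from `./manage.py shell` is sketched below. It assumes the relevant user model is reachable via Django's standard `get_user_model()` helper, so adjust if lite-api wires its user models differently:

```python
# Minimal sketch: list seeded user emails from `./manage.py shell`.
# Assumes Django's get_user_model() points at the relevant user model here.
from django.contrib.auth import get_user_model

User = get_user_model()
print(list(User.objects.values_list("email", flat=True)))
```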
We currently use celery for async tasks and scheduling in LITE:
- celery: a celery container runs by default when using docker-compose. If running a working copy "on the metal" without Docker, run celery with:

  ```
  watchmedo auto-restart -d . -R -p '*.py' -- celery -A api.conf worker -l info
  ```

- celery-scheduler: a celery container also runs by default when using docker-compose; this monitors any scheduled tasks. If running a working copy "on the metal" without Docker, run the scheduler with:

  ```
  watchmedo auto-restart -d . -R -p '*.py' -- celery -A api.conf beat
  ```
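For orientation, below is a minimal sketch of the kind of task those workers execute. The task itself is hypothetical and not part of lite-api, but `shared_task` is celery's standard decorator for tasks discovered by an app such as `api.conf`:

```python
# Hypothetical example task - not part of lite-api itself.
from celery import shared_task


@shared_task
def ping():
    """Trivial task to confirm the worker picks up and runs jobs."""
    return "pong"


# From a Django shell, `ping.delay()` queues the task for the running worker.
```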
To produce PDF documents you will also need to install WeasyPrint. Do this after installing the python packages in the Pipfile:
- MacOS: https://weasyprint.readthedocs.io/en/stable/install.html#macos
- Linux: https://weasyprint.readthedocs.io/en/stable/install.html#debian-ubuntu

To digitally sign documents, `endesive` requires the OS library `swig` to be installed. To install it, run `sudo apt-get install swig`.
A `.p12` file is also required. Please see https://uktrade.atlassian.net/wiki/spaces/ILT/pages/1390870733/PDF+Document+Signing
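For a feel of how `endesive` uses the `.p12` file, here is a minimal signing sketch based on endesive's published examples; the file names, password, and signature metadata are all placeholders, and lite-api's actual signing code may differ:

```python
# Hypothetical signing sketch - paths, password, and metadata are placeholders.
import datetime

from cryptography.hazmat.primitives.serialization import pkcs12
from endesive.pdf import cms

# Load the private key and certificate(s) from the .p12 bundle.
with open("signing.p12", "rb") as fp:
    key, cert, extra_certs = pkcs12.load_key_and_certificates(fp.read(), b"password")

with open("document.pdf", "rb") as fp:
    data = fp.read()

# Signature dictionary: metadata embedded in the signed PDF.
dct = {
    "sigflags": 3,
    "reason": "Example",
    "location": "London",
    "signingdate": datetime.datetime.utcnow().strftime("D:%Y%m%d%H%M%S+00'00'"),
}

signature = cms.sign(data, dct, key, cert, extra_certs, "sha256")

with open("document-signed.pdf", "wb") as fp:
    fp.write(data + signature)
```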
API docs are available on GitHub Pages.
For more docs on the development process and ongoing dev work, check out LITE House Keeping.
Running locally without Docker
ER diagrams can be viewed in docs/entity-relation-diagrams/.
You'll need to install any dev graphviz dependencies (on ubuntu: `sudo apt install libgraphviz-dev`) and then `pygraphviz`.

To regenerate the diagrams:

```
DJANGO_SETTINGS_MODULE=api.conf.settings_dev pipenv run ./manage.py create_er_diagrams
```
To run the tests, with or without coverage:

```
pipenv run pytest
pipenv run pytest --cov=. --cov-report term --cov-config=.coveragerc
```
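To iterate on a single module or test, pytest's standard path and `-k` selectors work through pipenv as well; the path below is hypothetical, so substitute a real test module:

```
# Hypothetical test path - substitute a real module and test name
pipenv run pytest api/path/to/test_module.py -k "some_test_name"
```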
- lite-api - Service for handling backend calls in LITE.
- lite-frontend - The web frontend for LITE.
To run bandit security checks:

```
pipenv run bandit -r .
```
The control list entries are maintained in an Excel sheet in an internal repo. They are arranged in a tree structure, with each category in a different sheet. As policy changes, we need to update the list. This involves adding new entries, decontrolling existing entries, or updating the description of entries.
To add a new entry, simply add the description and rating at the correct level. Mark whether it is decontrolled by entering 'x' in the decontrolled column; usually everything is controlled unless otherwise marked in this column.
If only the description is to be updated, just edit the description text.
Once the changes are done, run the seed command to populate the database with the updated entries and ensure no errors are reported:

```
pipenv run ./manage.py seedcontrollistentries
```
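If you are running in Docker, the same command can be run through the container, following the `docker exec` pattern used for the other seed commands above:

```
docker exec -it api pipenv run ./manage.py seedcontrollistentries
```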
Deploy the API so that it picks up the updated entries in the corresponding environment. Because of the way the seeding commands are executed during deployment, we have to deploy twice to see these changes: deployment runs the seed commands first and then releases the new code, so the first deployment still seeds the existing entries. Deploying a second time seeds the updated entries.
Once the API is deployed, restart the frontends: the control list entries are cached in the frontend, so restarting the app discards the cache and pulls the updated list.
The `makefile` in the top-level directory is where frequently used shell commands can be stored for convenience. For example, running the command:

```
make seed
```

is equivalent to:

```
./manage.py seedrolepermissions
./manage.py seedinternalusers
./manage.py seedexporterusers
```

but is more convenient to use. This is helpful when doing things like tearing down and setting up the database multiple times per day.
The commands should be self-documenting, with names that clearly describe what each command does.
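As an illustration of the pattern, a `seed`-style target might look like the sketch below; this is a hypothetical reconstruction, not the project's actual `makefile`:

```make
# Hypothetical target in the same spirit as `make seed`
seed:
	./manage.py seedrolepermissions
	./manage.py seedinternalusers
	./manage.py seedexporterusers
```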