- Django (Python)
- Gunicorn (WSGI)
- Nginx (web server)
- Supervisor (process control)
- Debian (OS)
- Docker (container)
Common workflow tasks are wrapped up as Fabric commands. Refer to fabfile.py for the current commands, and add commands as required.
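To see the tasks that are currently defined, you can ask Fabric to list them rather than reading fabfile.py directly:
$ fab --list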
This project uses Docker to configure and manage the development environment. By overriding the container's environment variables, the same Docker image can be used for the production service (see the sketch after the environment-file steps below). Thus, for development work, you must have Docker installed and running.
Note that the following covers only local configuration, not deployment. See the IAC README in this repository for more information.
- Copy the sample .secrets.env.sample file and fill in the blanks:
cp .secrets.env.sample .secrets.env
- Copy the sample .env.sample file and fill in the blanks:
cp .env.sample .env
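As a rough illustration of the "same image, different environment" idea above, these two files could be passed to a container at run time; the image name mermaid-api below is hypothetical, and in practice the Fabric tasks handle this for you:
$ docker run --env-file .env --env-file .secrets.env mermaid-api   # image name is hypothetical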
To maintain code quality, this project uses pre-commit to run a series of checks on the code before it is committed. To install pre-commit, run the following:
virtualenv venv
source venv/bin/activate
pip install -r requirements-dev.txt
pre-commit install
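The hooks run automatically on each commit, but you can also run them manually across the whole codebase, for example right after installing or updating them:
$ pre-commit run --all-files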
When updating the local development Python environment, be sure to run pre-commit uninstall followed by pre-commit install.
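For example, after rebuilding the virtualenv:
$ pre-commit uninstall
$ pre-commit install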
Once Docker is installed and local environment variables set, run the following:
$ fab build
$ fab up
If this is the first time running the up command, the api image will be built and the postgis image will be downloaded; then the containers will be started.
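These tasks presumably wrap the underlying Docker tooling; a rough equivalent, assuming a docker-compose setup with api and postgis services (the exact service names are an assumption), would be:
$ docker-compose build   # roughly what fab build does (assumption)
$ docker-compose up -d   # roughly what fab up does (assumption)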
With a database already created and persisted in an S3 bucket via
$ fab dbbackup
running
$ fab dbrestore
will recreate and populate the local database with the latest dump. Without an S3 dump (i.e. when running for the first time), you'll need to create a local database and then run
$ fab migrate
to create its schema (see the sketch below).
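A minimal sketch of that first-time path, assuming the database lives in the postgis container and uses a default postgres superuser and a database named mermaid (all of these names are assumptions; check the project settings for the real values):
$ docker-compose exec postgis createdb -U postgres mermaid   # service, user and db names are assumptions
$ fab migrate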
A shortcut for the above steps, once S3 is set up, is available via:
$ fab freshinstall:[env]
env: local (default), dev, prod
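For example, to do a fresh install against the default local environment or against dev:
$ fab freshinstall:local
$ fab freshinstall:dev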
Once everything is installed, run the following to have the API server running in the background:
$ fab runserver
The project directory api is mounted into the container, so any changes you make outside the container (e.g. using an IDE installed on your host OS) are available inside the container.
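If you want to confirm the bind mount from the host, you can inspect the running container; the container name api below is a guess, so use docker ps to find the actual name first:
$ docker ps                                    # find the running api container
$ docker inspect -f '{{ json .Mounts }}' api   # container name is a guess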
Please note that running fab up does NOT rebuild the image. So if you are making changes to the Dockerfile, for example adding a dependency to the requirements.txt file, you will need to do the following:
$ fab down    # Stops and removes the containers
$ fab build   # Builds a new image
$ fab up
Note: this will delete your local database; all data will be lost.
$ fab dbbackup:<env>
env: local, dev, prod
Restore the database from a named S3 key
$ fab dbrestore:<env>
env: local, dev, prod
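For example, to back up your local database to S3 and later restore from the latest dump:
$ fab dbbackup:local
$ fab dbrestore:local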
The MERMAID API forms the backbone for a growing family of apps that allow for coral reef data collection, management, reporting, and visualization: https://github.com/data-mermaid
Pull requests are welcome! When we move to Python 3, this repo will use Black. Send development questions to [email protected].
- Install the Session Manager Plugin for the AWS CLI locally
- Ensure the AWS mermaid profile is set (~/.aws/config and ~/.aws/credentials)
- Ensure AWS resource IDs are set in the environment:
- export MERMAID_CLUSTER=
- export MERMAID_SERVICE=
- export MERMAID_DBHOST=
- export MERMAID_DBPORT=
$ make cloud_shell
Then, inside the remote shell:
- su webapp
- bash
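For reference, make cloud_shell presumably opens an interactive session on the running service via the Session Manager plugin; one plausible underlying call, shown purely as an illustration with a hypothetical task ID and container name, is:
$ aws --profile mermaid ecs execute-command \
    --cluster "$MERMAID_CLUSTER" \
    --task <task-id> \
    --container api \
    --interactive \
    --command "/bin/bash"   # illustrative only; the make target is the supported entry point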