This guide provides concise instructions for setting up and running the Daisy project using Docker Compose. It includes commands for managing the Django application and other services within the project.
- Docker
- Docker Compose
```shell
git clone https://github.com/elixir-luxembourg/daisy.git
cd daisy
```
Create a `.env` file in the project root to override the default environment variables if necessary; see the `.env.template` file for more detail. Additionally, create `elixir_daisy/settings_local.py` from `elixir_daisy/settings_local.template.py`.
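The local settings file holds instance-specific overrides. As a hypothetical illustration only, it might override standard Django settings such as these; the options the project actually expects are in `elixir_daisy/settings_local.template.py`:

```python
# Hypothetical local overrides -- consult
# elixir_daisy/settings_local.template.py for the real options.
DEBUG = False
ALLOWED_HOSTS = ["localhost"]
SECRET_KEY = "change-me"  # placeholder; generate a real secret key
```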
Build and start all services defined in `docker-compose.yaml`:

```shell
docker compose up -d --build
```
Run database migrations:
```shell
docker compose exec web python manage.py migrate
```
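`migrate` applies any schema migrations that have not yet run and records each one so it is never applied twice; Django keeps this bookkeeping in a `django_migrations` table. A simplified, self-contained illustration of that mechanism (using an in-memory SQLite database and made-up migration names, not the project's actual database):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
# Simplified version of Django's bookkeeping table
conn.execute("""CREATE TABLE django_migrations (
    id INTEGER PRIMARY KEY, app TEXT, name TEXT, applied TEXT)""")

# Hypothetical migrations shipped with the code
all_migrations = [("core", "0001_initial"), ("core", "0002_add_cohort")]

def unapplied(conn, migrations):
    """Return the migrations not yet recorded as applied."""
    done = set(conn.execute("SELECT app, name FROM django_migrations"))
    return [m for m in migrations if m not in done]

for app, name in unapplied(conn, all_migrations):
    # ... the migration's schema changes would run here ...
    conn.execute(
        "INSERT INTO django_migrations (app, name, applied) VALUES (?, ?, ?)",
        (app, name, datetime.now(timezone.utc).isoformat()),
    )

print(unapplied(conn, all_migrations))  # -> [] once everything is applied
```

Running `migrate` again is therefore a safe no-op when the schema is already up to date.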
Build the Solr schema required for full-text search:
```shell
docker compose exec web python manage.py build_solr_schema -c /solr/daisy/conf -r daisy -u default
```
The project uses frontend assets that need to be compiled with npm; build them and then collect the static files:
```shell
docker compose exec web npm --prefix web/static/vendor ci
docker compose exec web npm --prefix web/static/vendor run build
```
From the project root:
```shell
docker compose exec web python manage.py collectstatic --noinput
```
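`collectstatic` gathers static files from each app (including the vendor assets just built by npm) into a single `STATIC_ROOT` directory that the web server can serve directly. A toy illustration of that copy step; the directory names below are invented for the example:

```python
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())

# Hypothetical per-app static directories with one file each
(root / "core/static").mkdir(parents=True)
(root / "web/static/vendor/dist").mkdir(parents=True)
(root / "core/static/app.css").write_text("body {}")
(root / "web/static/vendor/dist/bundle.js").write_text("// built by npm")

# Everything is merged into one directory, like Django's STATIC_ROOT
static_root = root / "staticfiles"
for source in [root / "core/static", root / "web/static/vendor/dist"]:
    shutil.copytree(source, static_root, dirs_exist_ok=True)

print(sorted(p.name for p in static_root.iterdir()))  # ['app.css', 'bundle.js']
```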
Load the initial data, such as controlled vocabularies and the initial list of institutions and cohorts:
```shell
docker compose exec web bash -c "
cd core/fixtures/ && \
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/edda.json && \
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hpo.json && \
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hdo.json && \
wget https://git-r3lab.uni.lu/pinar.alper/metadata-tools/raw/master/metadata_tools/resources/hgnc.json
"
```
```shell
docker compose exec web python manage.py load_initial_data
```
Note: This step can take several minutes to complete.
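A truncated download would otherwise only surface as an error partway through the import, so it can be worth checking that the fetched fixtures parse as JSON first. A small sketch of such a check; `validate_fixtures` is a helper written for this guide, not part of the project:

```python
import json
from pathlib import Path

def validate_fixtures(directory):
    """Return the names of *.json files in `directory` that parse cleanly.

    A truncated or corrupt download raises ValueError here, before
    load_initial_data touches the database.
    """
    ok = []
    for path in sorted(Path(directory).glob("*.json")):
        json.loads(path.read_text())
        ok.append(path.name)
    return ok
```

For example, `validate_fixtures("core/fixtures")` run inside the `web` container would list the four files downloaded above.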
If you are provisioning a demo instance, the following command loads demo data, including mock datasets, projects, and a demo admin account:
```shell
docker compose exec web python manage.py load_demo_data
```
You can log in with the demo admin credentials provided during the demo data setup (username: `admin`, password: `admin` by default) or as one of the regular users (see the About page for more detail).
After loading data, build the search index for Solr:
```shell
docker compose exec web python manage.py rebuild_index --noinput
```
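`rebuild_index` clears the Solr index and repopulates it from the database, which is why it must run after the data is loaded. Conceptually, full-text indexing maps each term to the set of documents containing it; a toy inverted-index sketch with invented documents (this illustrates the idea, not Solr's actual implementation):

```python
from collections import defaultdict

# Invented documents standing in for database records
docs = {1: "human genome cohort", 2: "cohort metadata catalogue"}

# Inverted index: term -> ids of documents containing that term
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

print(sorted(index["cohort"]))  # [1, 2]
```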
The application should now be accessible at https://localhost/.
To ensure that backups are properly set up, please refer to the Backup manual.
To restore from a legacy backup file (e.g., `daisy_prod.tar.gz`):
```shell
# Stop the services that use the database and the search index
docker compose stop nginx flower worker beat web mq solr

# Copy the legacy backup file into the backup container
docker cp ../daisy_prod.tar.gz $(docker compose ps -q backup):/code/daisy_prod.tar.gz

# Execute the legacy restore script inside the backup container
docker compose exec backup sh /code/scripts/legacy_restore.sh /code/daisy_prod.tar.gz

# Remove the backup file from the container
docker compose exec backup rm /code/daisy_prod.tar.gz

# Restart the services
docker compose up -d solr mq web worker beat flower nginx

# Rebuild the Solr index
docker compose exec web python manage.py rebuild_index --noinput
```
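Before copying a legacy backup into the container, it can be worth verifying that the archive is readable. A small sketch that lists a `.tar.gz` archive's members; it builds a throwaway archive with an invented member name so the example is runnable end-to-end, but you would point `list_backup` at your real backup path:

```python
import io
import tarfile
import tempfile
from pathlib import Path

def list_backup(path):
    """Return the member names of a .tar.gz backup, raising if it is corrupt."""
    with tarfile.open(path, "r:gz") as tar:
        return tar.getnames()

# Build a throwaway archive so the sketch is self-contained.
tmp = Path(tempfile.mkdtemp()) / "daisy_prod.tar.gz"
with tarfile.open(tmp, "w:gz") as tar:
    data = b"-- fake dump --"
    info = tarfile.TarInfo("daisy_dump.sql")  # invented member name
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

print(list_backup(tmp))  # ['daisy_dump.sql']
```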