This repository has been archived by the owner on Nov 25, 2019. It is now read-only.

Use redis as Airflow broker #49

Merged · 3 commits · Sep 16, 2019
2 changes: 1 addition & 1 deletion config/airflow.dev.cfg
@@ -379,7 +379,7 @@ worker_log_server_port = 8793
# information.
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#broker-settings
# broker_url = sqla+mysql://airflow:airflow@localhost:3306/airflow
broker_url = sqla+postgresql+psycopg2://postgres:example@db/airflow-db
broker_url = redis://message-bus

# The Celery result_backend. When a job finishes, it needs to update the
# metadata of the job. Therefore it will post a message on a message bus,
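The new `broker_url` points Celery at the `message-bus` service. Redis URLs default to port 6379 and database 0 when those parts are omitted, which is why the one-word form above works. A minimal sketch of that normalisation (the helper is hypothetical, written for illustration; Celery applies equivalent defaults when it parses `broker_url`):

```python
from urllib.parse import urlparse

def normalize_redis_url(url: str) -> str:
    """Fill in Redis broker URL defaults: port 6379, database 0.

    Hypothetical helper for illustration only; not part of this repo.
    """
    parts = urlparse(url)
    if parts.scheme != "redis":
        raise ValueError(f"not a redis URL: {url}")
    port = parts.port or 6379
    db = parts.path.lstrip("/") or "0"
    return f"redis://{parts.hostname}:{port}/{db}"

print(normalize_redis_url("redis://message-bus"))  # redis://message-bus:6379/0
```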
13 changes: 6 additions & 7 deletions docker-compose.yml
@@ -26,6 +26,11 @@ services:
AWS_ACCESS_KEY_ID: longkey
AWS_SECRET_ACCESS_KEY: verysecretkey

message-bus:
Member commented:
I assume this should be private to Celery, since we don't read or write values from the DAG code, for example. Possibilities for a service name:

  • celery-broker (not very precise, since brokers usually have their own protocol such as AMQP; this is just part of the broker concept)
  • celery-broker-backend (matches db, which doesn't mention postgres, and Celery's terminology)
  • celery-broker-redis (to communicate what this is)

Contributor (Author) replied:

name updated to airflow-broker

image: redis:5.0.5-alpine
healthcheck:
test: ["CMD", "nc", "-z", "localhost", "6379"]

db:
image: postgres:11.2
environment:
@@ -34,9 +39,6 @@ services:
POSTGRES_DB: airflow-db
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 30s
timeout: 30s
retries: 3

# only used to tag the base image ready to be uploaded to dockerhub
app:
@@ -58,7 +60,7 @@ services:
- app
- db
environment:
WAIT_HOSTS: db:5432
WAIT_HOSTS: db:5432, message-bus:6379
AIRFLOW_CONN_REMOTE_LOGS: http://s3:9000?host=http://s3:9000
volumes:
- ./dags:/airflow/dags
@@ -72,9 +74,6 @@ services:
- 8080:8080
healthcheck:
test: ["CMD-SHELL", "python ./scripts/airflow_webserver_healthcheck.py"]
interval: 30s
timeout: 30s
retries: 3

airflow_scheduler:
<<: *airflow
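The `message-bus` healthcheck above probes the broker with `nc -z localhost 6379`, i.e. it only checks that the TCP port accepts connections. The same probe can be sketched in Python (a hypothetical helper for illustration, not part of this repo):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds,
    mirroring the `nc -z` probe used in the compose healthcheck."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Note this only verifies the socket is listening; it does not confirm Redis is actually serving commands (a `redis-cli ping` would check that).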
2 changes: 1 addition & 1 deletion requirements/base.txt
@@ -1,3 +1,3 @@
apache-airflow[postgres, s3, celery]==1.10.3
apache-airflow[redis, postgres, s3, celery]==1.10.3
Wand==0.5.4
Flask==1.0.3