When running docker-compose in a Windows environment, you may encounter the following two problems:

The docker-compose file and .env file I changed are shown below.
```yaml
version: '3.2'
networks:
  airflow:
    ipam:
      config:
        - subnet: 172.32.0.0/16
volumes:
  pgdata:
    driver: local
  pglog:
    driver: local
services:
  postgres:
    image: postgres:13.1
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_DB=airflow
      - POSTGRES_PASSWORD=airflow
      - PGDATA=/var/lib/postgresql/data/pgdata
    ports:
      - 5432:5432
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - pgdata:/var/lib/postgresql/data/pgdata
      - pglog:/var/lib/postgresql/data/log
    command: >
      postgres
        -c listen_addresses=*
        -c logging_collector=on
        -c log_destination=stderr
        -c max_connections=200
    networks:
      airflow:
        ipv4_address: 172.32.0.2
  redis:
    image: redis:5.0.5
    environment:
      REDIS_HOST: redis
      REDIS_PORT: 6379
    ports:
      - 6379:6379
    networks:
      airflow:
        ipv4_address: 172.32.0.3
  webserver:
    env_file:
      - .env
    image: apache/airflow:2.0.0-python3.8
    ports:
      - 8080:8080
    volumes:
      - E:\airflow_in_docker_compose\airflow_files\dags:/opt/airflow/dags
      - E:\airflow_in_docker_compose\logs:/opt/airflow/logs
      - E:\airflow_in_docker_compose\files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    deploy:
      restart_policy:
        condition: on-failure
        delay: 5s
        max_attempts: 3
        window: 120s
    depends_on:
      - postgres
      - redis
      - initdb
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      airflow:
        ipv4_address: 172.32.0.4
  flower:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    ports:
      - 5555:5555
    depends_on:
      - redis
    deploy:
      restart_policy:
        condition: on-failure
        delay: 8s
        max_attempts: 3
    volumes:
      - E:\airflow_in_docker_compose\logs:/opt/airflow/logs
    command: celery flower
    networks:
      airflow:
        ipv4_address: 172.32.0.5
  scheduler:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    volumes:
      - E:\airflow_in_docker_compose\airflow_files\dags:/opt/airflow/dags
      - E:\airflow_in_docker_compose\logs:/opt/airflow/logs
      - E:\airflow_in_docker_compose\files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    command: scheduler
    depends_on:
      - initdb
    deploy:
      restart_policy:
        condition: any
        delay: 5s
        window: 120s
    networks:
      airflow:
        ipv4_address: 172.32.0.6
  initdb:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    volumes:
      - E:\airflow_in_docker_compose\airflow_files\dags:/opt/airflow/dags
      - E:\airflow_in_docker_compose\logs:/opt/airflow/logs
      - E:\airflow_in_docker_compose\files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    entrypoint: /bin/bash
    deploy:
      restart_policy:
        condition: on-failure
        delay: 8s
        max_attempts: 5
    command: -c "airflow db init && airflow users create --firstname admin --lastname admin --email admin --password admin --username admin --role Admin"
    depends_on:
      - redis
      - postgres
    networks:
      airflow:
        ipv4_address: 172.32.0.7
  worker:
    image: apache/airflow:2.0.0-python3.8
    env_file:
      - .env
    volumes:
      - E:\airflow_in_docker_compose\airflow_files\dags:/opt/airflow/dags
      - E:\airflow_in_docker_compose\logs:/opt/airflow/logs
      - E:\airflow_in_docker_compose\files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    command: celery worker
    depends_on:
      - scheduler
    deploy:
      restart_policy:
        condition: on-failure
        delay: 8s
        max_attempts: 3
    networks:
      - airflow
```
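As a rough sketch of how I bring the stack up once both files are in place (assuming the compose file above is saved as `docker-compose.yml` in the same directory as the `.env` file):

```bash
# Run the one-shot init container first; it pulls in postgres and redis via depends_on,
# initializes the metadata DB and creates the admin user
docker-compose up initdb

# Then start the long-running services in the background
docker-compose up -d webserver scheduler worker flower

# Check that the containers are up / healthy
docker-compose ps
```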
```
AIRFLOW__CORE__EXECUTOR=CeleryExecutor
AIRFLOW__WEBSERVER__RBAC=False
AIRFLOW__CORE__CHECK_SLAS=False
AIRFLOW__CORE__STORE_SERIALIZED_DAGS=False
AIRFLOW__CORE__PARALLELISM=50
AIRFLOW__CORE__LOAD_EXAMPLES=False
AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS=False
AIRFLOW__SCHEDULER__SCHEDULER_HEARTBEAT_SEC=10
AIRFLOW__CELERY__BROKER_URL=redis://:@172.32.0.3:6379/0
AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@172.32.0.2:5432/airflow
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@172.32.0.2:5432/airflow
AIRFLOW__CORE__FERNET_KEY=P_gYHVxUHul5GNhev_Pde-Kr8qvCeurfSCF9OT7cJQM=
```
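If you don't want to reuse the Fernet key shown above, one way to generate your own value for `AIRFLOW__CORE__FERNET_KEY` (assuming the `cryptography` package is installed locally) is:

```bash
# Prints a fresh Fernet key to paste into the .env file
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
```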