I have an Airflow service running with the CeleryExecutor and 4 workers. I currently have 15 DAGs. Recently I tried to run another DAG and it failed with this error: `Executor reports task instance finished (failed) although the task says its queued. Was the task killed externally?`

My DAG is stuck on the first task and does not continue past it.

PS: This Airflow service is running on a server with 8 GB of memory and 80 GB of disk. Memory usage is around 70%.

What happened?