Apache Airflow version
main (development)
What happened
As discussed in #21348
cc: @ferruzzi @raphaelauv
It seems the docker operator from main is still buggy even after #21175: with do_xcom_push=False it fails with:
[2022-02-05, 22:38:18 UTC] {taskinstance.py:1259} INFO - Executing <Task(DockerOperator): docker_op_tester> on 2022-02-03 07:00:00+00:00
[2022-02-05, 22:38:18 UTC] {standard_task_runner.py:52} INFO - Started process 325 to run task
[2022-02-05, 22:38:18 UTC] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'docker_dag', 'docker_op_tester', 'scheduled__2022-02-03T07:00:00+00:00', '--job-id', '34', '--raw', '--subdir', 'DAGS_FOLDER/docker_dag.py', '--cfg-path', '/tmp/tmpqd592sdh', '--error-file', '/tmp/tmp3e4mxwk4']
[2022-02-05, 22:38:18 UTC] {standard_task_runner.py:77} INFO - Job 34: Subtask docker_op_tester
[2022-02-05, 22:38:18 UTC] {logging_mixin.py:109} INFO - Running <TaskInstance: docker_dag.docker_op_tester scheduled__2022-02-03T07:00:00+00:00 [running]> on host 1b2f3575c860
[2022-02-05, 22:38:18 UTC] {taskinstance.py:1424} INFO - Exporting the following env vars:
[email protected]
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=docker_dag
AIRFLOW_CTX_TASK_ID=docker_op_tester
AIRFLOW_CTX_EXECUTION_DATE=2022-02-03T07:00:00+00:00
AIRFLOW_CTX_DAG_RUN_ID=scheduled__2022-02-03T07:00:00+00:00
[2022-02-05, 22:38:18 UTC] {docker.py:227} INFO - Starting docker container from image centos:latest
[2022-02-05, 22:38:19 UTC] {docker.py:289} INFO - tata
[2022-02-05, 22:38:19 UTC] {xcom.py:333} ERROR - Could not serialize the XCom value into JSON. If you are using pickle instead of JSON for XCom, then you need to enable pickle support for XCom in your airflow config.
[2022-02-05, 22:38:19 UTC] {taskinstance.py:1700} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1329, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1455, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
self.xcom_push(key=XCOM_RETURN_KEY, value=result)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 2135, in xcom_push
XCom.set(
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 100, in set
value = XCom.serialize_value(value)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 331, in serialize_value
return json.dumps(value).encode('UTF-8')
File "/usr/local/lib/python3.9/json/__init__.py", line 231, in dumps
return _default_encoder.encode(obj)
File "/usr/local/lib/python3.9/json/encoder.py", line 199, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/local/lib/python3.9/json/encoder.py", line 257, in iterencode
return _iterencode(o, 0)
File "/usr/local/lib/python3.9/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type CancellableStream is not JSON serializable
[2022-02-05, 22:38:19 UTC] {taskinstance.py:1267} INFO - Marking task as FAILED. dag_id=docker_dag, task_id=docker_op_tester, execution_date=20220203T070000, start_date=20220205T223818, end_date=20220205T223819
[2022-02-05, 22:38:19 UTC] {standard_task_runner.py:89} ERROR - Failed to execute job 34 for task docker_op_tester
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/cli.py", line 92, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 298, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
_run_raw_task(args, ti)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/task_command.py", line 180, in _run_raw_task
ti._run_raw_task(
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1329, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1455, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
self.xcom_push(key=XCOM_RETURN_KEY, value=result)
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 70, in wrapper
return func(*args, session=session, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 2135, in xcom_push
XCom.set(
File "/usr/local/lib/python3.9/site-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 100, in set
value = XCom.serialize_value(value)
File "/usr/local/lib/python3.9/site-packages/airflow/models/xcom.py", line 331, in serialize_value
return json.dumps(value).encode('UTF-8')
File "/usr/local/lib/python3.9/json/__init__.py", line 231, in dumps
return _default_encoder.encode(obj)
File "/usr/local/lib/python3.9/json/encoder.py", line 199, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/local/lib/python3.9/json/encoder.py", line 257, in iterencode
return _iterencode(o, 0)
File "/usr/local/lib/python3.9/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type CancellableStream is not JSON serializable
[2022-02-05, 22:38:19 UTC] {local_task_job.py:154} INFO - Task exited with return code 1
[2022-02-05, 22:38:19 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
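For illustration, the default XCom backend serializes return values with json.dumps, which rejects the stream object the operator hands back regardless of what the container printed. A minimal standalone sketch of the same failure (the CancellableStream class below is a stand-in, not the real docker-py type):

```python
import json


class CancellableStream:
    """Stand-in for docker's streaming response wrapper (not the real class)."""


value = CancellableStream()  # what the operator ends up handing to XCom

try:
    # Mirrors XCom.serialize_value: json.dumps(value).encode('UTF-8')
    json.dumps(value).encode("UTF-8")
except TypeError as err:
    print(err)  # Object of type CancellableStream is not JSON serializable
```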
What you expected to happen
Docker Operator does not fail :)
How to reproduce
See the issue description - it is all there :)
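For reference, a minimal DAG sketch reconstructed from the log above (the dag_id, task_id, image and the "tata" output match the log; the start date, schedule and exact container command are assumptions):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="docker_dag",
    start_date=datetime(2022, 2, 1),  # assumed start date
    schedule_interval="@daily",       # assumed schedule
    catchup=True,
) as dag:
    DockerOperator(
        task_id="docker_op_tester",
        image="centos:latest",
        command='echo "tata"',    # assumed command; "tata" matches the log output
        do_xcom_push=False,       # the task fails even with XCom push disabled
    )
```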
Operating System
Any
Versions of Apache Airflow Providers
Not yet released: the docker provider from main.
Deployment
Other
Deployment details
No details
Anything else
We cancelled the providers' release and would like to release an RC2 wave (likely tomorrow). It would be great to fix this before then.
Are you willing to submit PR?
Code of Conduct