One of my teammates is facing the following issue. Has anyone come across something similar before?
Airflow ships its log files to Cleversafe S3 buckets using the default path template provided by Airflow ({dag_id}/{task_id}/{execution_date}/{try_number}.log).
dag_id, task_id, execution_date, and try_number are all dynamic and change at runtime.
So the log paths come out looking like this:
s3://XXXX/airflow/logs/XXX_YYY//2020-04-26T16:01:00+00:00/1.log
When trying to read the logs via the Logstash S3 input, Logstash cannot find this location: execution_date contains a + sign, which Logstash decodes as a space, so the resulting key does not match any object and the log files are never read.
Any ideas or solutions to overcome this issue would be appreciated. Thanks.
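For reference, one workaround we are considering is overriding log_filename_template in airflow.cfg so the rendered timestamp contains no + sign. Below is a minimal sketch, assuming Airflow 1.10.x, where a Jinja-templated value is rendered with the full task context and therefore has access to the ts_nodash macro (e.g. 20200426T160100):

[core]
# Hypothetical override: ts_nodash contains no '+', ':' or '-' characters,
# so the resulting S3 key survives URL decoding unchanged.
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts_nodash }}/{{ try_number }}.log

With this template, the example path above would render as something like s3://XXXX/airflow/logs/XXX_YYY/20200426T160100/1.log, which the Logstash S3 input should be able to resolve without the + sign being turned into a space. Note that changing the template only affects new task runs; existing log files keep their old keys.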