Describe the bug
When any dbt run/test/snapshot command is executed using Google Cloud Composer 2 (Airflow managed by Google), with the jobs running in a GKE pod, the logs displayed in the Log Explorer are shown in the wrong format.
Steps To Reproduce
Set up a DAG in Airflow that spins up a pod on a GKE cluster with the image ghcr.io/dbt-labs/dbt-snowflake:1.2.latest and dbt-utils==0.8.1, and runs any dbt run/test/snapshot command (see the sketch below).
Check the logs in the Airflow UI.
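A minimal sketch of such a DAG, assuming an existing GKE cluster and the apache-airflow-providers-google package; the project, location, cluster name, and profiles directory are placeholders, not values from the original report:

```python
# Reproduction sketch (assumptions: an existing GKE cluster and the
# apache-airflow-providers-google package; PROJECT_ID, CLUSTER_NAME,
# the region, and the profiles directory are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.kubernetes_engine import (
    GKEStartPodOperator,
)

with DAG(
    dag_id="dbt_run_on_gke",
    start_date=datetime(2022, 9, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Runs `dbt run` inside a pod on the GKE cluster using the dbt-snowflake image.
    dbt_run = GKEStartPodOperator(
        task_id="dbt_run",
        project_id="PROJECT_ID",
        location="europe-west1",
        cluster_name="CLUSTER_NAME",
        namespace="default",
        name="dbt-run",
        image="ghcr.io/dbt-labs/dbt-snowflake:1.2.latest",
        cmds=["dbt"],
        arguments=["run", "--profiles-dir", "/dbt"],
        get_logs=True,
    )
```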
Expected behavior
I would expect something like this:
Screenshots and log output
Instead I have something like this:
Note the stray 0m and 32m at the dbt timestamps and the success 11972: these look like ANSI color escape codes leaking into the log output.
System information
The output of dbt --version:
Core:
- installed: 1.2.0
- latest: 1.2.1 - Update available!
Your version of dbt-core is out of date!
You can find instructions for upgrading here:
https://docs.getdbt.com/docs/installation
Plugins:
- snowflake: 1.2.0 - Up to date!
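The operating system you're using:
The output of python --version: Python 3.10.5
Additional context
When I run it locally using the Docker image, it shows the output in the console correctly.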
github-actions bot changed the title "Logs from jobs running using Google Cloud Composer in a GKE pod have a wrong format." to "[CT-1119] Logs from jobs running using Google Cloud Composer in a GKE pod have a wrong format." on Sep 5, 2022
Yup, that solved it. Thanks. However, it's not possible to apply the config to one specific target (e.g. prod) and leave it enabled in another (e.g. dev), right?
That would be an interesting feature to have.
@rloredo Today, you can enable/disable global configs in three ways:
CLI flag, --no-use-colors, takes top precedence
Env var, DBT_USE_COLORS=0
"user config" in profiles.yml (though cannot be set for different profiles/targets defined within the same profiles.yml)
So while you can't set the "use colors" behavior right within your dev/prod target definitions, it should be simple enough to change the setting for different invocations / environments.
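For example, one way to keep colored output in dev but disable it for the prod invocation is to set the env var only on the production task. This is a sketch building on the reproduction DAG above, not something from the thread; the operator values are placeholders, and passing env_vars as a list of V1EnvVar assumes a reasonably recent cncf-kubernetes provider:

```python
# Sketch: disable colored dbt output only for the prod invocation by injecting
# DBT_USE_COLORS=0 into that pod's environment; the dev task simply omits it.
# Add this task inside the same `with DAG(...)` block as the sketch above.
# (Placeholders throughout; equivalently, the CLI flag could be used instead:
# cmds=["dbt", "--no-use-colors"].)
from kubernetes.client import models as k8s

from airflow.providers.google.cloud.operators.kubernetes_engine import (
    GKEStartPodOperator,
)

dbt_run_prod = GKEStartPodOperator(
    task_id="dbt_run_prod",
    project_id="PROJECT_ID",
    location="europe-west1",
    cluster_name="CLUSTER_NAME",
    namespace="default",
    name="dbt-run-prod",
    image="ghcr.io/dbt-labs/dbt-snowflake:1.2.latest",
    cmds=["dbt"],
    arguments=["run", "--target", "prod"],
    env_vars=[k8s.V1EnvVar(name="DBT_USE_COLORS", value="0")],
    get_logs=True,
)
```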