Logging with google-cloud-logging-logback on K8S missing some resource.labels #5765
Comments
@jessewebb I tested on K8s and I am able to see resource.labels.
Hmmm, it's interesting that you get those labels but I don't. This problem is happening for several of our micro-services (8+), so I am curious what might be different between our setups. We have Akka-based apps, with these dependencies in our build file: …
Here is one of our …
Maybe this isn't a problem with the library, since you did not get the same problem, but it does seem to affect other users too. As mentioned earlier, another user has posted elsewhere that they are experiencing the same problem.
I have the same issue. It seems that this is related to the google-cloud-logging dependencies: if I deploy my app with google-cloud-logging, my containers are missing some information when viewed in Stackdriver Kubernetes monitoring (the container cannot be associated with a pod). If I remove the google-cloud-logging dependencies, the container can be associated with a pod again. I've used the following dependency version: …
I tested this again today, and it seems that my resource labels are being correctly set now.
We tried upgrading to the latest version of the lib (…). We decided to just switch our apps to use JSON logging instead to work around the problem. To do this, we replaced the Google Cloud Logging dependency in our build file: …
And we also updated our logback config: …
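Roughly, the stdout JSON setup looks like the sketch below. This assumes the net.logstash.logback:logstash-logback-encoder library; it is an illustrative config, not necessarily the exact encoder or settings we ended up with.

```xml
<!-- Illustrative sketch: emit each log event as one JSON line on stdout.
     Assumes net.logstash.logback:logstash-logback-encoder is on the classpath;
     the exact encoder and settings used in our apps may differ. -->
<configuration>
  <appender name="JSON_STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <!-- Serializes each log event as a single JSON object -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="JSON_STDOUT"/>
  </root>
</configuration>
```

One thing to watch with this approach: the GKE logging agent reads the entry severity from a "severity" field in the JSON payload, so the encoder may need to be configured to emit the log level under that name.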
Now the Log Viewer seems to show all of the correct labels for us.
I also added a comment today on closed issue #2912, but I figure it's better to use this issue, which is still open. We are using the Spring Boot starter …
Tested to use …
Obviously the …
Thanks for the report. I was able to reproduce this issue as well. The client library sources all of these fields from one of two places. The first is the metadata server, which, as @ohlen1 points out, does not have the information we need. The second is environment variables, which are not set on the pods by default. At this time I don't think it is possible for the client library to source this information. The appropriate next step would be to open a feature request against GKE via the Google Cloud Issue Tracker. We would need them to expose these bits of information to the pods by default, either via the metadata server or via environment variables. Once this is done, we can open another issue to properly source the fields in the client lib. @jessewebb, could I ask you to raise that issue if it is not too much trouble?
As I said above, this is not an issue the client libraries can currently solve. I have opened a feature request to GKE. If that feature is implemented, we can work on exposing these values. The issue is now being tracked via: https://issuetracker.google.com/145009094
Environment: GKE
Java Library: google-cloud-logging-logback
Version: 0.97.0-alpha
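For reference, in Maven form this dependency would be declared roughly as follows (the build tool and declaration style are assumptions; only the artifact name and version come from this report):

```xml
<!-- Illustrative Maven coordinates for the library named above; the build tool
     and exact declaration used by our services may differ. -->
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-logging-logback</artifactId>
  <version>0.97.0-alpha</version>
</dependency>
```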
I am running some Java/Scala micro-services in GKE. Initially, we just logged to stdout and our logs appeared in Log Viewer automatically. Unfortunately, the logs were not showing proper severities and multi-line log entries were appearing as multiple entries.
Using the instructions on this doc page, we changed our apps to use the google-cloud-logging-logback lib to send logs to Stackdriver Logging instead of stdout. This fixed our severity and multi-line issues.
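The relevant logback wiring is essentially the Cloud Logging appender from that doc page; a minimal sketch looks roughly like this (the log name and flush level below are example values, not necessarily what our services use):

```xml
<!-- Minimal sketch of the google-cloud-logging-logback appender.
     The log name and flush level are example values. -->
<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- Name of the log in Stackdriver Logging -->
    <log>application.log</log>
    <!-- Buffered entries are flushed at or above this level -->
    <flushLevel>WARN</flushLevel>
  </appender>
  <root level="INFO">
    <appender-ref ref="CLOUD"/>
  </root>
</configuration>
```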
But now we have noticed that the logs created using the Stackdriver Logging logback lib don't include all of the correct resource.labels that exist when using stdout logging.

A stdout log entry looks like this:
But a Stackdriver Logging entry looks like this:
This is causing our logs to not show up when viewing "Container logs" from a GKE workload. When you are viewing the details of a GKE workload, and you click the link for "Container logs", it sends you to Log Viewer with the following filter:
This default filter doesn't show our logs because the entries are missing the labels it filters on.
I found an older, closed issue, #2912, which was created because all resource.labels were at one point empty. During work on that issue last year, some of the fields were changed to be populated, but not all. It seems resource.labels.container_name and resource.labels.namespace_id are still empty. Another user (@JoaoPPinto) even commented on that issue after it was closed, mentioning that they are also facing the same problem that I am.

Please update this lib to include the resource.labels.container_name and resource.labels.namespace_id metadata on the logs so that the "Container logs" filter works properly.