[docker_stats] - Could not parse docker containerStats for container id #34194

Open
julianocosta89 opened this issue Jul 22, 2024 · 3 comments

@julianocosta89
Member

Component(s)

receiver/dockerstats

What happened?

Description

When running the OTel Demo, we started getting the following errors (also reported in open-telemetry/opentelemetry-demo#1677):

2024-07-22T12:11:15.647Z	error	[email protected]/docker.go:194	Could not parse docker containerStats for container id	{"kind": "receiver", "name": "docker_stats", "data_type": "metrics", "id": "fcd69fe4a34e8b15153005bc42fda1e7e41c6f743cec83b8171a06a432561452", "error": "context canceled"}
github.com/open-telemetry/opentelemetry-collector-contrib/internal/docker.(*Client).toStatsJSON
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/docker.go:194
github.com/open-telemetry/opentelemetry-collector-contrib/internal/docker.(*Client).FetchContainerStatsAsJSON
	github.com/open-telemetry/opentelemetry-collector-contrib/internal/[email protected]/docker.go:144
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/dockerstatsreceiver.(*metricsReceiver).scrapeV2.func1
	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/[email protected]/receiver.go:92
2024-07-22T12:11:15.663Z	error	scraperhelper/scrapercontroller.go:197	Error scraping metrics	{"kind": "receiver", "name": "docker_stats", "data_type": "metrics", "error": "context canceled", "scraper": "docker_stats"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
	go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:197
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
	go.opentelemetry.io/collector/[email protected]/scraperhelper/scrapercontroller.go:169
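
To make the failure mode concrete, here is a rough, self-contained sketch (not the receiver's actual code) of how a per-container stats fetch whose context is canceled mid-flight ends up being logged as a parse failure. It assumes the Docker Go SDK (github.com/docker/docker/client) and a reachable daemon, and uses the container ID from the log above purely as an illustration:

```go
// Rough sketch (not the collector's code): show how canceling the request
// context while a container-stats call is in flight surfaces as the
// "Could not parse docker containerStats" error path.
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"time"

	"github.com/docker/docker/client"
)

func main() {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		panic(err)
	}
	defer cli.Close()

	// The dockerstats scraper runs each per-container fetch under the scrape
	// context; if that context is canceled (e.g. scrape teardown or collector
	// shutdown), the in-flight stats request or its JSON decode fails with
	// "context canceled".
	ctx, cancel := context.WithCancel(context.Background())
	go func() {
		time.Sleep(10 * time.Millisecond) // simulate the scrape being canceled mid-fetch
		cancel()
	}()

	// Container ID taken from the log above, only for illustration.
	containerID := "fcd69fe4a34e8b15153005bc42fda1e7e41c6f743cec83b8171a06a432561452"

	resp, err := cli.ContainerStats(ctx, containerID, false) // one-shot (non-streaming) stats
	if err != nil {
		fmt.Println("stats request failed:", err) // often already "context canceled" here
		return
	}
	defer resp.Body.Close()

	// The receiver decodes the body into a typed stats struct; a plain map is
	// enough to show the same error path.
	var stats map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&stats); err != nil {
		// This is the failure that docker.go logs as
		// "Could not parse docker containerStats for container id".
		fmt.Println("could not parse container stats:", err)
	}
}
```

Read this way, the "Could not parse docker containerStats" wording appears to be a side effect of the scrape context being torn down (the wrapped error is "context canceled") rather than of malformed JSON coming back from the daemon.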

Steps to Reproduce

git clone git@github.com:open-telemetry/opentelemetry-demo.git

cd opentelemetry-demo/

docker compose up -d

Expected Result

No errors in the OTel Collector logs.

Actual Result

The errors shown above appear in the Collector logs.

Collector version

Tested with 0.102.1, 0.104.0 and 0.105.0

Environment information

Environment

System Version: macOS 14.5 (23F79)
Kernel Version: Darwin 23.5.0

❯ docker info
Client:
 Version:    24.0.7
 Context:    desktop-linux
Server:
 Runtimes: runc io.containerd.runc.v2
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: d8f198a4ed8892c764191ef7b3b06d8a2eeb5c7f
 runc version: v1.1.10-0-g18a0cb0
 init version: de40ad0
 Security Options:
  seccomp
   Profile: unconfined
  cgroupns
 Kernel Version: 6.5.11-linuxkit
 Operating System: Docker Desktop
 OSType: linux
 Architecture: aarch64

OpenTelemetry Collector configuration

No response

Log output

No response

Additional context

No response

julianocosta89 added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Jul 22, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

mx-psi changed the title from "[doscker_stats] - Could not parse docker containerStats for container id" to "[docker_stats] - Could not parse docker containerStats for container id" on Jul 23, 2024
@julianocosta89
Member Author

Just adding more context here: I've tried setting api_version and increasing the timeout, but neither changed the behaviour.


This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.
