Docker Stats Receiver is not enabled #957

Closed
pavankrish123 opened this issue Sep 4, 2020 · 2 comments · Fixed by #958
Labels
bug Something isn't working

Comments

@pavankrish123
Contributor

pavankrish123 commented Sep 4, 2020

Describe the bug
This is related to PR #495. It looks like the docker stats receiver is not enabled in cmd/otelcontribcol/components.go. I am not sure whether this was intentional or whether the logic below was simply missed. Please enable the receiver.

diff --git a/cmd/otelcontribcol/components.go b/cmd/otelcontribcol/components.go
index 6f2a3109..81cc726f 100644
--- a/cmd/otelcontribcol/components.go
+++ b/cmd/otelcontribcol/components.go
@@ -52,6 +52,8 @@ import (
        "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/splunkhecreceiver"
        "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/statsdreceiver"
        "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/wavefrontreceiver"
+       "github.com/open-telemetry/opentelemetry-collector-contrib/receiver/dockerstatsreceiver"
+
 )
 
 func components() (component.Factories, error) {
@@ -90,6 +92,7 @@ func components() (component.Factories, error) {
                statsdreceiver.NewFactory(),
                awsxrayreceiver.NewFactory(),
                splunkhecreceiver.NewFactory(),
+               dockerstatsreceiver.NewFactory(),
        }
        for _, rcv := range factories.Receivers {
                receivers = append(receivers, rcv)
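
As a sanity check after applying the diff, a small test next to components.go could confirm the factory is registered. This is only a sketch; the shape of factories.Receivers (a map keyed by the receiver type name) is an assumption based on how components.go already uses it.

package main

import "testing"

// Rough sketch: verify the docker_stats receiver factory is wired into the
// default components. Assumes factories.Receivers is keyed by the type name.
func TestDockerStatsReceiverRegistered(t *testing.T) {
    factories, err := components()
    if err != nil {
        t.Fatalf("components() returned an error: %v", err)
    }
    if _, ok := factories.Receivers["docker_stats"]; !ok {
        t.Error("docker_stats receiver factory is not registered")
    }
}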

Steps to reproduce

A build from the latest commit, when started with the config below via
./otelcontribcol --config otel-config.yml

throws the following error

otel-collector_1  | {"level":"info","ts":1599247061.6296563,"caller":"service/telemetry.go:106","msg":"Serving Prometheus metrics","address":":8888","legacy_metrics":false,"new_metrics":true,"level":3,"service.instance.id":"33fd869b-e9cc-4ae7-8bfc-9b1272906c1f"}
otel-collector_1  | {"level":"info","ts":1599247061.6300523,"caller":"service/service.go:277","msg":"Loading configuration..."}
otel-collector_1  | Error: cannot load configuration: unknown receivers type "docker_stats" for docker_stats
otel-collector_1  | 2020/09/04 19:17:41 application run finished with error: cannot load configuration: unknown receivers type "docker_stats" for docker_stats

What did you expect to see?

The OTel Collector comes up fine and Docker stats metrics show up as expected.

What did you see instead?
The OTel Collector stops because it does not recognize the docker_stats receiver.

What version did you use?
Version: (commit: c1463db)

What config did you use?

receivers:
  docker_stats:
    endpoint:
    collection_interval: 60s
    provide_per_core_cpu_metrics: true

exporters:
  file:
    path: ./filename.json

service:
  pipelines:
    metrics:
      receivers: [docker_stats]
      exporters: [file]
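
For completeness, the collector is run via docker-compose, roughly along these lines (a sketch only: the service and image names are placeholders, and the Docker socket is mounted because with an empty endpoint the receiver is expected to fall back to the local unix socket):

version: "3"
services:
  otel-collector:
    # Placeholder image name for a locally built otelcontribcol.
    image: otelcontribcol:latest
    command: ["--config", "/etc/otel-config.yml"]
    volumes:
      - ./otel-config.yml:/etc/otel-config.yml
      # The docker_stats receiver reads container stats from the Docker daemon.
      - /var/run/docker.sock:/var/run/docker.sock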

Environment
Compiler (if manually compiled): ran make docker-otelcontribcol

Additional context
I posted the diff to components.go above that makes the collector work. Happy to submit a PR, but I wanted to check whether the docker_stats receiver was left out on purpose.

PS: Great addition to the list of receivers!

pavankrish123 added the bug label on Sep 4, 2020
@rmfitzpatrick
Contributor

@pavankrish123 thank you for your report. The docker stats receiver is being delivered in stages, and a PR with the final dynamic functionality and component addition will be opened shortly. I apologize for not adding a caveat to its README explaining that it is still in active development; at the moment only the changelog notes this: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/master/CHANGELOG.md#unreleased

@pavankrish123
Contributor Author

Thanks @rmfitzpatrick - I am closing this issue.
