
[exporter/awsemf] Exporter ignores the first batch of a metric sent to CloudWatch #16218

Closed
mircohacker opened this issue Nov 9, 2022 · 4 comments
Labels
bug Something isn't working exporter/awsemf awsemf exporter priority:p2 Medium

Comments

@mircohacker
Contributor

Component(s)

exporter/awsemf

What happened?

Description

The awsemf exporter in the ADOT distribution of the OpenTelemetry Collector sends faulty data for the first batch of a metric sent to AWS CloudWatch.

Steps to reproduce

  1. The reproduction code is checked in at https://github.com/mircohaug/otel-awsemf-reproduction-ignored-first-batch
  2. Authenticate your shell against an AWS account with the default profile (or change the value of the AWS_PROFILE env var in step 4).
  3. Create the log group emfbug-reproduction-embedded-metrics-otel by running aws logs create-log-group --log-group-name emfbug-reproduction-embedded-metrics-otel
  4. Start the OTel agent by running
    docker run -d --rm -p 4317:4317 \
      -e AWS_REGION=eu-central-1 \
      -e AWS_PROFILE=default \
      -v ~/.aws:/root/.aws \
      -v "$(pwd)/otel-agent-config.yaml":/otel-local-config.yaml \
      --name awscollector \
      public.ecr.aws/aws-observability/aws-otel-collector:latest \
      --config otel-local-config.yaml
  5. Run yarn install
  6. Create metrics by running yarn start
    1. The code creates a new OTel counter and adds 1 to it four times in total. These four increments are split into two batches of two increments each, with a wait between the batches to allow the OTel agent to flush the values to AWS.
  7. Run this Logs Insights query (fields counter_name, @timestamp on the log group emfbug-reproduction-embedded-metrics-otel) to see the published EMF metrics.
    1. Compare the actual and expected results below.
  8. Clean up by running docker rm -f awscollector and aws logs delete-log-group --log-group-name emfbug-reproduction-embedded-metrics-otel
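The increment pattern from step 6 can be sketched as follows. This is a self-contained stand-in, not the actual repro code: the FakeCounter class is hypothetical and only mimics how four increments split into two exported batches of two.

```typescript
// Hypothetical stand-in for an OTel counter: it records increments and
// "flushes" them as per-batch sums, mimicking the repro's export cadence.
class FakeCounter {
  private pending = 0;
  private batches: number[] = [];

  add(n: number): void {
    this.pending += n;
  }

  // Simulates the wait that lets the agent export the accumulated increments.
  flush(): void {
    this.batches.push(this.pending);
    this.pending = 0;
  }

  exported(): number[] {
    return [...this.batches];
  }
}

const counter = new FakeCounter();
counter.add(1); counter.add(1); // first batch: two increments
counter.flush();                // wait; agent exports the first batch
counter.add(1); counter.add(1); // second batch: two increments
counter.flush();                // agent exports the second batch

console.log(counter.exported()); // [2, 2] — what we expect to see in CloudWatch
```

The bug is that only the second of these two batches shows up in CloudWatch with its value intact.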

Expected Result

We expect the log group to contain two entries, each with a value of 2: one entry per batch.

Actual result

We only get one entry with a value of 2. The first batch gets its value set to 0.

Additional information

  • The behaviour persists over multiple runs of the script.
  • It also happens for each new combination of counter name and attributes.
  • To rule out a faulty implementation in the OTel framework, we also added a ConsoleMetricExporter alongside the exporter that sends the metrics to the OTel agent. This exporter prints the correct values to the console.
  • In addition, we added a file exporter to the OTel agent pipeline. It also shows the correct values.
  • With small waits between the batches, all four increments end up in the same batch and we see no value in CloudWatch whatsoever.
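One explanation consistent with these observations (and with the fix that later landed, see below): the exporter keeps per-metric delta state, and the first datapoint it sees for a metric key only seeds that state, so its value is dropped. The sketch below is a hypothetical model of that bookkeeping; the real exporter is written in Go and the names here are illustrative.

```typescript
// Hypothetical model of the exporter's delta bookkeeping. The first
// datapoint for a metric key only seeds the baseline, so its value is
// lost unless retainInitial is set.
type DeltaState = Map<string, number>;

function computeDelta(
  state: DeltaState,
  key: string,
  cumulative: number,
  retainInitial: boolean,
): number | null {
  const prev = state.get(key);
  state.set(key, cumulative);
  if (prev === undefined) {
    // First observation: used as the baseline; emitted only when retained.
    return retainInitial ? cumulative : null;
  }
  return cumulative - prev;
}

const state: DeltaState = new Map();
console.log(computeDelta(state, "my_counter", 2, false)); // first batch: null (dropped)
console.log(computeDelta(state, "my_counter", 4, false)); // second batch: 2
```

This would also explain the last bullet: when all four increments land in one batch, there is only a single datapoint, it becomes the baseline, and nothing is emitted at all.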

Collector version

v0.23.0

Environment information

Environment

OS: OSX

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
exporters:
  awsemf:
    resource_to_telemetry_conversion:
      enabled: true
    log_group_name: "emfbug-reproduction-embedded-metrics-otel"
    log_stream_name: "otel-stream"
    namespace: "your-metric-namespace"
service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [awsemf]

Log output

2022/11/09 16:02:03 ADOT Collector version: v0.23.0
2022/11/09 16:02:03 found no extra config, skip it, err: open /opt/aws/aws-otel-collector/etc/extracfg.txt: no such file or directory
2022-11-09T16:02:03.688Z        info    service/telemetry.go:121        Setting up own telemetry...
2022-11-09T16:02:03.698Z        info    service/telemetry.go:143        Serving Prometheus metrics      {"address": ":8888", "level": "basic"}
2022-11-09T16:02:03.720Z        info    service/service.go:90   Starting aws-otel-collector...  {"Version": "v0.23.0", "NumCPU": 6}
2022-11-09T16:02:03.721Z        info    extensions/extensions.go:42     Starting extensions...
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:74       Starting exporters...
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:78       Exporter is starting... {"kind": "exporter", "data_type": "metrics", "name": "awsemf"}
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:82       Exporter started.       {"kind": "exporter", "data_type": "metrics", "name": "awsemf"}
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:86       Starting processors...
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:98       Starting receivers...
2022-11-09T16:02:03.721Z        info    pipelines/pipelines.go:102      Receiver is starting... {"kind": "receiver", "name": "otlp", "pipeline": "metrics"}
2022-11-09T16:02:03.722Z        info    otlpreceiver/otlp.go:71 Starting GRPC server    {"kind": "receiver", "name": "otlp", "pipeline": "metrics", "endpoint": "0.0.0.0:4317"}
2022-11-09T16:02:03.722Z        info    pipelines/pipelines.go:106      Receiver started.       {"kind": "receiver", "name": "otlp", "pipeline": "metrics"}
2022-11-09T16:02:03.722Z        info    service/service.go:107  Everything is ready. Begin running and processing data.
2022-11-09T16:02:16.408Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.409Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.474Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.576Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.685Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.776Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.843Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 62}
2022-11-09T16:02:16.876Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.977Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:16.981Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.037Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 54}
2022-11-09T16:02:17.078Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.179Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.183Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.239Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 55}
2022-11-09T16:02:17.279Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.379Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.384Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.438Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 53}
2022-11-09T16:02:17.481Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.582Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.586Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.641Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 53}
2022-11-09T16:02:17.682Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.783Z        info    [email protected]/emf_exporter.go:133      Start processing resource metrics       {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.789Z        info    [email protected]/emf_exporter.go:184      Finish processing resource metrics      {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "labels": {"service.name":"backend"}}
2022-11-09T16:02:17.840Z        info    [email protected]/pusher.go:298    logpusher: publish log events successfully.     {"kind": "exporter", "data_type": "metrics", "name": "awsemf", "NumOfLogEvents": 1, "LogEventsSize": 0.345703125, "Time": 50}
<repeats further>

@mircohacker mircohacker added bug Something isn't working needs triage New item requiring triage labels Nov 9, 2022
@github-actions github-actions bot added the exporter/awsemf awsemf exporter label Nov 9, 2022
@github-actions
Contributor

github-actions bot commented Nov 9, 2022

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions
Contributor

github-actions bot commented Jan 9, 2023

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Jan 9, 2023
@mircohacker
Contributor Author

I got some pointers in the downstream issue aws-observability/aws-otel-collector#1653 and will create a PR to make this behaviour configurable.

@Aneurysm9 Aneurysm9 added priority:p2 Medium and removed Stale needs triage New item requiring triage labels Jan 23, 2023
mircohacker added a commit to mircohacker/opentelemetry-collector-contrib that referenced this issue Jan 24, 2023
mircohacker added a commit to mircohacker/opentelemetry-collector-contrib that referenced this issue Feb 24, 2023
mircohacker added a commit to mircohacker/opentelemetry-collector-contrib that referenced this issue Mar 1, 2023
codeboten pushed a commit that referenced this issue Mar 1, 2023
@mircohacker
Contributor Author

PR #17988 was merged. Starting with the next release, the emf subsection supports the configuration flag retain_initial_value_of_delta_metric. If this flag is set to true, the initial value of a delta metric is no longer dropped. cc @jan-xyz
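Applied to the collector configuration from this issue, enabling the flag would look like this (a sketch; the option's placement inside the awsemf exporter block is assumed from the flag name given above):

```yaml
exporters:
  awsemf:
    resource_to_telemetry_conversion:
      enabled: true
    # Assumed placement: emit the first observed value of a delta metric
    # instead of using it only to seed the delta baseline.
    retain_initial_value_of_delta_metric: true
    log_group_name: "emfbug-reproduction-embedded-metrics-otel"
    log_stream_name: "otel-stream"
    namespace: "your-metric-namespace"
```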
