
awsemf exporter won't create any log group, nor generating any metrics #849

Open

stevemao opened this issue Feb 3, 2024 · 7 comments

stevemao commented Feb 3, 2024


Describe the bug

I was able to patch the instrumentation and generate metrics with the logging exporter, and I can see the metrics in the CloudWatch console. Now I have changed the exporter to awsemf, but I do not see the new log group being created.

Steps to reproduce

Change the default exporter to awsemf

What did you expect to see?

A new log group for the metrics should be created and populated with EMF-formatted logs, and the metrics should then show up in CloudWatch.

What did you see instead?

No new log group is created

What version of collector/language SDK version did you use?

latest

What language layer did you use?

nodejs

Additional context

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: "localhost:4317"
      http:
        endpoint: "localhost:4318"

exporters:
  logging:
  awsxray:
  awsemf:

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [awsxray]
    metrics:
      receivers: [otlp]
      exporters: [awsemf]
  telemetry:
    metrics:
      address: localhost:8888
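
For reference, a more explicit awsemf configuration is sketched below. This is not taken from the issue itself: the region, namespace, and log group/stream names are placeholders chosen for illustration, and the keys shown are standard awsemf exporter options, so adjust them to your environment:

exporters:
  awsemf:
    region: "us-east-1"                    # placeholder; use the Lambda's region
    namespace: "MyApplication"             # placeholder CloudWatch metric namespace
    log_group_name: "/metrics/my-service"  # placeholder log group the exporter should create/use
    log_stream_name: "otel-stream"         # placeholder log stream
    resource_to_telemetry_conversion:
      enabled: true                        # convert resource attributes into metric labels
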
stevemao commented Feb 3, 2024

To prove that it is using the exporter, I see these logs in my Lambda:

{
    "level": "info",
    "ts": 1706941601.1228578,
    "logger": "NewCollector",
    "msg": "Using config URI from environment",
    "uri": "/var/task/collector.yaml"
}
{
    "level": "warn",
    "ts": 1706941601.145583,
    "caller": "[email protected]/emf_exporter.go:74",
    "msg": "the default value for DimensionRollupOption will be changing to NoDimensionRollupin a future release. See https://github.com/open-telemetry/opentelemetry-collector-contrib/issues/23997 for moreinformation",
    "kind": "exporter",
    "data_type": "metrics",
    "name": "awsemf"
}
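
The DimensionRollupOption warning above only announces a future change of the default value and does not appear to be an error by itself. To pin the behavior explicitly, a minimal sketch of the relevant setting, using the value the warning names, would be:

exporters:
  awsemf:
    dimension_rollup_option: "NoDimensionRollup"  # pin the rollup behavior so the default-change warning no longer applies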


github-actions bot commented May 5, 2024

This issue is stale because it has been open 90 days with no activity. If you want to keep this issue open, please just leave a comment below and auto-close will be canceled

github-actions bot added the stale label May 5, 2024

stevemao commented May 5, 2024

Not stale

github-actions bot removed the stale label May 12, 2024
Tameem-623 commented

Hi @stevemao, were you able to find a solution to your issue? I am facing the same issue: I get the same logs, but no metrics are generated. I have also tested my code locally, and it works perfectly.

To prove that it is using the exporter, I see these logs in my Lambda:

{
    "level": "info",
    "ts": 1706941601.1228578,
    "logger": "NewCollector",
    "msg": "Using config URI from environment",
    "uri": "/var/task/collector.yaml"
}
{
    "level": "warn",
    "ts": 1706941601.145583,
    "caller": "[email protected]/emf_exporter.go:74",
    "msg": "the default value for DimensionRollupOption will be changing to NoDimensionRollupin a future release. See https://github.com/open-telemetry/opentelemetry-collector-contrib/issues/23997 for moreinformation",
    "kind": "exporter",
    "data_type": "metrics",
    "name": "awsemf"
}

github-actions bot commented

This issue is stale because it has been open 90 days with no activity. If you want to keep this issue open, please just leave a comment below and auto-close will be canceled

github-actions bot added the stale label Sep 22, 2024
stevemao commented

@Tameem-623 No, I haven't...

github-actions bot removed the stale label Sep 29, 2024
codemedian commented

There seems to be a fix that supposedly addresses this in the plain OTel variant; see this PR, which is included in the latest release, version 0.11.0.

From what I can tell, this change hasn't yet made it into the aws-otel-lambda version, but I could be wrong. Are there any updates or workarounds for this?
