Sentry: mapping OTLP service.name to DSNs for multiple Sentry Projects #3869

Closed
sodabrew opened this issue Jun 23, 2021 · 10 comments

@sodabrew
Contributor

Is your feature request related to a problem? Please describe.

The Sentry exporter takes a DSN argument which maps to a single Project, but I wouldn't want to run a separate OpenTelemetry Collector process for each of my projects.
https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/exporter/sentryexporter

Describe the solution you'd like
Ideally, Sentry itself could accept traces for multiple projects through a single API key and use the service name to put the traces into the right one.

Alternatively, have a mapping of service.name to dsn values in the OpenTelemetry Collector config file. This wouldn't require changes to the Sentry hosted service and wouldn't be an unreasonable burden on the collector config (at least as a starting point for next steps); a rough sketch of what such a mapping could look like follows below.

https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/resource/semantic_conventions/README.md#service
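To make the request concrete, here is a purely hypothetical sketch of what such a mapping could look like in the exporter config. The dsn_by_service option and the DSN values are invented for illustration; no option like this exists in the sentry exporter today:

  exporters:
    sentry:
      # Hypothetical option (not implemented): pick a per-project DSN
      # based on the service.name resource attribute of incoming traces.
      dsn_by_service:
        checkout: "https://[email protected]/1"
        payments: "https://[email protected]/2"
      # Fallback DSN for services without an explicit mapping.
      dsn: "https://[email protected]/3"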

cc @AbhiPrasad

@sodabrew changed the title from "Sentry: mapping OTLP application names to DSNs" to "Sentry: mapping OTLP service.name to DSNs for multiple Sentry Projects" on Jun 23, 2021
@AbhiPrasad
Member

AbhiPrasad commented Jun 23, 2021

I think this is a very reasonable request - in fact we’ve tracked a similar issue in our fork of the collector. getsentry#10

I’ll assign myself there and see if I can push out a patch. Thanks for reporting!

@sodabrew
Contributor Author

Thank you!

@RyanSiu1995

Upvote for this.
Instead of taking a single DSN from the config, the exporter could also respect a custom field inside the trace data that contains the DSN.
Otherwise, we need to run a ton of Sentry exporters inside the cluster.
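As a purely illustrative sketch of that idea (the dsn_from_attribute option and the sentry.dsn attribute name are made up here; the exporter does not support this today), the exporter could read the DSN from a resource attribute that each service sets on its own telemetry:

  exporters:
    sentry:
      # Hypothetical option (not implemented): take the DSN from a
      # resource attribute on the incoming telemetry.
      dsn_from_attribute: sentry.dsn
      # Static fallback for telemetry without that attribute.
      dsn: "https://[email protected]/n"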

@github-actions github-actions bot added the Stale label Aug 1, 2021
@bogdandrutu bogdandrutu removed the Stale label Aug 9, 2021
alexperez52 referenced this issue in open-o11y/opentelemetry-collector-contrib Aug 20, 2021
@bloodsper

Was this ever finished somewhere? If not, have people been setting up an insane number of Sentry exporters?

@Dalkenn

Dalkenn commented May 12, 2022

I am also facing the same issue; was anybody able to do something about it?

@ananthakumaran

This issue can be worked around using the routing processor. Sample below:

  processors:
    batch: {}
    memory_limiter: {}
    routing:
      attribute_source: resource
      from_attribute: service.name
      default_exporters:
      - logging
      table:
      - value: foo
        exporters: [sentry/foo]
      - value: bar
        exporters: [sentry/bar]
  exporters:
    logging: {}
    sentry/foo:
      dsn: "https://[email protected]/n"
    sentry/bar:
      dsn: "https://[email protected]/n"
  service:
    pipelines:
      traces:
        receivers:
          - otlp
        processors:
          - memory_limiter
          - batch
          - routing
        exporters:
          - logging
          - sentry/foo
          - sentry/bar
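Two notes on this workaround (not part of the original comment): the routing processor keys on the service.name resource attribute, which each instrumented service normally sets through its SDK (for example via the standard OTEL_SERVICE_NAME environment variable), and every exporter referenced in the routing table must also be listed under the pipeline's exporters, as shown above. Telemetry whose service.name matches no table entry falls through to default_exporters (here, logging). If you would rather have such data carry an explicit label, the resource processor can insert a fallback name; a minimal sketch, with unknown-service as an assumed placeholder value:

  processors:
    resource:
      attributes:
        - key: service.name
          value: unknown-service
          action: insert

If used, resource would also need to be added to the pipeline's processors list.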

@github-actions
Contributor

github-actions bot commented Nov 8, 2022

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

@github-actions
Contributor

Pinging code owners for exporter/sentry: @AbhiPrasad. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot removed the Stale label Mar 22, 2023
@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label May 22, 2023
@github-actions
Contributor

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned on Jul 21, 2023