
[exporter/loki] HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs #36558

Closed
meSATYA opened this issue Nov 27, 2024 · 7 comments
Labels: bug (Something isn't working), exporter/loki (Loki Exporter)

Comments

@meSATYA

meSATYA commented Nov 27, 2024

Component(s)

exporter/loki

What happened?

Description

While exporting to loki-distributed using the Loki exporter in the OpenTelemetry Collector, it throws the error below.

2024-11-27T00:47:45.243Z info internal/retry_sender.go:126 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '116866' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "2.537297293s"}
2024-11-27T00:47:45.243Z info internal/retry_sender.go:126 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '130781' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "4.084086341s"}
2024-11-27T00:47:45.317Z info internal/retry_sender.go:126 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '104668' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "2.55432931s"}

Surprisingly, Loki doesn't log any error in the gateway, distributor, or ingester components.

Steps to Reproduce

Use the otel configuration below

Expected Result

The rate limit error shouldn't be thrown, or it should at least be clear which Loki configuration setting produces it.

Actual Result

It is not clear where the figure of 12582912 bytes/sec comes from, because no such limit is configured in Loki (12582912 bytes is exactly 12 MiB, i.e. 12 * 1024 * 1024). Even taking the limit as 12 MB, each rejected request is far smaller than that: the errors report batches of 60 lines (matching the batch processor's send_batch_max_size: 60) totalling only roughly 100-130 KB, yet they are still rejected.

Collector version

0.114.0

Environment information

Environment

OS: Ubuntu

OpenTelemetry Collector configuration

exporters:
  debug:
    verbosity: basic

  loki/default-logs:
    endpoint: http://loki-loki-distributed-gateway.logs:80/loki/api/v1/push
    headers:
      x-scope-orgid: default-logs
    tls:
      insecure: true

extensions:
  health_check:
    endpoint: ${env:MY_POD_IP}:13133
processors:
  batch: {}

  batch/default-logs:
    send_batch_max_size: 60
    send_batch_size: 50
    timeout: 10s

  memory_limiter:
    check_interval: 5s
    limit_percentage: 80
    spike_limit_percentage: 25
receivers:
  kafka/processor-logs:
    auth:
      sasl:
        mechanism: PLAIN
        password: ${EVENT_HUB_NAMESPACE_LISTEN_CONNECTION_STRING}
        username: $$ConnectionString
      tls:
        insecure: false
    brokers:
    - dev-event-hub-namespace.servicebus.windows.net:9093
    encoding: otlp_proto
    protocol_version: 3.7.0
    topic: dev-otlp-logs
service:
  extensions:
  - health_check
  pipelines:

    logs/default-logs:
      exporters:
      - loki/default-logs
      processors:
      - filter/default-logs
      - batch/default-logs
      receivers:
      - kafka/processor-logs

  telemetry:
    metrics:
      address: ${env:MY_POD_IP}:8888

Log output

2024-11-27T00:47:45.243Z	info	internal/retry_sender.go:126	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '116866' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "2.537297293s"}
2024-11-27T00:47:45.243Z	info	internal/retry_sender.go:126	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '130781' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "4.084086341s"}
2024-11-27T00:47:45.317Z	info	internal/retry_sender.go:126	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "data_type": "logs", "name": "loki/default-logs", "error": "HTTP 429 \"Too Many Requests\": Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '104668' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased", "interval": "2.55432931s"}

Additional context

Corresponding issue raised on Grafana Loki: grafana/loki#15140

@meSATYA meSATYA added bug Something isn't working needs triage New item requiring triage labels Nov 27, 2024
@github-actions github-actions bot added the exporter/loki Loki Exporter label Nov 27, 2024
github-actions bot commented Nov 27, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@mar4uk
Contributor

mar4uk commented Nov 27, 2024

I believe this error is related to the Loki configuration, maybe to the ingestion_rate_mb setting. Have you tried tweaking it?
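
For reference, a minimal sketch of the Loki limits_config block that this setting belongs to, using the values mentioned later in the thread (24 MB rate, 36 MB burst); field names should be verified against the Loki version in use:

limits_config:
  ingestion_rate_strategy: global  # with "global", the configured rate is shared across distributor replicas
  ingestion_rate_mb: 24            # per-tenant ingestion rate limit, in MB/sec
  ingestion_burst_size_mb: 36      # per-tenant burst size, in MB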

@meSATYA
Author

meSATYA commented Nov 27, 2024

Yes, it is currently set to 24 MB with a burst size of 36 MB.

The config is mentioned here: grafana/loki#15140
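
Since the exporter sends the tenant header x-scope-orgid: default-logs, the same limits can also be raised for just that tenant through Loki's runtime overrides file; a rough sketch (not the reporter's actual config, values taken from the numbers above):

overrides:
  default-logs:
    ingestion_rate_mb: 24
    ingestion_burst_size_mb: 36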

@saleelshetye84

saleelshetye84 commented Nov 27, 2024

The OpenTelemetry contrib code does not contain any error message saying Ingestion rate limit exceeded for user default-logs (limit: 12582912 bytes/sec) while attempting to ingest '60' lines totaling '104668' bytes, reduce log volume or contact your Loki administrator to see if the limit can be increased.

This error comes from the Grafana Loki code; the OpenTelemetry code is just printing out the response it receives from Loki.

https://github.com/grafana/loki/blob/4b5925a28e61f29a20aaabda3a159386a8ba7638/pkg/distributor/distributor.go#L586

I guess this should be an issue in the grafana/loki codebase?
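
As context for the retry log lines in the report: the collector is only backing off and retrying the 429 responses it gets from Loki. The standard exporter retry/queue settings can be tuned to ride out short rate-limit bursts; a minimal sketch (values are illustrative, not taken from this issue's config):

exporters:
  loki/default-logs:
    endpoint: http://loki-loki-distributed-gateway.logs:80/loki/api/v1/push
    retry_on_failure:
      enabled: true
      initial_interval: 5s    # wait before the first retry
      max_interval: 30s       # cap for the exponential backoff
      max_elapsed_time: 300s  # drop the batch after retrying for this long
    sending_queue:
      enabled: true
      queue_size: 1000        # buffer batches while Loki is rate limiting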

@meSATYA
Author

meSATYA commented Nov 27, 2024

It is true that the error comes from Loki, but to my surprise it is not logged by any of the Loki components such as the gateway, distributor, or ingester. I will follow up on the issue raised on the Loki side.

@sandeepsukhani
Contributor

The limit is enforced by the Loki distributor, and by default Loki does not print these rate limit errors in its logs to avoid being noisy. You can turn on a config option to enable printing of those errors: https://github.com/grafana/loki/blob/bd46e4c7b27798209894b5d515a42f06a25df02e/pkg/runtime/config.go#L25

I recommend opening an issue in the Loki repo and closing this issue since it is related to Loki.
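
A rough sketch of what enabling that logging might look like in Loki's runtime configuration file; both the per-tenant configs key and the limited_log_push_errors field name are assumptions read off the linked pkg/runtime/config.go, so verify them against the Loki version in use:

# Assumed layout of the file referenced by -runtime-config.file
configs:
  default-logs:
    limited_log_push_errors: true  # assumed flag: log push requests rejected by the rate limiter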

@atoulme atoulme removed the needs triage New item requiring triage label Dec 7, 2024
@meSATYA
Author

meSATYA commented Dec 9, 2024

An issue has already been opened on the Loki side: grafana/loki#15140

@meSATYA meSATYA closed this as completed Dec 9, 2024