add alert ThanosQueryOverload to mixin
Signed-off-by: Haoyu Sun <[email protected]>
raptorsun committed Jun 23, 2022
1 parent a0f4181 commit c2a3171
Showing 4 changed files with 34 additions and 0 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -34,6 +34,7 @@ We use *breaking :warning:* to mark changes that are not backward compatible (re
- [#5408](https://github.com/thanos-io/thanos/pull/5391) Receive: Add support for consistent hashrings.
- [#5391](https://github.com/thanos-io/thanos/pull/5391) Receive: Implement api/v1/status/tsdb.
- [#5424](https://github.com/thanos-io/thanos/pull/5424) Receive: export metrics regarding size of remote write requests
- [#5439](https://github.com/thanos-io/thanos/pull/5439) Mixin: Add alert ThanosQueryOverload.

### Changed

16 changes: 16 additions & 0 deletions examples/alerts/alerts.yaml
@@ -170,6 +170,22 @@ groups:
    for: 10m
    labels:
      severity: critical
  - alert: ThanosQueryOverload
    annotations:
      description: Thanos Query {{$labels.job}} has been overloaded for more than
        15 minutes. This may be a symptom of excessive simultaneous complex requests,
        low performance of the Prometheus API, or failures within these components.
        Assess the health of the Thanos Query instances and the connected Prometheus
        instances, look for potential senders of these requests, and then contact support.
      runbook_url: https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryoverload
      summary: Thanos Query reaches its maximum capacity serving concurrent requests.
    expr: |
      (
        max_over_time(thanos_query_concurrent_gate_queries_max[5m]) - avg_over_time(thanos_query_concurrent_gate_queries_in_flight[5m]) < 1
      )
    for: 15m
    labels:
      severity: warning
- name: thanos-receive
  rules:
  - alert: ThanosReceiveHttpRequestErrorRateHigh
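For context on the new expression: `thanos_query_concurrent_gate_queries_max` is the concurrency gate's configured limit and `thanos_query_concurrent_gate_queries_in_flight` is the number of queries currently occupying it, so the alert fires once the average in-flight count stays within one query of the limit for 15 minutes. As an illustrative sketch (not part of this commit), the same saturation can be expressed as a utilization ratio, which may be handier on a dashboard:

    # Concurrency-gate utilization per Querier over the last 5 minutes;
    # values approaching 1 mean ThanosQueryOverload is about to fire.
    avg_over_time(thanos_query_concurrent_gate_queries_in_flight[5m])
      / max_over_time(thanos_query_concurrent_gate_queries_max[5m])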
16 changes: 16 additions & 0 deletions mixin/alerts/query.libsonnet
@@ -142,6 +142,22 @@
              severity: 'critical',
            },
          },
          {
            alert: 'ThanosQueryOverload',
            annotations: {
              description: 'Thanos Query {{$labels.job}}%s has been overloaded for more than 15 minutes. This may be a symptom of excessive simultaneous complex requests, low performance of the Prometheus API, or failures within these components. Assess the health of the Thanos Query instances and the connected Prometheus instances, look for potential senders of these requests, and then contact support.' % location,
              summary: 'Thanos Query reaches its maximum capacity serving concurrent requests.',
            },
            expr: |||
              (
                max_over_time(thanos_query_concurrent_gate_queries_max[5m]) - avg_over_time(thanos_query_concurrent_gate_queries_in_flight[5m]) < 1
              )
            ||| % thanos.query,
            'for': '15m',
            labels: {
              severity: 'warning',
            },
          },
        ],
      },
    ],
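Note that, unlike most sibling rules in this file, the `|||` expression contains no `%(...)s` placeholders, so applying `% thanos.query` leaves the string unchanged; it is kept for consistency. When triaging a firing alert, a quick headroom check along these lines may help (an illustrative PromQL sketch, assuming the gate size is driven by Thanos Query's `--query.max-concurrent` flag; not part of this commit):

    # Remaining concurrency slots per Querier; values at or near zero mean
    # incoming queries are queuing at the gate.
    thanos_query_concurrent_gate_queries_max - thanos_query_concurrent_gate_queries_in_flight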
1 change: 1 addition & 0 deletions mixin/runbook.md
@@ -50,6 +50,7 @@
|ThanosQueryHighDNSFailures|Thanos Query is having a high number of DNS failures.|Thanos Query {{$labels.job}} has {{$value humanize}}% of failing DNS queries for store endpoints.|warning|[https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryhighdnsfailures](https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryhighdnsfailures)|
|ThanosQueryInstantLatencyHigh|Thanos Query has high latency for queries.|Thanos Query {{$labels.job}} has a 99th percentile latency of {{$value}} seconds for instant queries.|critical|[https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryinstantlatencyhigh](https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryinstantlatencyhigh)|
|ThanosQueryRangeLatencyHigh|Thanos Query has high latency for queries.|Thanos Query {{$labels.job}} has a 99th percentile latency of {{$value}} seconds for range queries.|critical|[https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryrangelatencyhigh](https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryrangelatencyhigh)|
|ThanosQueryOverload|Thanos Query reaches its maximum capacity serving concurrent requests.|Thanos Query {{$labels.job}} has been overloaded for more than 15 minutes. This may be a symptom of excessive simultaneous complex requests, low performance of the Prometheus API, or failures within these components. Assess the health of the Thanos Query instances and the connected Prometheus instances, look for potential senders of these requests, and then contact support.|warning|[https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryoverload](https://github.com/thanos-io/thanos/tree/main/mixin/runbook.md#alert-name-thanosqueryoverload)|

## thanos-receive

