Unable to fetch prometheus c++ exporter stats on metricbeat #8291

Closed
krishan-agrawal-guavus opened this issue Sep 12, 2018 · 4 comments

Labels: bug, containers (Related to containers use case), Metricbeat (Metricbeat module), Team:Integrations (Label for the Integrations team)
@krishan-agrawal-guavus

We modified the Prometheus C++ exporter to export our application stats, and we planned to ship these stats through Metricbeat (and on to Elasticsearch) on a remote server.
My issue is that I am able to get Prometheus stats from my Kafka cluster using the JMX exporter by configuring the correct port details.
However, when using the C++ exporter I have not been able to get these stats into Metricbeat, even though I can see them when I query the configured port with a browser or curl.

Are there any prerequisites for exporting Prometheus stats with this C++ exporter, or specific conditions that must be met, so that Metricbeat can read and export these stats?

Sample exporter stats:
collector_stats{status="YYY",type="XXX"} 0.000000
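
For reference, a labeled gauge like the sample above would typically be registered roughly as follows, assuming the exporter is built on the prometheus-cpp library. This is only a minimal sketch; the metric, labels, and bind address are illustrative, not our actual application code.

// Minimal prometheus-cpp sketch (assumed library; values are illustrative).
#include <prometheus/exposer.h>
#include <prometheus/gauge.h>
#include <prometheus/registry.h>

#include <chrono>
#include <memory>
#include <thread>

int main() {
  using namespace prometheus;

  // Serve /metrics over HTTP; the bind address here is illustrative.
  Exposer exposer{"0.0.0.0:8090"};
  auto registry = std::make_shared<Registry>();

  // Register the gauge family once; each label combination becomes one series.
  auto& collector_stats = BuildGauge()
                              .Name("collector_stats")
                              .Help("metrics related to collector")
                              .Register(*registry);

  collector_stats.Add({{"status", "YYY"}, {"type", "XXX"}}).Set(0.0);

  // Expose everything in the registry to scrapers (curl, Metricbeat, ...).
  exposer.RegisterCollectable(registry);

  // Keep the process alive so the endpoint can be scraped.
  std::this_thread::sleep_for(std::chrono::hours(24));
  return 0;
}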

exekias (Contributor) commented Sep 12, 2018

Hi @krishan-agrawal-guavus, thank you for your feedback! We will need some more info in order to debug this issue. Could you share the following?

  • The Metricbeat settings you are using
  • The Metricbeat logs from when you run it
  • The full dump you get when manually fetching the metrics from the C++ exporter

Best regards

krishan-agrawal-guavus (Author) commented Sep 13, 2018

Metricbeat settings:

  1. Enabled the prometheus and system modules
  2. The contents of prometheus.yml:
- module: prometheus
  period: 10s
  hosts: ["localhost:8090"]
  metrics_path: /metrics
  namespace: example1

The Metricbeat version is 6.4.

The Metricbeat logs are attached: metricbeat.gz
[Screenshot attached: screen shot 2018-09-13 at 10 36 56 am]

Here is the dump of metrics from the C++ exporter:

# HELP exposer_bytes_transferred bytesTransferred to metrics services
# TYPE exposer_bytes_transferred counter
exposer_bytes_transferred 17701173.000000
# HELP exposer_total_scrapes Number of times metrics were scraped
# TYPE exposer_total_scrapes counter
exposer_total_scrapes 16166.000000
# HELP exposer_request_latencies Latencies of serving scrape requests, in microseconds
# TYPE exposer_request_latencies summary
exposer_request_latencies_count 16166
exposer_request_latencies_sum 10217747.000000
exposer_request_latencies_quantile{quantile="0.500000"} 531.000000
exposer_request_latencies_quantile{quantile="0.900000"} 582.000000
exposer_request_latencies_quantile{quantile="0.990000"} 631.000000
# HELP collector_stats metrics related to collector
# TYPE collector_stats gauge
collector_stats{status="process",type="EdrFlow"} 1166.000000
collector_stats{status="drop",type="EdrFlow"} 319.000000
# HELP collector_stats metrics related to collector
# TYPE collector_stats gauge
collector_stats{status="process",type="EdrHttp"} 12.000000
collector_stats{status="drop",type="EdrHttp"} 0.000000

exekias added the containers (Related to containers use case) label and removed the feedback needed label on Sep 13, 2018
exekias (Contributor) commented Sep 13, 2018

Thanks for the detailed report. From what I see, collector_stats is duplicated in the output (it appears under two # HELP/# TYPE blocks), and that is causing the client library to fail when parsing the metrics with this error:
text format parsing error in line 19: second HELP line for metric name "collector_stats"

This error is not currently reported by Metricbeat, but I think we should change that.
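
For illustration, and assuming the exporter uses the prometheus-cpp library: duplicate # HELP/# TYPE blocks like the ones in the dump above typically appear when the same metric name is registered as two separate families (for example, once per EDR type) instead of one family with several label combinations. A hedged sketch of the single-family pattern, not the reporter's actual exporter code:

// Illustrative prometheus-cpp sketch (assumed library and setup, not the
// reporter's actual exporter code). Registering "collector_stats" as two
// separate families, one per EDR type, is one way to end up with two HELP
// blocks; registering a single family and adding one child per label
// combination keeps the output under a single HELP/TYPE block.
#include <prometheus/gauge.h>
#include <prometheus/registry.h>

void register_collector_stats(prometheus::Registry& registry) {
  using prometheus::BuildGauge;

  // Single family, registered once.
  auto& collector_stats = BuildGauge()
                              .Name("collector_stats")
                              .Help("metrics related to collector")
                              .Register(registry);

  // One child per label combination; all of them share the family's
  // HELP/TYPE block in the /metrics output.
  collector_stats.Add({{"status", "process"}, {"type", "EdrFlow"}}).Set(1166);
  collector_stats.Add({{"status", "drop"}, {"type", "EdrFlow"}}).Set(319);
  collector_stats.Add({{"status", "process"}, {"type", "EdrHttp"}}).Set(12);
  collector_stats.Add({{"status", "drop"}, {"type", "EdrHttp"}}).Set(0);
}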

ruflin added the Team:Integrations (Label for the Integrations team) label on Nov 21, 2018
exekias (Contributor) commented Apr 3, 2019

This module was refactored in #9948. I think the problem should be gone now; please reopen if you can still reproduce it with 7.0 (once it is released).

exekias closed this as completed on Apr 3, 2019