Refactoring of Scrape process, fixing multiple module issues #1111
Conversation
We have added the error handling and tests that were missing. Could you please review it?
Interesting, I'm currently on vacation but will try and review this soon.
I've been considering implementing a full walk cache in the exporter, to help with things like "Cache data for 60s, but cache
If you have improvements to gosnmp, I would happily review those as well, as I am a maintainer there. I would not want to implement that inside the exporter when gosnmp is a well-supported and widely used community implementation.
We are assuming a temporary cache for de-duplication, as previously pointed out.
Sorry, I didn't express the intention I wanted to convey clearly: we considered making the return value of the SNMPScraper interface independent of gosnmp, but we kept gosnmp as the return type because the current approach is better in terms of error handling.
Looks reasonable, I think the refactoring makes sense.
Thanks for the review. I fixed it.
}
	return
}
level.Debug(g.logger).Log("msg", "Get of OIDs completed", "oids", oids, "duration_seconds", time.Since(st))
With the new scraper, the module logging context is lost. I guess if we're heading towards de-duplication and caching this won't matter. It's just a change I noticed in some local testing.
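If the module label ever needs to be reattached, one possible approach (an assumption on my part, not something this PR does) is to wrap the scraper's logger with go-kit's log.With when the scraper is created; baseLogger, moduleName, and target here are placeholders:

```go
// Hypothetical sketch: attach the module name to the scraper's logger so every
// log line carries it, assuming go-kit/log is used as in the snippet above.
logger := log.With(baseLogger, "module", moduleName, "target", target)

// The existing debug line would then include the module context automatically.
level.Debug(logger).Log("msg", "Get of OIDs completed", "oids", oids, "duration_seconds", time.Since(st))
```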
Thanks for confirming that. As you say, we will leave it at that.
CC @bastischubert You may be interested in this.
@servak I wonder if it would be useful to add an …
I think it's a good idea.
* [CHANGE] Improve generator parse error handling #1167
* [ENHANCEMENT] generator: Add generator HELP override #1106
* [ENHANCEMENT] Refactoring of Scrape process, fixing multiple module issues #1111
* [ENHANCEMENT] Skip using an interactive terminal in "make docker-generate". #1113
* [ENHANCEMENT] Add SNMPInflight metric #1119
* [FEATURE] Support for passing username, password & priv_password as env vars #1074
* [FEATURE] Add GoSNMP logger #1157
* [FEATURE] Add a "snmp_context" parameter to the URL #1163
* [BUGFIX] generator: curl failed #1094
* [BUGFIX] Fix SNMPv3 password configuration #1122
* [BUGFIX] generator: Update generator User-Agent #1133
* [BUGFIX] generator: fix mibs directory specification for parse_errors command #1135
* [BUGFIX] generator: remove extra character from dell iDrac-SMIv1 MIB #1141
* [BUGFIX] Fix do not expand envvars for empty config fields #1148

snmp.yml changes:

* Updated Cisco MIBs #1180
* Updated Cyberpower MIBs #1124
* Updated servertech_sentry3 #1090
* Added support for Dell iDrac #1125

Signed-off-by: Sebastian Schubert <[email protected]>
Since we enabled specifying multiple modules, we have observed an increase in DNS requests.
This appears to stem from gosnmp performing name resolution once per module.
We have revised the implementation around SNMP scraping.
By having each worker hold its own gosnmp instance, we can resolve this issue, as sketched below.
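As a rough illustration of the direction (scrapeModule and the connection settings are placeholders, not code from this PR), a worker could hold a single gosnmp client per target and reuse it for every module, so the hostname is resolved only once:

```go
// Sketch only: one gosnmp client per worker/target, reused across modules.
func scrapeAllModules(target string, modules []string) error {
	client := &gosnmp.GoSNMP{
		Target:    target, // name resolution happens once, when Connect() dials
		Port:      161,
		Community: "public",
		Version:   gosnmp.Version2c,
		Timeout:   5 * time.Second,
		Retries:   3,
	}
	if err := client.Connect(); err != nil {
		return err
	}
	defer client.Conn.Close()

	for _, module := range modules {
		// Every module reuses the same connection, avoiding per-module DNS lookups.
		if err := scrapeModule(client, module); err != nil {
			return err
		}
	}
	return nil
}
```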
Additionally, we improved maintainability by holding gosnmp behind an interface instead of using it directly in ScrapeTarget. We hope this change will let us implement a scraper that avoids retrieving the same OID multiple times by caching OIDs once they are fetched.
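To make the caching idea concrete, here is a minimal sketch of what a scraper interface over gosnmp and a de-duplicating wrapper could look like; the method set and names are assumptions for illustration, not the interface actually added in this PR:

```go
// snmpScraper abstracts the subset of gosnmp used by ScrapeTarget (assumed method set).
type snmpScraper interface {
	Get(oids []string) (*gosnmp.SnmpPacket, error)
	WalkAll(baseOID string) ([]gosnmp.SnmpPDU, error)
}

// cachingScraper wraps another scraper and remembers walked subtrees, so the same
// OID subtree is fetched only once per scrape even if several modules need it.
type cachingScraper struct {
	inner snmpScraper
	walks map[string][]gosnmp.SnmpPDU
}

func (c *cachingScraper) WalkAll(baseOID string) ([]gosnmp.SnmpPDU, error) {
	if pdus, ok := c.walks[baseOID]; ok {
		return pdus, nil // already walked during this scrape
	}
	pdus, err := c.inner.WalkAll(baseOID)
	if err != nil {
		return nil, err
	}
	c.walks[baseOID] = pdus
	return pdus, nil
}

func (c *cachingScraper) Get(oids []string) (*gosnmp.SnmpPacket, error) {
	return c.inner.Get(oids) // only walks are cached in this sketch
}
```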
Initially, we considered eliminating the dependency on gosnmp entirely, but that would have broadened the scope of the change too much, so we decided against it.
Error handling for connection failures is not implemented yet and this is still a work in progress, but I would appreciate a review, including feedback on the overall direction.
Test environment
Test command log