
Enable metric integration test for req-blocking #2445

Merged

Conversation

@cijothomas (Member) commented Dec 18, 2024:

The reqwest-blocking client should work for metrics, so it is now enabled.
Also changed the verbosity of internal logs to avoid an overwhelming amount of debug output; we still get debug-level logs from OpenTelemetry.
Also enabled internal-logs from the OTLP exporter itself.

@cijothomas cijothomas requested a review from a team as a code owner December 18, 2024 03:10
@@ -293,7 +293,7 @@ mod tests {
     // Set up the exporter
     let exporter = create_exporter();
     let reader = PeriodicReader::builder(exporter)
-        .with_interval(Duration::from_millis(100))
+        .with_interval(Duration::from_secs(30))
@cijothomas (Member Author):

Putting a large interval here to make sure the test is actually testing shutdown-triggered exporting, not timer-triggered exporting.
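A minimal, synchronous sketch of that pattern (not code from this PR), reusing the test's create_exporter() helper and the opentelemetry_sdk types shown in the diff; the interval outlives the test, so any exported data must come from shutdown():

use std::time::Duration;
use opentelemetry_sdk::metrics::{PeriodicReader, SdkMeterProvider};

#[test]
fn shutdown_triggers_export() {
    // Interval far longer than the test runs, so the periodic timer never fires;
    // any exported data must come from the shutdown-triggered flush.
    let exporter = create_exporter(); // existing helper in this test module
    let reader = PeriodicReader::builder(exporter)
        .with_interval(Duration::from_secs(30))
        .build();
    let provider = SdkMeterProvider::builder().with_reader(reader).build();

    // ... record metrics via provider.meter(...) ...

    // shutdown() performs a final collect-and-export regardless of the interval.
    provider.shutdown().expect("shutdown should flush remaining metrics");
}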

Contributor:

Good idea

@@ -30,7 +30,7 @@ async fn init_metrics() -> SdkMeterProvider {
     let exporter = create_exporter();

     let reader = PeriodicReader::builder(exporter)
-        .with_interval(Duration::from_millis(100))
+        .with_interval(Duration::from_millis(500))
@cijothomas (Member Author):

The sleep is around 5 seconds, so a 500 ms interval is fine, and this reduces the amount of internal logs.
Also, I am wondering whether we should rely more on force_flush/shutdown for integration tests, to avoid this sleep requirement. We should still keep one set of tests that covers export triggered by the interval.
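A rough sketch of that force_flush-based approach (hypothetical helper, not from this PR), assuming the existing fetch_latest_metrics_for_scope test utility and an anyhow-style Result:

use opentelemetry_sdk::metrics::SdkMeterProvider;
use serde_json::Value;

// Hypothetical helper: flush on demand instead of sleeping until the
// periodic interval fires, then read back what the collector wrote.
fn flush_and_fetch(provider: &SdkMeterProvider, scope: &str) -> anyhow::Result<Value> {
    provider.force_flush()?; // exports any pending metrics immediately
    fetch_latest_metrics_for_scope(scope) // existing test utility
}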

Contributor:

I think we'll still need a bit of a buffer because of the flushing on the otlp-collector side? What I think would be nicest is to set up a simple, short exponential backoff mechanism in here - the function knows what it is looking for in the particular test, so it's well positioned to decide to wait a bit and look again:

pub fn fetch_latest_metrics_for_scope(scope_name: &str) -> Result<Value> {

so that we can use tighter timing in the best case (see the sketch after this comment).

I've also added this rotation thing for logs, which I believe will decrease buffering on the collector side - doesn't help us with the other signals, though:

file/logs:
  path: /testresults/logs.json
  rotation:
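A sketch of the backoff idea described above (hypothetical wrapper, not from this PR), built around the existing fetch_latest_metrics_for_scope helper; the caller passes a predicate that decides whether the expected data has arrived:

use std::{thread, time::Duration};
use anyhow::Result;
use serde_json::Value;

// Hypothetical polling wrapper: retry with a short exponential backoff so the
// test finishes quickly when the collector has already flushed, but still
// tolerates collector-side buffering in the worst case.
fn fetch_with_backoff<F>(scope_name: &str, is_ready: F) -> Result<Value>
where
    F: Fn(&Value) -> bool,
{
    let mut delay = Duration::from_millis(100);
    for _ in 0..6 {
        if let Ok(metrics) = fetch_latest_metrics_for_scope(scope_name) {
            if is_ready(&metrics) {
                return Ok(metrics); // fast path: data already there
            }
        }
        thread::sleep(delay);
        delay *= 2; // 100 ms, 200 ms, 400 ms, doubling each round
    }
    // Final attempt; surfaces whatever error or partial data remains.
    fetch_latest_metrics_for_scope(scope_name)
}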

codecov bot commented Dec 18, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 76.8%. Comparing base (9011f63) to head (942a647).
Report is 1 commit behind head on main.

Additional details and impacted files
@@          Coverage Diff          @@
##            main   #2445   +/-   ##
=====================================
  Coverage   76.8%   76.8%           
=====================================
  Files        122     122           
  Lines      21823   21823           
=====================================
  Hits       16772   16772           
  Misses      5051    5051           


@cijothomas added the "integration tests" label Dec 18, 2024
@@ -146,7 +146,7 @@ async fn setup_metrics_test() -> Result<()> {
     println!("Running setup before any tests...");
     *done = true; // Mark setup as done

-    // Initialise the metrics subsystem
+    // Initialize the metrics subsystem
Contributor:

US-English spelling strikes again 🤣

@scottgerring (Contributor) left a comment:

Nice - lgtm!

@cijothomas cijothomas merged commit fbcba3b into open-telemetry:main Dec 18, 2024
23 checks passed
@cijothomas cijothomas deleted the cijothomas/integration-metric-test1 branch December 18, 2024 14:28
Labels
integration tests (Run integration tests)
3 participants