Flaky test: (missing spans) IntegrationTests.GraphQL.GraphQLTests.SubmitsTraces #1235

Closed
pjanotti opened this issue Sep 15, 2022 · 6 comments
Comments

@pjanotti
Contributor

(This is not the same as #424)

Three recent failures of this test show only 5 spans matching the test expectations, even though the mock Zipkin collector actually received the expected 11 spans. Looking at the test code, it seems that the "validation" spans are not being accepted: the 5 spans the test did accept match the expected number of "execution" spans.
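For illustration, here is a minimal, self-contained sketch of the kind of name filtering that would produce this symptom, assuming the test keeps only spans whose names it recognizes; the span names, the filter, and the `MockSpan` type below are hypothetical, not the repository's actual test code:

```csharp
// Hypothetical sketch (not the real test harness): if the name filter only
// recognizes the "execution" spans, the test counts 5 spans even though the
// mock collector received all 11.
using System;
using System.Linq;

record MockSpan(string Name);

class SpanFilterSketch
{
    static void Main()
    {
        // Assume the mock Zipkin collector handed back 5 execution + 6 validation spans.
        var received = Enumerable.Repeat(new MockSpan("graphql.execute"), 5)   // span names assumed
            .Concat(Enumerable.Repeat(new MockSpan("graphql.validate"), 6))
            .ToList();

        // A filter that only accepts execution spans reproduces the 5-vs-11 mismatch.
        var acceptedNames = new[] { "graphql.execute" };
        var accepted = received.Where(s => acceptedNames.Contains(s.Name)).ToList();

        Console.WriteLine($"received={received.Count}, accepted={accepted.Count}"); // received=11, accepted=5
    }
}
```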

@pjanotti
Contributor Author

pjanotti commented Sep 15, 2022

BTW, no local repro so far for me (325 runs).

@pjanotti
Contributor Author

No repro in 20 runs of the test via verify-test on my fork of the repo. It seems the failure is related to running near other tests; that said, there was a small change in my private CI run that could perhaps affect the outcome: pjanotti@a752ce3

@RassK
Contributor

RassK commented Sep 19, 2022

Note: it's interesting that the Zipkin collector reports receiving many spans, but the test doesn't see any 🤔
https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/actions/runs/3080870008/jobs/4978741438

@pjanotti
Contributor Author

@RassK another one: the assert reports many spans received by Zipkin, but not all 11 ... https://github.com/open-telemetry/opentelemetry-dotnet-instrumentation/actions/runs/3087043165/jobs/4991999686#step:4:3378

Let's prioritize the issue to dump the spans received by Zipkin; meanwhile, you can enable the console exporter output for an initial attempt. Note that running the single failing test multiple times hasn't been a good way to repro these issues (at least with the PrometheusExporter test). It seems there is some side effect/leak from previous tests that affects the tests coming after them.
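For reference, a minimal sketch of enabling the console exporter alongside Zipkin with the plain OpenTelemetry .NET SDK, so every exported span is also printed to stdout. This is only an illustration; the integration tests configure the auto-instrumentation differently (via environment variables), and the service name, ActivitySource name, and endpoint below are placeholders:

```csharp
// Minimal sketch (not the integration-test harness): plain OpenTelemetry .NET
// SDK setup that exports to both the console and Zipkin, so exported spans can
// be inspected in the test output. Requires the OpenTelemetry,
// OpenTelemetry.Exporter.Console, and OpenTelemetry.Exporter.Zipkin packages.
using System;
using OpenTelemetry;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .SetResourceBuilder(ResourceBuilder.CreateDefault().AddService("graphql-test-app"))  // placeholder service name
    .AddSource("Example.ActivitySource")                                                 // placeholder ActivitySource name
    .AddConsoleExporter()                                                                // print every exported span to stdout
    .AddZipkinExporter(o => o.Endpoint = new Uri("http://localhost:9411/api/v2/spans"))  // default Zipkin endpoint
    .Build();
```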

@pjanotti
Contributor Author

pjanotti commented Oct 4, 2022

Checked the latest CI failures and this one was not present.
