
Flaky Test - tailsamplingprocessor/TestConcurrentTraceArrival #10205

Closed
djaglowski opened this issue May 23, 2022 · 18 comments
Labels
bug, flaky test, good first issue, help wanted, never stale, priority:p3, processor/tailsampling

Comments

@djaglowski (Member)

race: limit on 8128 simultaneously alive goroutines is exceeded, dying
FAIL	github.com/open-telemetry/opentelemetry-collector-contrib/processor/tailsamplingprocessor	3.415s

Observed here: https://github.com/open-telemetry/opentelemetry-collector-contrib/runs/6555994409?check_suite_focus=true
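
For context, this failure comes from the Go race detector rather than from a test assertion: under `-race`, the runtime aborts the whole test binary once more than 8128 goroutines are alive at the same time, so a test that spawns one goroutine per generated trace with no upper bound can die this way on a slow or heavily loaded runner. The sketch below is not the actual `TestConcurrentTraceArrival` code; it only illustrates, with a hypothetical `consumeTrace` helper, how a buffered-channel semaphore keeps the number of live goroutines bounded while still exercising concurrent arrival:

```go
package main

import (
	"fmt"
	"sync"
)

// consumeTrace is a hypothetical stand-in for the per-trace work done by the
// real test, which feeds traces into the tail sampling processor instead.
func consumeTrace(id int) {
	_ = fmt.Sprintf("trace-%d", id)
}

func main() {
	const traces = 10000

	// Buffered channel used as a semaphore: at most 128 goroutines are
	// alive at any moment, well under the race detector's 8128 limit.
	sem := make(chan struct{}, 128)
	var wg sync.WaitGroup

	for i := 0; i < traces; i++ {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot before spawning the goroutine
		go func(id int) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot when done
			consumeTrace(id)
		}(i)
	}
	wg.Wait()
}
```

Bounding the in-flight goroutines (or using a small worker pool) keeps the concurrency pressure the test is after while staying under the detector's hard limit.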

djaglowski added the bug and flaky test labels on May 23, 2022

jpkrohling self-assigned this on Jul 5, 2022
@dgoscn (Contributor) commented on Sep 21, 2022

Hi @jpkrohling, is this issue still open?

@jpkrohling (Member)

I believe it is! Do you want to take this one?

@dgoscn (Contributor) commented on Sep 26, 2022

Yes, you can assign it to me, @jpkrohling.

djaglowski assigned dgoscn and unassigned jpkrohling on Sep 27, 2022
@djaglowski (Member, Author)

@dgoscn, assigned to you

evan-bradley added the processor/tailsampling (Tail sampling processor) label on Sep 30, 2022
@github-actions (bot)

Pinging code owners: @jpkrohling. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions (bot)

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot added the Stale label on Nov 30, 2022
jpkrohling added the never stale label and removed the Stale label on Nov 30, 2022
@jpkrohling (Member)

@dgoscn, are you still available to work on this one?

@dgoscn (Contributor) commented on Dec 1, 2022

@jpkrohling, I will have to leave this for someone else. If no one else is assigned, I would like to come back to working on this as soon as possible.

@dgoscn (Contributor) commented on Jan 4, 2023

Hi @djaglowski, how are you? Happy new year. Do you know if this scenario is still happening? The logs for the events have expired.
Thanks

@bryan-aguilar (Contributor)

@dgoscn, I was looking through the processor and wondering whether you have been able to replicate this flaky test locally. I ran the tests for a while on my machine (an Intel MBP) and did not see any failures.
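
For anyone trying to reproduce this locally, a standard option (not taken from this thread, just the usual Go tooling; the `-count` value here is arbitrary) is to stress only this test under the race detector:

```sh
# Run only TestConcurrentTraceArrival repeatedly with the race detector enabled.
go test -race -run TestConcurrentTraceArrival -count=200 ./processor/tailsamplingprocessor/
```

If the test still over-spawns goroutines, a failing run should show the same "limit on 8128 simultaneously alive goroutines is exceeded" abort seen in CI.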

@dgoscn (Contributor) commented on Jan 5, 2023

Hey @bryan-aguilar, thanks for the message; this sounds really good. I haven't run the tests again yet, but I certainly will. For now, if you are not hitting the error, I believe we can move forward based on some advice from @jpkrohling.
Thanks

@ZenoCC-Peng (Contributor) commented on Jun 29, 2023

Hello, I am currently reviewing this test.

After running the GitHub Action 20 times on my personal branch, there were no test failures. Here is the file: Tailsamplingprocessor-FlakyTest-#10205.xlsx
I suspect that someone may have already modified the code.

If everything looks fine, I will proceed with a pull request.

@atoulme (Contributor) commented on Jun 29, 2023

Please go ahead, thanks!

jpkrohling pushed a commit that referenced this issue Jul 10, 2023
**Description:** Ran the GitHub Action 20 times on my personal branch; there were no test failures. See the file:

[Tailsamplingprocessor-FlakyTest-.10205.xlsx](https://github.com/open-telemetry/opentelemetry-collector-contrib/files/11911095/Tailsamplingprocessor-FlakyTest-.10205.xlsx)

**Link to tracking Issue:** [#10205](#10205)

**Testing:** tailsamplingprocessor/processor_test.go

---------

Co-authored-by: Alex Boten <[email protected]>
Co-authored-by: zeno-splunk <[email protected]>
@jpkrohling (Member)

Thanks!
