# Skip the mass transit test to see if it solves flake issues #5861
**Execution-Time Benchmarks Report ⏱️**

Execution-time results for samples comparing the following branches/commits. Execution-time benchmarks measure the whole time it takes to execute a program and are intended to capture one-off costs. Cases where the execution-time results for the PR are worse than the latest master results are shown in red. The following thresholds were used for comparing the execution times:

Note that these results are based on a single point-in-time result for each branch. For full results, see the dashboard. Graphs show the p99 interval based on the mean and StdDev of the test run, as well as the mean value of the run (shown as a diamond below the graph).

```mermaid
gantt
title Execution time (ms) FakeDbCommand (.NET Framework 4.6.2)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (76ms) : 63, 90
. : milestone, 76,
master - mean (74ms) : 64, 84
. : milestone, 74,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (1,072ms) : 1046, 1099
. : milestone, 1072,
master - mean (1,063ms) : 1041, 1084
. : milestone, 1063,
```
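The "p99 interval based on the mean and StdDev" can be approximated under a normality assumption. The following is a hedged sketch of that computation, not the report's actual code; the z-value 2.576 (central 99% of a normal distribution) and the function name are assumptions.

```python
# Sketch: approximate a p99 interval from a run's mean and standard
# deviation, assuming normally distributed timings. This is a guess at
# how the report derives its intervals, not the report's actual code.

Z_99 = 2.576  # covers the central 99% of a normal distribution

def p99_interval(mean: float, std_dev: float) -> tuple[float, float]:
    """Return the (low, high) bounds of an approximate 99% interval."""
    return (mean - Z_99 * std_dev, mean + Z_99 * std_dev)

# Example: a run with mean 76 ms and StdDev 5 ms
low, high = p99_interval(76.0, 5.0)
print(f"{low:.2f} ms .. {high:.2f} ms")
```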
```mermaid
gantt
title Execution time (ms) FakeDbCommand (.NET Core 3.1)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (110ms) : 105, 115
. : milestone, 110,
master - mean (110ms) : 105, 115
. : milestone, 110,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (747ms) : 727, 768
. : milestone, 747,
master - mean (747ms) : 727, 767
. : milestone, 747,
```
```mermaid
gantt
title Execution time (ms) FakeDbCommand (.NET 6)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (92ms) : 89, 95
. : milestone, 92,
master - mean (92ms) : 88, 96
. : milestone, 92,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (705ms) : 681, 729
. : milestone, 705,
master - mean (703ms) : 684, 721
. : milestone, 703,
```
```mermaid
gantt
title Execution time (ms) HttpMessageHandler (.NET Framework 4.6.2)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (193ms) : 189, 197
. : milestone, 193,
master - mean (192ms) : 189, 196
. : milestone, 192,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (1,172ms) : 1141, 1203
. : milestone, 1172,
master - mean (1,169ms) : 1139, 1200
. : milestone, 1169,
```
```mermaid
gantt
title Execution time (ms) HttpMessageHandler (.NET Core 3.1)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (277ms) : 272, 281
. : milestone, 277,
master - mean (275ms) : 271, 280
. : milestone, 275,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (921ms) : 896, 947
. : milestone, 921,
master - mean (921ms) : 896, 947
. : milestone, 921,
```
```mermaid
gantt
title Execution time (ms) HttpMessageHandler (.NET 6)
dateFormat X
axisFormat %s
todayMarker off
section Baseline
This PR (5861) - mean (267ms) : 262, 271
. : milestone, 267,
master - mean (266ms) : 262, 270
. : milestone, 266,
section CallTarget+Inlining+NGEN
This PR (5861) - mean (908ms) : 880, 936
. : milestone, 908,
master - mean (905ms) : 883, 928
. : milestone, 905,
```
**Datadog Report**

Branch report: ✅ 0 Failed, 364361 Passed, 2418 Skipped, 79h 32m 12.48s Total Time
**Benchmarks Report for tracer 🐌**

Benchmarks for #5861 compared to master:

The following thresholds were used for comparing the benchmark speeds:

Allocation changes below 0.5% are ignored.

Benchmark details:

Benchmarks.Trace.ActivityBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.AgentWriterBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.AspNetCoreBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.CIVisibilityProtocolWriterBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.DbCommandBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.ElasticsearchBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.GraphQLBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.HttpClientBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.ILoggerBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.Log4netBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.NLogBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.RedisBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.SerilogBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.SpanBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
Benchmarks.Trace.TraceAnnotationsBenchmark - Same speed ✔️ Same allocations ✔️ Raw results
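The comparison logic behind the report above can be sketched roughly as follows. The 0.5% allocation threshold is stated in the report; the 10% speed threshold and all names here are hypothetical placeholders for illustration, not the report's actual code.

```python
# Sketch of the benchmark-comparison logic described above.
# The 0.5% allocation threshold is from the report's text; the speed
# threshold value here is a hypothetical placeholder.

ALLOCATION_THRESHOLD = 0.005  # changes below 0.5% are ignored
SPEED_THRESHOLD = 0.10        # hypothetical: flag >10% changes

def relative_change(pr_value: float, master_value: float) -> float:
    """Relative change of the PR result vs. master (positive = worse)."""
    return (pr_value - master_value) / master_value

def compare_benchmark(pr_ns: float, master_ns: float,
                      pr_allocated: float, master_allocated: float) -> dict:
    speed = relative_change(pr_ns, master_ns)
    alloc = relative_change(pr_allocated, master_allocated)
    return {
        "same_speed": abs(speed) <= SPEED_THRESHOLD,
        "same_allocations": abs(alloc) <= ALLOCATION_THRESHOLD,
    }

# A run 2% slower with identical allocations still counts as "same speed"
print(compare_benchmark(102.0, 100.0, 4096.0, 4096.0))
```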
👍
**Throughput/Crank Report ⚡**

Throughput results for AspNetCoreSimpleController comparing the following branches/commits. Cases where the throughput results for the PR are worse than the latest master results (a 5% drop or greater) are shown in red. Note that these results are based on a single point-in-time result for each branch. For full results, see one of the many, many dashboards!

```mermaid
gantt
title Throughput Linux x64 (Total requests)
dateFormat X
axisFormat %s
section Baseline
This PR (5861) (11.540M) : 0, 11540096
master (11.971M) : 0, 11971108
benchmarks/2.9.0 (11.513M) : 0, 11512780
section Automatic
This PR (5861) (7.825M) : 0, 7824502
master (8.005M) : 0, 8004518
benchmarks/2.9.0 (8.207M) : 0, 8206882
section Trace stats
master (8.370M) : 0, 8370086
section Manual
master (11.695M) : 0, 11695259
section Manual + Automatic
This PR (5861) (7.198M) : 0, 7197793
master (7.432M) : 0, 7431601
section DD_TRACE_ENABLED=0
master (11.039M) : 0, 11038724
```
```mermaid
gantt
title Throughput Linux arm64 (Total requests)
dateFormat X
axisFormat %s
section Baseline
This PR (5861) (9.549M) : 0, 9549349
benchmarks/2.9.0 (9.495M) : 0, 9494663
section Automatic
This PR (5861) (6.575M) : 0, 6575354
section Manual + Automatic
This PR (5861) (6.241M) : 0, 6240835
```
```mermaid
gantt
title Throughput Windows x64 (Total requests)
dateFormat X
axisFormat %s
section Baseline
This PR (5861) (10.209M) : 0, 10208988
master (10.190M) : 0, 10189702
section Automatic
This PR (5861) (6.900M) : 0, 6900477
master (6.990M) : 0, 6989682
section Trace stats
master (7.429M) : 0, 7429126
section Manual
master (10.143M) : 0, 10143331
section Manual + Automatic
This PR (5861) (6.549M) : 0, 6549419
master (6.505M) : 0, 6505005
section DD_TRACE_ENABLED=0
master (9.617M) : 0, 9617462
```
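The 5% regression rule for these throughput charts can be sketched as follows; the function and variable names are illustrative assumptions, not the report's actual code.

```python
# Sketch: flag PR throughput results that drop 5% or more below the
# latest master result, per the rule the report describes.

DROP_THRESHOLD = 0.05  # a 5% drop or greater is flagged (shown in red)

def is_regression(pr_requests: int, master_requests: int) -> bool:
    """True if the PR's total requests dropped 5% or more vs. master."""
    drop = (master_requests - pr_requests) / master_requests
    return drop >= DROP_THRESHOLD

# Example with the Linux x64 "Automatic" numbers from the charts above
# (roughly a 2.2% drop, so not flagged):
print(is_regression(7_824_502, 8_004_518))
```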
## Summary of changes

Skip the MassTransit smoke test, as it seems to be a cause of a lot of flakiness.

## Reason for change

We've seen a lot of errors in the `CheckBuildlogsForErr` stage:

```
CheckBuildLogsForErr: 03:08:39 [Error] An error occurred while sending data to the agent at http://127.0.0.1:39573/v0.4/traces. If the error isn't transient, please check https://docs.datadoghq.com/tracing/troubleshooting/connection_errors/?code-lang=dotnet for guidance. System.Net.Http.HttpRequestException: Error while copying content to a stream.
```

These errors seemed to get a lot worse after we disabled keep-alive, but that's anecdotal.

## Implementation details

It's not entirely clear whether the problem is only coincidentally related to the MassTransit test (i.e. a test-ordering issue) or whether it's actually something about the test itself. As a check, I tried skipping the test in this branch and did 4 full (all-TFM) integration test runs without seeing the issue again. That's still anecdotal, but I'd rather trade off the flakiness here. If the problem reappears, we can investigate further.

## Test coverage

Did 4 full runs and didn't see the issue again.
## Other details

Backported to release/2.x in #5911, as we're still getting a lot of flake on that branch.