Build cop notes and tracking 2017-05-17 through 2017-05-31ish #21
That comes back to the discussion we had a few weeks ago: "Should we make all tests using launch_testing match regex instead of full console outputs?" (renaming the .txt files to .regex in this directory, for example). These failures seem to be caused by a false-positive fastrtps error message, even though the participant did in fact match and communicate after all.
@mikaelarguedas I looked at similar failures in the last sprint and the problem is that our rmw specific
Regressions 2017-05-18
Nightly Linux aarch64 Repeated #43
Nightly OSX Repeated #689
Nightly Windows Release #433
These tests have two different outputs, but it isn't clear from the test titles what the difference between them is. These tests ran on icecube, but they have also failed on windshield previously with similar output.
The default consumer doesn't allow any customization of the output (see https://github.com/eProsima/Fast-RTPS/blob/6322cb9e875f685e9f68619143ded9374765ecb9/src/cpp/log/StdoutConsumer.cpp#L21). We would need to provide our own consumer implementation and register it as the default on startup.
Regressions 2017-05-19 🎂
Nightly Linux Packaging #434
The Linux packaging jobs have been green for a while, but when the job has failed recently it's been related to the dynamic_bridge tests.
Nightly OSX Debug
Nightly OSX Release
Nightly OSX Repeated
Other changes
We got a 🍏 Nightly Windows Release build. Looking at the recent history, these seem to pop up from time to time. The frequent timeouts are logged in #11; if the build flakes again, this might be something to revisit now that namespaces have landed.
I found some info on rmw_output_filters for connext but nothing about the one for Fast-RTPS. Are output filters still based only on prefix, or are regex filters supported somewhere?
Only on prefix atm.
So it seems like we have a few possible ways to resolve these false positives:
I spent like four minutes glancing at what it would take to make the upstream logger use color only conditionally. There's no C++ or cross-platform equivalent of isatty, but we could use isatty from C anyway and just leave color always disabled on Windows. If we're conditionally disabling color, we could also honor an environment variable that disables color at a tty for running tests locally. It's currently a compile-time setting based on _WIN32 being defined. We could propose a FASTRTPS_LOG_NOCOLOR compiler flag that uses the empty color definitions cross-platform when defined, but that would require us to only ever use a Fast-RTPS that we build and package with color off (for testing), which might not be a terrible trade-off but shouldn't pass unconsidered.
Regressions 2017-05-22
Nightly Linux AArch64
Nightly Linux Repeated
In addition to the above, there's also a bunch of Missing Result tests on this job.
Nightly OSX Release
The OSX Release build failed with the same connext failures as previous OSX builds.
Regressions 2017-05-26
Nightly Windows Debug #485
This build had 14 new failures, enough that I actually tried to automate the way I generate these reports. Unfortunately the Jenkins REST API will give me the test data but not a URL to the test result pages that I've been linking to, and templating the URL using the test className and name requires a guess as to whether the URL should contain a
No major takeaways from this stint. The only identifiable way to reduce the flakes I looked at would be to swap out Fast-RTPS's logger for one that only outputs ANSI color escapes at terminals, or to build and provide our own. Both seem like too much to worry about before Beta 2, but perhaps after.
Have you created a ticket upstream regarding the ANSI color?
Updating this thread for posterity:
I asked eProsima about (1) and they removed the error messages: eProsima/Fast-DDS#128. So for now, this particular error shouldn't cause flaky tests. For similar issues in the future:
Regressions 2017-05-17
Nightly OSX Repeated #688
The test output shows what appears to be a successful communication between the two processes. The issue looks like it might be related to differences in expected test output.
Nightly Linux Coverage #383
I can't find anything that looks obviously like the reason for the timeouts in the console logs for job #383 demo_nodes_cpp.