*: Upload test junit results #13152
Conversation
Awesome @serathius, this looks great.
Ehm, I hope to see it working. This is my first try at running this config, as I didn't find an easy way to run GitHub Actions locally.
Force-pushed from 1b62d8a to 06aefbf
Heh, has anyone seen an error like:
Force-pushed from 29037cf to 8a5733c
OK, I think generating JUnit reports and uploading them to artifacts works, as those files are available on the test page. On the other hand, it looks like the JUnit upload action doesn't work.
Force-pushed from 859d12f to 1de9d5a
OK, read more about this; apparently it needs to be run as a with
I'm still not sure how using https://github.com/EnricoMi/publish-unit-test-result-action will help us with detecting flakes. It looks more like a way to add test failure information to the GitHub UI; that is definitely useful, but it doesn't solve the problem we care most about: test flakes.

Idea for now: upload JUnit reports to test artifacts for all tests on the main branch and let it run for some time. Once we have enough data, we could fetch the reports from GitHub artifacts and analyse them for flakes. It should be a simple script that lists the names of tests, the runs they failed in, and creates issues based on that.
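The script described above could be sketched roughly as follows. This is a hypothetical illustration, not code from the PR: the report directory layout (`junit-reports/**/*.xml`) and the `collect_failures` helper name are assumptions; it only relies on the standard JUnit XML structure (`<testcase>` elements carrying a `<failure>` or `<error>` child when they failed).

```python
# Hypothetical sketch of the flake-analysis idea: walk downloaded
# JUnit XML reports and record, per test name, which runs it failed in.
import glob
import xml.etree.ElementTree as ET
from collections import defaultdict

def collect_failures(report_paths):
    """Map 'classname.testname' -> list of report files where it failed."""
    failures = defaultdict(list)
    for path in report_paths:
        root = ET.parse(path).getroot()
        # A failed <testcase> carries a <failure> (or <error>) child.
        for case in root.iter("testcase"):
            if case.find("failure") is not None or case.find("error") is not None:
                name = f'{case.get("classname")}.{case.get("name")}'
                failures[name].append(path)
    return failures

if __name__ == "__main__":
    # e.g. after downloading artifacts into junit-reports/<run-id>/...
    reports = glob.glob("junit-reports/**/*.xml", recursive=True)
    # Most-flaky tests first; issues could be filed from this list.
    for test, runs in sorted(collect_failures(reports).items(),
                             key=lambda kv: -len(kv[1])):
        print(f"{test}: failed in {len(runs)} run(s)")
```

A real version would also track the total number of runs per test, so that a test failing in 3 of 500 runs (a likely flake) can be distinguished from one failing in 3 of 3 (a genuine breakage).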
Force-pushed from 9a018c5 to 3895ec0
Force-pushed from de944ef to 97e3df1
Force-pushed from 4a550f4 to 0f0481a
Force-pushed from b7a32ed to bbe3889
.github/workflows/tests.yaml
Outdated
- linux-386-unit-1-cpu
- linux-amd64-e2e
Do we need to merge the e2e tests?
The benefit of keeping them separate is that on a flake only a small subset needs to be retried.
I assume the goal is to limit the number of files with test results, but I'm not sure that pays off.
I don't think non-maintainers can retry tests the same way; they can only push another commit, which retriggers all tests.
The goal is to have one workflow that generates JUnit reports, as that makes fetching results easier (one place instead of multiple) and makes it possible to use https://github.com/marketplace/actions/publish-unit-test-results, which needs to run once, after the JUnit reports are generated.
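A step generating JUnit reports in such a workflow might look roughly like this. This is a hedged sketch, not the PR's actual config: the job name, `gotestsum` flags, and action versions are assumptions (`gotestsum` is a `go test` wrapper that can emit JUnit XML via `--junitfile`).

```yaml
# Hypothetical workflow fragment: run tests, emit a JUnit report,
# and upload it as an artifact even when tests fail.
jobs:
  unit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: |
          go install gotest.tools/gotestsum@latest
          gotestsum --junitfile junit-report.xml ./...
      - uses: actions/upload-artifact@v2
        if: always()   # keep the report when the test step fails
        with:
          name: junit-reports
          path: junit-report.xml
```

With all reports landing in one artifact, a single downstream job can download and publish them in one pass.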
Reverted the change; let's start with unit tests and decide later how to handle the other test scenarios.
Force-pushed from c2f2539 to feeee86
The last test flake pushed me to continue work on JUnit reporting. I hope that getting aggregated test results will help identify flakes.