We don't seem to get detail on test results? #183
I really liked the idea of this action, and added it to xarray.
Unfortunately, unless I'm missing it, we don't seem to be getting one of the big benefits of this: immediate detail on what failed in tests.
If we're doing something wrong in our configs, I'd love to know.
ref pydata/xarray#5946
Thank you!

Comments
Sure, can you point me to the commit that has test failures? Then I can have a look at what went wrong.
Thanks a lot @EnricoMi ! Here's an example: pydata/xarray#5873. I see there that no test failures are reported in the comment, so possibly our config is incorrect. I thought I copied it from your (very clear) readme, but it's likely I made a mistake. I just had another look but can't see what it might be; if you spot anything, that would be greatly appreciated.
Yeah, found another one: pydata/xarray#5734. The comment got overwritten by a later run, which had no test failures. Check the third-last edit of the comment: it links a check, and that one should contain some failures, but it does not: https://github.com/pydata/xarray/runs/4122341392. The log of the respective action run does not show any problems. The strange thing is that the action creates multiple annotations (as can be seen in the log), but only 4 annotations appear on the check. So the GitHub API somehow swallows some annotations. This is not good.
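For reference, one way to verify how many annotations a check run actually stored is to list them through the GitHub Checks API. This is a minimal diagnostic sketch, not part of the action; the token and the check-run ID are placeholders.

```python
# Minimal sketch: count the annotations GitHub actually stored for a check run.
# TOKEN and CHECK_RUN_ID are placeholders; the token needs read access to the
# repository's checks.
import requests

TOKEN = "ghp_..."          # hypothetical token
OWNER, REPO = "pydata", "xarray"
CHECK_RUN_ID = 4122341392  # placeholder; the numeric ID from the check-run URL

url = f"https://api.github.com/repos/{OWNER}/{REPO}/check-runs/{CHECK_RUN_ID}/annotations"
headers = {
    "Accept": "application/vnd.github+json",
    "Authorization": f"token {TOKEN}",
}

# The endpoint is paginated, so walk all pages before counting.
annotations = []
page = 1
while True:
    resp = requests.get(url, headers=headers, params={"per_page": 100, "page": page})
    resp.raise_for_status()
    batch = resp.json()
    if not batch:
        break
    annotations.extend(batch)
    page += 1

print(f"check run {CHECK_RUN_ID} has {len(annotations)} annotations")
for a in annotations:
    print(a["annotation_level"], a["path"], a.get("message", "")[:80])
```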
In your example, the comment also got overwritten by a later run; the second-last edit has a failure, and that one has failure annotations. That check has all annotations.
@max-sixty your setup looks good, so it looks like an issue with the GitHub API. However, I recommend using the latest setup: pydata/xarray#5947
@EnricoMi thank you so much! Let's give this a whirl!
I reckon this is a transient issue with the GitHub API. Let's monitor the annotations for a while and see if we spot more instances where some are missing (for commits with and without failures).
@max-sixty I have looked through a few dozen check results in your project and all have the expected annotations. If you spot a run that produces incomplete annotations, simply rerun the publish workflow. This can be done without rerunning the tests themselves, as the workflows are separated. If this occurs too frequently, then the issue becomes reproducible and we could do some debugging.
Great! Thanks a lot @EnricoMi ! I'll close this for now and reopen if we see this again, if that works for you.
@max-sixty I think I have found an issue that may explain your observation: #215. When there are more than 50 annotations (test failures, test name list, ...), only the last (count modulo 50) annotations appear. It is now clear how this happened, and it is fixed in master. I am about to release a new version.
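For context, the GitHub Checks API accepts at most 50 annotations per update request, and annotations are appended across requests, so a publisher has to send them in batches; if only the final batch reaches the API, only the remainder after dividing by 50 shows up, which matches the symptom above. The sketch below illustrates the general batching pattern with plain `requests`; it is not the action's actual code, and the token and check-run ID are placeholders.

```python
# Illustrative sketch (not the action's code): send annotations to a check run
# in batches of at most 50, the per-request limit of the Checks API.
# Requires a token authorized to write checks (e.g. a GitHub App installation token).
import requests

TOKEN = "ghp_..."          # hypothetical token
OWNER, REPO = "pydata", "xarray"
CHECK_RUN_ID = 123456789   # placeholder

url = f"https://api.github.com/repos/{OWNER}/{REPO}/check-runs/{CHECK_RUN_ID}"
headers = {
    "Accept": "application/vnd.github+json",
    "Authorization": f"token {TOKEN}",
}

# Example annotations; a real publisher would build these from test results.
annotations = [
    {
        "path": "xarray/tests/test_dataset.py",
        "start_line": 1, "end_line": 1,
        "annotation_level": "failure",
        "message": f"test failure {i}",
    }
    for i in range(120)
]

# GitHub appends annotations across update requests, so every batch must be
# sent; sending only the last batch would leave 120 % 50 = 20 annotations.
for start in range(0, len(annotations), 50):
    batch = annotations[start:start + 50]
    resp = requests.patch(url, headers=headers, json={
        "output": {
            "title": "Test Results",
            "summary": "see annotations",
            "annotations": batch,
        },
    })
    resp.raise_for_status()
```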
Great! Thanks @EnricoMi !