
Tests are skipped (+ passed) on cuda build #229

Open
justinlaughlin opened this issue Jul 2, 2024 · 4 comments

@justinlaughlin
Contributor

Lately, ex18 and ex23 have been causing tests to fail. I noticed that on the CUDA build, no tests actually run on either mfem or pymfem: the example binaries fail to launch, yet each test is reported as PASS(-1) because the "generated files are the same". This should probably be a fail.

Here is an example:

https://github.com/mfem/PyMFEM/actions/runs/9752638384/job/26916497223

```
Running : ex0.py
/opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/mfem/external/ser/examples/ex0: error while loading shared libraries: libcusparse.so.12: cannot open shared object file: No such file or directory
  File "/opt/hostedtoolcache/Python/3.9.19/x64/lib/python3.9/site-packages/mfem/ser.py", line 2, in <module>           : False
No difference in generate files (Passed output file check) in exe
ex0  PASS(-1): generated files are the same, but terminal output differs
```
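
For context, the pass condition effectively reduces to a file diff between the C++ and Python runs, so when neither binary launches the comparison is vacuous. A minimal sketch of that failure mode, assuming hypothetical directory and helper names (this is not the actual PyMFEM test runner):

```python
# Sketch of how a diff-based check passes vacuously when neither
# example ran; directory names and the helper are hypothetical.
from pathlib import Path

def generated_files_match(cxx_dir: str, py_dir: str) -> bool:
    """True when both (existing) run directories hold identical files."""
    cxx = {p.name: p.read_bytes() for p in Path(cxx_dir).glob("*") if p.is_file()}
    py = {p.name: p.read_bytes() for p in Path(py_dir).glob("*") if p.is_file()}
    # If both examples died while loading libcusparse.so.12, both dicts
    # are empty, {} == {} holds, and the harness reports PASS(-1).
    return cxx == py
```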
@sshiraiwa
Member

Closing this, since it was addressed with the 4.7 release.

@justinlaughlin
Contributor Author

I think this should stay open. The 4.7 release fixed the CUDA installation, but this issue is about the CUDA tests always passing even though they aren't actually running; they merely look like they are (at least, their output is PASS(-1)). We still see this now, e.g. in this run.

@sshiraiwa reopened this Aug 9, 2024
@sshiraiwa
Member

You are right. Reopened. It looks like both examples (C++ and Python) are failing to run due to a run-time library linking issue?
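
One quick way to confirm the linking theory from within the CI job; a diagnostic sketch using ctypes, where only libcusparse.so.12 is taken from the log above:

```python
# Probe whether the dynamic loader can resolve the CUDA library the
# examples failed on; a diagnostic sketch, not part of the test suite.
import ctypes

try:
    ctypes.CDLL("libcusparse.so.12")
    print("libcusparse.so.12: resolved")
except OSError as exc:
    # Same failure as in the log: the example binary would exit
    # before producing any output files.
    print(f"libcusparse.so.12: {exc}")
```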

@justinlaughlin
Contributor Author

Looks like it. We might also want to consider changing the test behavior so that instead of PASS(-1) it is a FAIL; the current behavior makes this kind of breakage easy to slip under the radar. If it is common/okay for terminal output to differ, maybe we can add a condition that the test fails when no files are generated, as in the sketch below?
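
Concretely, a minimal sketch of that rule (helper and directory names are hypothetical, not the current runner code):

```python
# Sketch of the proposed rule: report FAIL when a run generated no
# output files at all, instead of the vacuous PASS(-1).
from pathlib import Path

def verdict(cxx_dir: str, py_dir: str, files_match: bool) -> str:
    cxx_files = [p for p in Path(cxx_dir).glob("*") if p.is_file()]
    py_files = [p for p in Path(py_dir).glob("*") if p.is_file()]
    if not cxx_files and not py_files:
        # Nothing was written: the examples almost certainly never ran.
        return "FAIL: no output files were generated by either run"
    if files_match:
        return "PASS(-1): generated files are the same, but terminal output differs"
    return "FAIL: generated files differ"
```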
