Tests that test for completion & correctness #20
We'd be better off with some small, carefully designed geometry that we query with CGM after conversion. This means we don't have to worry about inconsistencies between ACIS versions, or even need to be bound to ACIS.
For the relatively simple geometries already in the test folder, wouldn't the import of a SAT file still work despite different ACIS versions?
Sorry - I thought you just wanted to compare files... What you suggest may work, but if the geometries are sufficiently simple, we can hard-code the values we want to compare rather than comparing with a file, thus insulating the process from file versions and/or solid modeling engine choices.
What I'd like to avoid (if possible) is a few hard-coded values we're comparing, in favor of the boolean operations available to us in CGM. A robust subtraction/intersection process would cover any and all hard-coded tests we might want to perform. What we might do in order to insulate the testing process from file versions or underlying solid modeling engines is to write functions which create the expected geometry using iGeom/CGMA in the same program and then do this subtraction. The way I currently envision it, in order to create a new test you would simply add the new card and then a function in the test suite corresponding to that card. The test suite would then run mcnp2cad on the new test card, run the corresponding function to create the expected geometry, intelligently subtract the two, and test for a final NULL set. @makeclean expressed some concern that the subtraction process might not be robust due to numerical tolerance issues. I'm going to investigate that a little bit today, but I expect that at least for the simple geometries involved in the unit tests it should be sufficient to provide the expected results.
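The card-plus-builder workflow described above could be sketched roughly as follows. This is only an illustrative Python sketch: mcnp2cad is a C++ tool, and `run_mcnp2cad`, the `expected_geometry` registry, and the set-based volume representation are all hypothetical stand-ins for the real iGeom/CGMA calls.

```python
# Hypothetical sketch of the proposed test structure: each test card
# registers a builder function that constructs the expected geometry,
# and the suite subtracts the actual result from it, expecting NULL.

EXPECTED_BUILDERS = {}

def expected_geometry(card_name):
    """Register the expected-geometry builder for a given test card."""
    def register(fn):
        EXPECTED_BUILDERS[card_name] = fn
        return fn
    return register

@expected_geometry("unit_cube.i")
def build_unit_cube():
    # Stand-in for an iGeom/CGMA brick: represent the volume as a set
    # of lattice cells so boolean subtraction is exact here.
    return {(x, y, z) for x in range(2) for y in range(2) for z in range(2)}

def run_mcnp2cad(card_name):
    # Placeholder for running mcnp2cad on the card and loading the
    # resulting geometry; in this toy it returns the same cube.
    return {(x, y, z) for x in range(2) for y in range(2) for z in range(2)}

def check_card(card_name):
    actual = run_mcnp2cad(card_name)
    expected = EXPECTED_BUILDERS[card_name]()
    # Symmetric difference plays the role of subtracting each volume
    # from the other; an empty result is the NULL set we want.
    return (actual - expected) | (expected - actual)

leftover = check_card("unit_cube.i")
print(len(leftover))  # 0: the geometries match
```

Adding a new test would then mean adding one input card and one registered builder function, with no reference SAT files to keep in sync.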
For what it's worth, I agree with @makeclean that "subtract two geometries that should be identical and test for an empty result" would be an invitation for numerical stability problems.
If the same calls are being made to the underlying solid modeling engine, I think the boolean operations should be able to produce the desired null result, but @gonuke made a good point in our last research meeting that if this is the case, then it may be better to simply test that the correct functions are being called with the right arguments. This way we aren't relying on the robustness of the solid modeling engine itself.
Right now, tests in mcnp2cad seem to consist of a directory holding a set of test input files which mcnp2cad should be able to process without error. They do not, however, test whether the resulting file is correct.
I'd like to propose that we add SAT files representative of each input file to this directory (or a subdirectory). Then, after running mcnp2cad on the test input files, we can compare the known SAT file to the mcnp2cad-generated SAT file by subtracting/intersecting one set of volumes from the other in CGM/iGeom. A successful result is a file in which no entities exist; a failure is any file containing entities at all.
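Given the tolerance concerns raised above, the pass/fail criterion could be softened from "literally zero entities" to "negligible leftover volume." The sketch below assumes the leftover volume from the boolean difference can be measured (as CGM/iGeom measure calls allow for real solids); the function names are hypothetical.

```python
# Sketch of a tolerance-aware version of the subtract-and-check test.
# Booleans on nominally identical volumes can leave tiny sliver
# solids from round-off; comparing the leftover volume against a
# tolerance guards against spurious failures.

TOLERANCE = 1e-8  # allowed leftover, as a fraction of the reference volume

def volumes_match(reference_volume, leftover_volume, tol=TOLERANCE):
    """Pass if the boolean difference is negligibly small."""
    return leftover_volume <= tol * reference_volume

# A round-off sliver left over from a unit-volume comparison passes:
print(volumes_match(1.0, 1e-12))  # True
# A genuinely missing chunk of the geometry fails:
print(volumes_match(1.0, 0.05))   # False
```

This keeps the geometry-level comparison proposed here while directly addressing the numerical-stability objection.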
Thoughts?