Feature/bsk 857 polymorphic state data #858
base: develop
Conversation
Force-pushed from 9b6777c to 93442a2.
Hello @joaogvcarneiro! I have a test failing on this branch. It's related to the variable integrator scenario, which I believe you wrote? I made some changes to the adaptive integrators; the math should be the same, but the numerics have shifted around. I think the failing test is checking to an accuracy that is within the integrator error, so the updated numerics trigger the comparison failure. That's only my intuition, so I'd appreciate your take on it if you have a minute.
I think your intuition is correct. You could vary the timestep to see if the results improve. Further, all integrators have an upper bound on the error that depends on the timestep, so you could do some correlation to determine whether the errors are changing as expected. This link is a good resource. I hope this helps!
The integrators do behave as expected: I will try lowering the timestep of the scenario as you suggested and see if that helps. I suspect that doing so will produce results closer to the real truth, but maybe not closer to the "truth" value used in the test (since the test "truth" could have been obtained with an integration error larger than the test accuracy). If that's the case, is it OK if I update the truth value in the test and change the accuracy to reflect the expected integrator error?
Yes, it is OK to update the truth value for this test. In the commit message, just explain why it is being updated and how the new truth value was computed. Thanks for doing the timestep refinement test to validate the expected behavior.
If we're changing the truth values, then I'd argue for removing this "magic number" approach altogether. We're seeing precisely why hardcoding truth values can be so confusing: we don't know where these results come from, what integrator was used, what the tolerance was, etc. If we change these numbers to some others with a lower tolerance, we'll end up in the same situation a few years down the line. The test checks the position for some orbits defined in the variable integrator scenario. Couldn't we find the analytical value for the position at any point in the orbit? That would be the actual truth value. It would obviously take some time to do the math, but once it's done, we would never have to change it again and we'd avoid all these issues with magic numbers.
I haven't looked at the details of the test yet. In many cases we had separate code that computed the numerical truth values; that is a documented approach. In other cases the test was rather a consistency check. In this branch's case, if we assume Keplerian motion and solve Kepler's equation to sufficient accuracy, we could use that as a truth value and check that we are close to it. However, note that this isn't the correct truth for an integrator; the better truth would be to duplicate the integration method to have consistent math. But I think that is overkill for this branch.
- add release notes
- add warning about new *.I file inclusion, mention in release
- check HTML documentation for the changed files (class method descriptions present, etc.), no Sphinx warnings.
src/simulation/dynamics/_GeneralModuleFiles/extendedStateVector.cpp
Also improvements to docstrings in StateData
Force-pushed from d29e0f1 to 47a39e4.
DynParamManager now makes use of some templated code, for which we need to provide some SWIG machinery.
Now, we make use of the StateData propagate function, which can be overridden to provide a custom state integration function.
The method is used in a test script.
For posterity: I ended up computing a reference analytical solution for this test (it's a simple two-body problem). I used test tolerances that make sense and should be robust to numerical jitter.
Force-pushed from 47a39e4 to 61581c6.
Description
`StateData` was made polymorphic; `StateData::propagateState` is now virtual. This also required improvements to how the integrators work.
`ExtendedStateVector` was given additional methods and updated to handle the polymorphic `StateData`. Similarly, `DynParamManager` had to be updated (`registerState` is now a templated method for different state classes, for example). Unfortunately, this means a SWIG file is now necessary for `DynParamManager` (templated code needs some extra massaging in SWIG), which in turn means a lot of SWIG interface files must be updated to use this new SWIG file.
I'm a good boy scout, so I also tried to clean up code and add extra docstrings in these classes.
Verification
Commit 8976004 (or its current equivalent) removes support for `operator+` and `operator*` on `StateData`, which `test_stateArchitecture` depended on. This test has been updated to use other, equivalent methods.
No new tests were added. Essentially all other Basilisk tests use `StateData` and integration, so these tests passing should prove the refactored system works as expected. For now, only one `StateData` class appears in Basilisk (with the standard propagation function), and that should be shown to work. New `StateData` classes should be tested as they are implemented.
Documentation
No changes. The API remains very similar: by default, `DynParamManager` assumes you want to create a standard `StateData`. The state system is not discussed in the prose documentation, so no changes are needed.
Future work
Implement custom `StateData` classes, such as the `MRPStateData` or `QuaternionStateData` classes described in #857.