
Feature/bsk 857 polymorphic state data #858

Open. Wants to merge 14 commits from feature/bsk-857-polymorphic-StateData into develop.

Conversation

@juan-g-bonilla (Contributor) commented Nov 25, 2024

Description

StateData was made polymorphic; StateData::propagateState is now virtual.
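
For illustration, here is a minimal sketch of what a polymorphic state class with a virtual propagation method can look like. The names and signatures are simplified for this description and are not the actual Basilisk declarations:

```cpp
#include <Eigen/Dense>

// Simplified illustration of a polymorphic state container. Only the idea
// matters here: the propagation step is a virtual method that derived
// classes can override with a different update rule.
class StateData {
public:
    Eigen::MatrixXd state;       // current state values
    Eigen::MatrixXd stateDeriv;  // current state derivative

    virtual ~StateData() = default;

    // Default propagation: simple additive update state += deriv * dt.
    virtual void propagateState(double dt)
    {
        this->state += this->stateDeriv * dt;
    }
};
```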

This also required improvements to how the integrators work. ExtendedStateVector was given additional methods and updated to handle the polymorphic StateData. Similarly, DynParamManager had to be updated (registerState is now a templated method for different state classes, for example). Unfortunately, this means a SWIG file is now necessary for DynParamManager (templated code needs some extra massaging in SWIG), which in turn means a lot of SWIG interface files must be updated to use this new SWIG file.
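
As a rough sketch of how a templated registration method can look (hypothetical, simplified signature building on the StateData sketch above, not the actual DynParamManager API), the manager can default to the standard StateData while letting callers request a derived state class:

```cpp
#include <map>
#include <memory>
#include <string>

// Hypothetical, simplified manager: registerState is templated on the
// concrete state class and defaults to the standard StateData.
class DynParamManager {
public:
    template <typename StateType = StateData>
    StateType* registerState(unsigned int nRow, unsigned int nCol, const std::string& name)
    {
        auto newState = std::make_unique<StateType>();
        newState->state.setZero(nRow, nCol);
        newState->stateDeriv.setZero(nRow, nCol);
        StateType* rawPtr = newState.get();
        this->stateContainer[name] = std::move(newState);  // stored through the base class
        return rawPtr;
    }

private:
    std::map<std::string, std::unique_ptr<StateData>> stateContainer;
};

// Usage sketch (MRPStateData is a hypothetical derived class):
//   StateData*    hubPos = manager.registerState(3, 1, "hubPosition");
//   MRPStateData* sigma  = manager.registerState<MRPStateData>(3, 1, "hubSigma");
```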

I'm a good boy scout, so I also tried to clean up code and add extra docstrings in these classes.

Verification

Commit 8976004 (or its current equivalent) removes support for operator+ and operator* on StateData, which test_stateArchitecture depended on. This test has been updated to use other, equivalent methods.

No new tests were added. Essentially all other Basilisk tests use StateData and integration, so these tests passing should show that the refactored system works as expected. For now, only one StateData class exists in Basilisk (with the standard propagation function), and that is the class the existing tests exercise. New StateData classes should be tested as they are implemented.

Documentation

No changes. The API remains very similar: by default, DynParamManager assumes you want to create a standard StateData. The state system is not discussed in the prose documentation, so no changes are needed there.

Future work

Implement custom StateData classes, such as the MRPStateData or QuaternionStateData classes described in #857 (see the sketch below).
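
As a rough sketch of where this could go (hypothetical class, not code from this PR), a derived state could override propagateState to replace the default additive update, for example switching to the shadow MRP set after the step to avoid the MRP singularity:

```cpp
// Hypothetical custom state class building on the StateData sketch above.
class MRPStateData : public StateData {
public:
    void propagateState(double dt) override
    {
        // default additive update
        this->state += this->stateDeriv * dt;

        // switch to the shadow Modified Rodrigues Parameter set when |sigma| > 1
        double normSq = this->state.squaredNorm();
        if (normSq > 1.0) {
            this->state = -this->state / normSq;
        }
    }
};
```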

@juan-g-bonilla self-assigned this Nov 25, 2024
@juan-g-bonilla requested a review from a team as a code owner on November 25, 2024
@juan-g-bonilla force-pushed the feature/bsk-857-polymorphic-StateData branch 3 times, most recently from 9b6777c to 93442a2 on November 27, 2024
@juan-g-bonilla (Contributor, Author) commented

Ola @joaogvcarneiro! I have a test failing for this branch. It's related to the variable integrator scenario, which I believe you wrote?

I made some changes to the adaptive integrators. The math should be the same, but the numerics have shifted around. I think the failing test is testing to an accuracy that is within the integrator error, so the updated numerics trigger the comparison failure. That's only my intuition, so I'd appreciate your take on it if you have a minute.

@juan-g-bonilla added the enhancement (New feature or request) label on Nov 27, 2024
@joaogvcarneiro (Contributor) commented

> Ola @joaogvcarneiro! I have a test failing for this branch. It's related to the variable integrator scenario, which I believe you wrote?
>
> I made some changes to the adaptive integrators. The math should be the same, but the numerics have shifted around. I think the failing test is testing to an accuracy that is within the integrator error, so the updated numerics trigger the comparison failure. That's only my intuition, so I'd appreciate your take on it if you have a minute.

I think your intuition is correct. You could vary the timestep to see if the results improve. Further, all integrators have an upper bound on the error that depends on the timestep, so you could do some correlation to determine if the errors are changing as expected. This link is a good resource. I hope this helps!
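
As a sketch of the kind of correlation check being suggested here (generic code, not tied to any Basilisk API): given the error against a reference solution at two step sizes, the observed order of convergence can be estimated and compared to the integrator's nominal order.

```cpp
#include <cmath>
#include <cstdio>

// Estimate the observed convergence order p from errors e1, e2 obtained with
// step sizes h1, h2: e ~ C * h^p implies p = log(e1/e2) / log(h1/h2).
double observedOrder(double e1, double h1, double e2, double h2)
{
    return std::log(e1 / e2) / std::log(h1 / h2);
}

int main()
{
    // Hypothetical numbers: halving the step reduces the error ~16x,
    // consistent with a 4th-order method such as RK4.
    double p = observedOrder(1.6e-3, 10.0, 1.0e-4, 5.0);
    std::printf("observed order ~ %.2f\n", p);
    return 0;
}
```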

@juan-g-bonilla (Contributor, Author) commented

> > Ola @joaogvcarneiro! I have a test failing for this branch. It's related to the variable integrator scenario, which I believe you wrote? I made some changes to the adaptive integrators. The math should be the same, but the numerics have shifted around. I think the failing test is testing to an accuracy that is within the integrator error, so the updated numerics trigger the comparison failure. That's only my intuition, so I'd appreciate your take on it if you have a minute.
>
> I think your intuition is correct. You could vary the timestep to see if the results improve. Further, all integrators have an upper bound on the error that depends on the timestep, so you could do some correlation to determine if the errors are changing as expected. This link is a good resource. I hope this helps!

The integrators do behave as expected:

[figure: position error and integration time as a function of time step, order, and tolerance]

The position error and integration time evolve as I would expect as a function of the time step, order, and tolerance.

I will try lowering the time step of the scenario as you suggested and see if that helps. I suspect that doing so will produce results closer to the real truth, but maybe not closer to the "truth" value used in the test (since the test "truth" could have been obtained with an integration error larger than the test accuracy). If that's the case, is it ok if I update the truth value in the test and change the accuracy to reflect the expected integrator error?

@schaubh (Contributor) commented Dec 7, 2024

Yes, it is ok to update the truth value for this test. In the commit message, just explain why it is being updated and how the new truth value was computed. Thanks for doing the time step refinement test to validate the expected behavior.

@joaogvcarneiro (Contributor) commented

If we're changing the truth values, then I'd argue for removing this "magic number" approach altogether. We're seeing precisely why hardcoding truth values can be so confusing: we don't know where these results come from, what integrator was used, what the tolerance was, etc. If we change these numbers to some others with a lower tolerance, we're going to end up in the same situation a few years down the line.

The test checks the position for some orbits defined in the variable integrator scenario. Couldn't we find the analytical value for the position at any point in the orbit? That would be the actual truth value. It would obviously take some time to do the math, but once it's done, we wouldn't have to change it ever again and we'd avoid all these issues with magic numbers.

@schaubh (Contributor) commented Dec 7, 2024

I haven’t looked at the details of the test yet. In many cases we had separate code that computed the numerical truth values. That is a documented approach. In other cases the test was rather a consistency check. In this branch's case, if we assume Keplerian motion and we solve Kepler’s equation to sufficient accuracy, we could use that as a truth value. We check if we are close to that value.

However, note that this truth isn’t the correct truth for an integrator. Rather, the better truth would be to duplicate the integration method to have consistent math. But I think this is overkill for this branch.
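
For reference, a minimal sketch of the Keplerian truth computation being discussed (generic two-body math under the assumption of an elliptic orbit; not code from the PR or the test): solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly with Newton iteration, then recover the radius and true anomaly.

```cpp
#include <cmath>

// Solve Kepler's equation M = E - e*sin(E) for the eccentric anomaly E
// using Newton-Raphson iteration (elliptic orbits, 0 <= e < 1).
double solveKepler(double M, double e, double tol = 1e-12)
{
    double E = M;  // initial guess
    for (int i = 0; i < 50; ++i) {
        double dE = (E - e * std::sin(E) - M) / (1.0 - e * std::cos(E));
        E -= dE;
        if (std::fabs(dE) < tol) {
            break;
        }
    }
    return E;
}

// Radius and true anomaly at time t past periapsis for a two-body orbit with
// semi-major axis a, eccentricity e, and gravitational parameter mu.
void keplerianPosition(double a, double e, double mu, double t,
                       double& radius, double& trueAnomaly)
{
    double n = std::sqrt(mu / (a * a * a));  // mean motion
    double M = n * t;                        // mean anomaly
    double E = solveKepler(M, e);            // eccentric anomaly
    radius = a * (1.0 - e * std::cos(E));
    trueAnomaly = 2.0 * std::atan2(std::sqrt(1.0 + e) * std::sin(E / 2.0),
                                   std::sqrt(1.0 - e) * std::cos(E / 2.0));
}
```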

@schaubh (Contributor) left a review comment:

  • add release notes
  • add a warning about the new *.i file inclusion, and mention it in the release notes
  • check the HTML documentation for the changed files (class method descriptions present, etc.); no Sphinx warnings

@juan-g-bonilla force-pushed the feature/bsk-857-polymorphic-StateData branch from d29e0f1 to 47a39e4 on December 13, 2024
@juan-g-bonilla (Contributor, Author) commented

> I haven’t looked at the details of the test yet. In many cases we had separate code that computed the numerical truth values. That is a documented approach. In other cases the test was rather a consistency check. In this branch's case, if we assume Keplerian motion and we solve Kepler’s equation to sufficient accuracy, we could use that as a truth value. We check if we are close to that value.
>
> However, note that this truth isn’t the correct truth for an integrator. Rather, the better truth would be to duplicate the integration method to have consistent math. But I think this is overkill for this branch.

For posterity: I ended up computing a reference analytical solution for this test (it's a simple two-body problem). I used test tolerances that make sense and should be robust to numerical jitter.

@juan-g-bonilla force-pushed the feature/bsk-857-polymorphic-StateData branch from 47a39e4 to 61581c6 on December 13, 2024
Merging this pull request may close the following issue: Make StateData polymorphic to support alternative state propagation equations.