
Unit Tests for PointSourceResponse #243

Open
wants to merge 2 commits into base: develop

Conversation

augustus-thomas

Partially addresses response testing for Issue #226

  • PointSourceResponse.py now has its own associated test file, test_point_source_response.py.
  • Many spectral model types are exercised through the get_expectation method call, which should ensure full coverage.
  • The point source response under test is generated by a call to FullDetectorResponse.get_point_source_response().
  • The FullDetectorResponse used is built from test_full_detector_response.h5.

Let me know if this PR is helpful! I am very excited to contribute something to this team :)
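A rough sketch of the test layout described above, using a mock in place of the real response object (every name except test_point_source_response.py is illustrative, not cosipy's actual API):

```python
import numpy as np

class FakeResponse:
    """Stand-in for the PointSourceResponse returned by
    FullDetectorResponse.get_point_source_response(); illustrative only."""
    def get_expectation(self, norm):
        # return mock expected-counts contents for a flat spectrum
        return np.full((4, 4), norm)

# built once at module level and reused by every test
psr = FakeResponse()

def test_get_expectation_runs():
    # smoke test in the spirit of this PR: the call runs and returns
    # finite, non-negative contents
    expectation = psr.get_expectation(0.1)
    assert np.all(np.isfinite(expectation))
    assert np.all(expectation >= 0)

test_get_expectation_runs()
```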


codecov bot commented Sep 19, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 73.30%. Comparing base (99fdff7) to head (5ab3b13).
Report is 13 commits behind head on develop.

see 1 file with indirect coverage changes

@israelmcmc
Collaborator

Thanks @augustus-thomas! This looks good to me. One suggestion: right now you are mostly testing that the various methods run without crashing, which is a good start. It would be even better to also test that the output is what's expected. For example, you could add a check (an assert) that the total expectation equals (or approximately equals) what we currently get, so that if a future version of the code breaks this it won't go unnoticed.

Please let me know if you have time and are willing to do this. I understand if you don't, in which case I would merge this anyway since it's nonetheless progress!
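The suggested regression check might look roughly like this, with a hypothetical pinned reference value and mock contents standing in for np.sum(psr.get_expectation(model).contents); a relative tolerance avoids brittle hard-coded bounds:

```python
import math
import numpy as np

reference_total = 631.5              # hypothetical value pinned from a trusted run
contents = np.full((10, 10), 6.315)  # mock expectation contents

# regression assert: fail loudly if a future change shifts the total
total = float(np.sum(contents))
assert math.isclose(total, reference_total, rel_tol=1e-3)
```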

@augustus-thomas
Author

augustus-thomas commented Sep 20, 2024

Hey! Thanks for the feedback here @israelmcmc. I would like to ensure that the expectations are correct, and I do have the time to push a new commit for that.

I will follow the example in test_detector_response for the assert statements. I have been inspecting the values of the np.ndarray returned by expectation.contents.todense(), and it turns out they are filled with zeros! I think the spectra need to be initialized in some non-trivial way, but I am not sure how yet.
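A hedged illustration of the all-zeros symptom described above, with mock arrays rather than cosipy objects: if the spectral model's normalization is left at a trivial (zero) value, folding it through any response yields identically zero expected counts.

```python
import numpy as np

response = np.ones((8, 16))      # mock response matrix (measured x true bins)

def expectation(norm):
    flux = norm * np.ones(16)    # spectrum evaluated per true-energy bin
    return response @ flux       # fold spectrum through the response

assert np.all(expectation(0.0) == 0)   # trivial init: contents are all zeros
assert np.any(expectation(1e-3) > 0)   # non-trivial normalization fixes it
```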

@israelmcmc
Collaborator

Sounds good, thanks @augustus-thomas. You can see an example of how to initialize a model and get the expectation here: https://cositools.github.io/cosipy/tutorials/response/DetectorResponse.html
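A minimal sketch of "initialize a model and get the expectation", assuming a plain power-law written in numpy rather than astromodels' actual classes: the point is to give the spectrum explicit, non-default parameters before folding it through a response.

```python
import numpy as np

def powerlaw(energy, K=1e-3, index=-2.0, piv=100.0):
    # dN/dE = K * (E / piv)**index; parameter values chosen for illustration
    return K * (energy / piv) ** index

energies = np.geomspace(100.0, 5000.0, 16)   # keV grid, illustrative
flux = powerlaw(energies)                    # non-trivial spectrum
response = np.ones((8, energies.size))       # mock response matrix
expected_counts = response @ flux            # stands in for get_expectation()

assert np.all(flux > 0) and expected_counts.sum() > 0
```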

@augustus-thomas
Author

This is now updated based on our conversations. The forced push was because I included some other commits by accident.

@israelmcmc
Collaborator

I'll close and reopen this PR to kick the unit tests, which seem stuck.

@israelmcmc israelmcmc closed this Dec 6, 2024
@israelmcmc israelmcmc reopened this Dec 6, 2024
@israelmcmc
Collaborator

@augustus-thomas Thanks for the changes. It seems something is off with the unit test 🤔. Please take a look at the "checks" tab. I tried to reproduce this on my system; the tests also fail at that point, but for a different reason:

    def test_get_expectation():
        # supported spectral functions
        ## see astromodels.functions.function.Function1D.[function_name]()
        ## 3 sig-figs
>       assert 631 < np.sum(psr.get_expectation(Constant(k=1e-1)).contents) < 632
E       AssertionError: assert 219126308568.4652 < 632

🤔

Have you seen this?
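An illustrative diagnostic, not part of the PR: when the summed expectation is off by many orders of magnitude, as in the traceback above, the log10 of the ratio quickly separates a units or normalization mismatch from small numerical drift.

```python
import math

pinned = 631.5                 # midpoint of the failing assert's bounds
observed = 219126308568.4652   # value reported in the traceback

magnitude = math.log10(observed / pinned)
print(f"observed/pinned spans about 10^{magnitude:.1f}")
# a clean power-of-ten ratio would hint at a units or normalization issue
```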
