
Add loss functions for phase retrieval #236

Merged: 25 commits merged into main from brendt/abs-loss on Mar 3, 2022

Conversation

bwohlberg (Collaborator)

Add loss functions for phase retrieval.

The names of the new loss functions are rather cumbersome: WeightedSquaredL2AbsLoss and WeightedSquaredL2AbsSquaredLoss. Any suggestions for shorter alternatives? Perhaps something like WeightSqrL2AbsLoss and WeightSqrL2AbsSqrLoss (which would also imply changing SquaredL2Loss to SqrL2Loss and WeightedSquaredL2Loss to WeightSqrL2Loss)?
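
For readers skimming the names: the two classes presumably evaluate weighted squared-L2 misfits of |Ax| and |Ax|², the standard data terms in phase retrieval, where only magnitudes (or intensities) of a linear transform are measured. A minimal stand-alone sketch of those objectives (the 0.5 scale factor and passing the weights as a plain array rather than a diagonal operator are assumptions for illustration, not scico's actual implementation):

```python
import jax.numpy as jnp

def weighted_squared_l2_abs(y, Ax, w, scale=0.5):
    # Roughly WeightedSquaredL2AbsLoss: scale * || y - |A x| ||_W^2,
    # with w the diagonal of the weighting operator W.
    return scale * jnp.sum(w * (y - jnp.abs(Ax)) ** 2)

def weighted_squared_l2_abs_squared(y, Ax, w, scale=0.5):
    # Roughly WeightedSquaredL2AbsSquaredLoss: scale * || y - |A x|^2 ||_W^2
    return scale * jnp.sum(w * (y - jnp.abs(Ax) ** 2) ** 2)
```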

bwohlberg added the enhancement (New feature or request) label on Mar 2, 2022
codecov bot commented on Mar 2, 2022

Codecov Report

Merging #236 (3d9e715) into main (244b05c) will increase coverage by 0.09%.
The diff coverage is 96.25%.


@@            Coverage Diff             @@
##             main     #236      +/-   ##
==========================================
+ Coverage   92.52%   92.62%   +0.09%     
==========================================
  Files          49       49              
  Lines        3452     3512      +60     
==========================================
+ Hits         3194     3253      +59     
- Misses        258      259       +1     
Flag        Coverage Δ
unittests   92.62% <96.25%> (+0.09%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                Coverage Δ
scico/blockarray.py           91.20% <ø> (ø)
scico/operator/biconvolve.py  96.42% <ø> (-0.13%) ⬇️
scico/random.py               98.38% <ø> (ø)
scico/loss.py                 93.06% <96.00%> (+2.88%) ⬆️
scico/linop/radon_svmbir.py   87.87% <100.00%> (ø)
scico/optimize/admm.py        96.07% <100.00%> (ø)
scico/solver.py               98.87% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 67da918...3d9e715.

Michael-T-McCann (Contributor) left a comment


On the length of names: I lean toward making them fewer words rather than abbreviating more of the words. Specific ideas:

First, strip out Loss from all of them. This is consistent with how we do operators (e.g., we use CircularConvolve not CircularConvolveLinOp) and no less readable, especially when used qualified: Loss.WeightedSquaredL2Abs.

Second, don't try to mention every aspect of the loss. SquaredL2Loss -> Loss.L2; the first line of the docstring will clear up any confusion. WeightedSquaredL2Loss -> Loss.L2 as well: if the user passes weights, they get weights; document this fact.

For the optics loss, maybe it has a name in the optics community. By analogy, Loss.Poisson rather than WeightedLogPlusLogLoss.

scico/loss.py (resolved review thread):
W: Weighting diagonal operator. Must be non-negative.
If ``None``, defaults to :class:`.Identity`.
"""
y = ensure_on_device(y)
Michael-T-McCann (Contributor):

I believe ensure_on_device is deprecated. Agreed?

bwohlberg (Collaborator Author):

Really? It was in all the existing loss functions before the additional ones were added.

Michael-T-McCann (Contributor):

On some further digging, I think my impression came from #181. It seems we are of the mind that SCICO functions generally shouldn't work with numpy arrays, but the decision is stuck in limbo.

If #181 goes the way it is going, ensure_on_device would go away.
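
(For context: a helper of this kind typically just moves host-side numpy arrays onto the default JAX device, roughly as in the sketch below. This is an illustration of the idea, not scico's actual ensure_on_device.)

```python
import numpy as np
import jax

def ensure_on_device(*arrays):
    # Promote host numpy arrays to JAX device arrays; pass others through.
    out = tuple(jax.device_put(a) if isinstance(a, np.ndarray) else a for a in arrays)
    return out if len(out) > 1 else out[0]
```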

bwohlberg (Collaborator Author)

> On the length of names: I lean toward making them fewer words rather than abbreviating more of the words. Specific ideas:
>
> First, strip out Loss from all of them. This is consistent with how we do operators (e.g., we use CircularConvolve not CircularConvolveLinOp) and no less readable, especially when used qualified: Loss.WeightedSquaredL2Abs.

That is a good idea. Name collision with some of the functionals (specifically the norms) is not ideal, but is worth putting up with, I think.

> Second, don't try to mention every aspect of the loss. SquaredL2Loss -> Loss.L2; the first line of the docstring will clear up any confusion. WeightedSquaredL2Loss -> Loss.L2 as well: if the user passes weights, they get weights; document this fact.

You're suggesting renaming WeightedSquaredL2Loss to L2 and removing SquaredL2Loss, to be replaced by initializing WeightedSquaredL2Loss without any weights?

> For the optics loss, maybe it has a name in the optics community. By analogy, Loss.Poisson rather than WeightedLogPlusLogLoss.

That would be reasonable, but unfortunately I'm not aware of any compact names for them, and I suspect they don't exist.

Michael-T-McCann (Contributor)

> You're suggesting renaming WeightedSquaredL2Loss to L2 and removing SquaredL2Loss, to be replaced by initializing WeightedSquaredL2Loss without any weights?

Yes, as an example to make the broader point that some of the extra words may be a result of having extra classes. There may be reasons why that specific change cannot/should not be made.

bwohlberg (Collaborator Author)

> Yes, as an example to make the broader point that some of the extra words may be a result of having extra classes. There may be reasons why that specific change cannot/should not be made.

I think that's reasonable: the way we handle L2 vs WeightedL2 at the moment seems a bit contrived, and could easily be replaced with one class with optional weights (see the sketch after this comment).

I'm going to back off from supporting the removal of the "Loss" part of the name, though, since it's consistent with usage in scico.functional where the norms are L2Norm, etc. It's also potentially confusing to completely drop the "Squared" since there are both L2Norm and SquaredL2Norm in functionals.
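
As an illustration of the single-class-with-optional-weights idea, a minimal sketch (the class name, the 0.5 scale, a callable A, and treating W as a plain array of diagonal weights are all assumptions for illustration, not the merged implementation):

```python
import jax.numpy as jnp

class SquaredL2Loss:
    """Weighted squared-L2 loss; reduces to the unweighted case when W is None."""

    def __init__(self, y, A, W=None, scale=0.5):
        self.y, self.A, self.W, self.scale = y, A, W, scale

    def __call__(self, x):
        r = self.y - self.A(x)  # residual y - A(x)
        if self.W is None:      # unweighted case subsumes the old SquaredL2Loss
            return self.scale * jnp.sum(r ** 2)
        return self.scale * jnp.sum(self.W * r ** 2)  # W as diagonal weights
```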

Michael-T-McCann (Contributor)

> I'm going to back off from supporting the removal of the "Loss" part of the name, though, since it's consistent with usage in scico.functional where the norms are L2Norm, etc.

Hm, how do loss.L2 and functional.L2 strike you? Or, is there any harm in "if the user doesn't specify A and y they get a functional"? This would (maybe) allow us to remove scico.functional completely. (I'd volunteer for that issue.)

bwohlberg (Collaborator Author)

> Hm, how do loss.L2 and functional.L2 strike you?

The problem is that there are many functionals that aren't norms, including L2BallIndicator, the name of which would collide with the norm if our policy is to strip the type postfix part of the name. Also worth pointing out that the current name is L2Norm not L2Functional, so there isn't really an argument that the "Norm" should be dropped for consistency with the absence of an "Operator" or "LinOp" postfix on the linear operators.

> Or, is there any harm in "if the user doesn't specify A and y they get a functional"? This would (maybe) allow us to remove scico.functional completely. (I'd volunteer for that issue.)

There are many more functionals than the norms, so I don't think it's feasible to remove scico.functional. One could try to combine the losses and norms, but I think the implementations would be too messy to make it worth the effort.

bwohlberg (Collaborator Author)

See most recent commit. I think the removal of "Weighted" brings the names within acceptable lengths, and removes the need for further agonizing about this.

Michael-T-McCann self-requested a review on March 3, 2022 16:08
bwohlberg merged commit 6a481de into main on Mar 3, 2022
bwohlberg deleted the brendt/abs-loss branch on March 3, 2022 16:39