
Create common base class for optimizer classes. #375

Merged
bwohlberg merged 25 commits into main from brendt/optimizers-common on Jan 14, 2023

Conversation

@bwohlberg (Collaborator) commented Jan 5, 2023

Create common base class for optimizer classes. A significant fraction of the iterations stats initialization code is moved to the base class.

In addition, fix some faulty logic in the determination of whether an objective function can be evaluated.
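
To make the refactoring concrete, here is a minimal illustrative sketch of the idea (the names Optimizer, itstat_fields, and _objective_evaluatable are assumptions chosen for illustration, not the actual SCICO implementation):

class Optimizer:
    """Illustrative common base class: shared iteration-statistics setup lives here."""

    def __init__(self, maxiter=100, itstat_options=None):
        self.maxiter = maxiter
        # Only include an objective-value column when the derived class
        # reports that the objective function can actually be evaluated.
        fields = {"Iter": "%d"}
        if self._objective_evaluatable():
            fields["Objective"] = "%9.3e"
        self.itstat_fields = fields

    def _objective_evaluatable(self):
        # Overridden by each derived optimizer to report whether the
        # objective function can be evaluated.
        raise NotImplementedError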

@bwohlberg added the improvement label (Improvement of existing code, including addressing of omissions or inconsistencies) on Jan 5, 2023
codecov bot commented Jan 5, 2023

Codecov Report

Merging #375 (3cdb8a4) into main (029a595) will decrease coverage by 0.09%.
The diff coverage is 96.74%.

@@            Coverage Diff             @@
##             main     #375      +/-   ##
==========================================
- Coverage   94.08%   93.99%   -0.10%     
==========================================
  Files          61       61              
  Lines        3821     3747      -74     
==========================================
- Hits         3595     3522      -73     
+ Misses        226      225       -1     
Flag Coverage Δ
unittests 93.99% <96.74%> (-0.10%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
scico/optimize/_padmm.py 90.07% <93.10%> (-2.28%) ⬇️
scico/optimize/_common.py 96.29% <95.12%> (-3.71%) ⬇️
scico/optimize/__init__.py 100.00% <100.00%> (ø)
scico/optimize/_admm.py 96.77% <100.00%> (-0.53%) ⬇️
scico/optimize/_ladmm.py 98.36% <100.00%> (-0.44%) ⬇️
scico/optimize/_pgm.py 97.14% <100.00%> (+2.82%) ⬆️
scico/optimize/_primaldual.py 100.00% <100.00%> (ø)


The following review thread concerns this docstring fragment for the PGM optimizer:

Args:
    f: Loss or Functional object with `grad` defined.
    g: Instance of Functional with defined prox method.
    L0: Initial estimate of Lipschitz constant of f.
    x0: Starting point for :math:`\mb{x}`.
    step_size: helper :class:`StepSize` to estimate the Lipschitz
        constant of f.
    maxiter: Maximum number of PGM iterations to perform.
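
For context, a hypothetical construction using the arguments documented above (a sketch only: the specific loss/functional classes and the AcceleratedPGM import shown here are assumptions, not verified against the SCICO API):

import scico.numpy as snp
from scico import functional, linop, loss
from scico.optimize import AcceleratedPGM  # assumed import path

# Assumed toy problem setup; A and y would come from the application.
A = linop.Identity(input_shape=(256,))
y = snp.ones((256,))

f = loss.SquaredL2Loss(y=y, A=A)  # loss with grad defined
g = functional.L1Norm()           # functional with a prox method
solver = AcceleratedPGM(
    f=f, g=g, L0=1e2, x0=snp.zeros(A.input_shape),
    maxiter=100,  # forwarded to the common base class via **kwargs
)
x_hat = solver.solve()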
Contributor

I miss seeing maxiter listed in the docs for each optimizer. What do you think about putting it back (while still refactoring out itstat_options)? Or maybe mention it in the help string for **kwargs?

Collaborator Author

I see arguments for and against.

For: It would improve readability of individual optimizer documentation, and is a relatively small change.

Against: One of the reasons for use of **kwargs and moving the docs for those options to the base class is to avoid duplication of docs, which inevitably leads to errors when changes are made. Keeping the maxiter docs in the derived classes would be inconsistent with this principle.

Thoughts?
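
For reference, a minimal illustration of the docstring pattern under discussion (hypothetical classes and docstring text, not the actual SCICO sources): common options such as maxiter are documented once on the base class, and each derived optimizer forwards them via **kwargs.

class Optimizer:
    """Base class for optimizers.

    Args:
        maxiter: Maximum number of iterations to perform.
        itstat_options: Iteration statistics options.
    """

    def __init__(self, maxiter=100, itstat_options=None):
        self.maxiter = maxiter
        self.itstat_options = itstat_options


class PGM(Optimizer):
    """Proximal gradient method.

    Args:
        f: Loss or Functional object with `grad` defined.
        **kwargs: Common optimizer options; see :class:`Optimizer`.
    """

    def __init__(self, f, **kwargs):
        self.f = f
        super().__init__(**kwargs)  # maxiter documented and handled in the base class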

Contributor

It's a tough call. I value the readability of the docs over the maintenance issue, but not strongly. And the PR already exists in its current form, so perhaps we leave it.

Collaborator Author

Let's go with the PR as-is. We can make the suggested changes later if we decide it's a serious issue.

@bwohlberg merged commit f569235 into main on Jan 14, 2023
@bwohlberg deleted the brendt/optimizers-common branch on January 14, 2023 17:15