A PyTorch extension providing Bregman-based optimizers
- Free software: BSD 3-Clause License
The package can be installed from PyPI using:
```bash
pip install bregman-learning
```
The library provides two Bregman-based optimizers, several regularizers for these optimizers, and functions for pre- and postprocessing the networks.
The Bregman-based optimizers provided are LinBreg and AdaBreg. Their usage mirrors that of Adam and SGD, their non-Bregman counterparts. Instead of:
```python
from torch.optim import Adam

...

optimizer = Adam(model.parameters(), lr=learning_rate)
```
the optimizers are created using:
```python
from bregman import AdaBreg, L1

...

optimizer = AdaBreg(
    model.parameters(),
    reg=L1(rc=regularization_constant),
    lr=learning_rate,
)
```
where the L1 regularizer can be interchanged with any regularizer in the library.
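As a minimal sketch of how this drops into a training loop, assuming AdaBreg follows the standard `torch.optim.Optimizer` interface (the model architecture, dummy data, and hyperparameter values below are illustrative placeholders, not part of the library):

```python
import torch
import torch.nn as nn
from bregman import AdaBreg, L1

# Placeholder model; any torch.nn.Module works.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = AdaBreg(model.parameters(), reg=L1(rc=0.1), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy batches standing in for a real DataLoader.
batches = [(torch.randn(32, 784), torch.randint(0, 10, (32,))) for _ in range(10)]

for epoch in range(5):
    for inputs, targets in batches:
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
```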
For best results when using sparsity-promoting regularizers, the networks have to be pre- and postprocessed accordingly. For the L12 regularizer, this can be done using:
```python
from bregman import simplify, sparsify

...

# Preprocessing: sparsify the network before training.
sparsify(model, density_level=0.2)

...

# Postprocessing: prune the trained network.
pruned_model = simplify(model)
```
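Putting the pieces together, the workflow is: sparsify before training, train with a Bregman optimizer, then simplify afterwards. Below is a sketch under the assumption that the L12 regularizer is importable from `bregman` and takes the same `rc` keyword as L1; adjust to the actual signature if it differs:

```python
import torch.nn as nn
from bregman import AdaBreg, L12, simplify, sparsify

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

# Preprocessing: start from a sparse initialization.
sparsify(model, density_level=0.2)

# Assumption: L12 is constructed like L1 above.
optimizer = AdaBreg(model.parameters(), reg=L12(rc=0.1), lr=1e-3)

# ... run a training loop as in the previous example ...

# Postprocessing: remove the structures driven to zero during training.
pruned_model = simplify(model)
```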
If you use this code, please use the citation information in the CITATION.cff file or click the "Cite this repository" button in the sidebar.