TJHeeringa/BregmanLearning

Overview

A PyTorch extension providing Bregman-based optimizers.

  • Free software: BSD 3-Clause License

Installation

The package can be installed from PyPI using:

pip install bregman-learning

Usage

The library provides two Bregman-based optimizers, several regularizers for these optimizers, and functions for pre- and postprocessing networks.

The Bregman-based optimizers provided are LinBreg and AdaBreg. Their usage is similar to that of SGD and Adam, their non-Bregman counterparts. Instead of:

from torch.optim import Adam

...

optimizer = Adam(model.parameters(), lr=learning_rate)

the optimizers are created using:

from bregman import AdaBreg, L1

...

optimizer = AdaBreg(
    model.parameters(),
    reg=L1(rc=regularization_constant),
    lr=learning_rate
)

where the L1 regularizer can be interchanged with any regularizer in the library.
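In a training loop, the Bregman optimizers behave like any other torch.optim optimizer. The following is a minimal sketch, assuming LinBreg accepts the same arguments as AdaBreg above; the small network, loss function, and random data are placeholders:

import torch
from torch import nn

from bregman import L1, LinBreg

# Any nn.Module works; this small network is only for illustration.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()

# LinBreg is used as a drop-in replacement for SGD, with the reg
# argument being the Bregman-specific addition.
optimizer = LinBreg(model.parameters(), reg=L1(rc=0.1), lr=1e-3)

for _ in range(100):  # illustrative loop on random data
    inputs, targets = torch.randn(64, 10), torch.randn(64, 1)
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()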

For the best results when using sparsity-promoting regularizers, the networks have to be pre- and postprocessed accordingly. For the L12 regularizer, this can be done using:

from bregman import simplify, sparsify

...

sparsify(model, density_level=0.2)

...

pruned_model = simplify(model)
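Putting these pieces together, a complete sparse-training pipeline might look as follows. This is a sketch under two assumptions: that an L12 regularizer is exposed by the package alongside L1 and takes the same rc argument, and that simplify returns the pruned model as shown above; the network, data, and hyperparameters are illustrative.

import torch
from torch import nn

from bregman import AdaBreg, L12, simplify, sparsify  # L12 export assumed

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()

# Preprocessing: initialize the network at 20% density.
sparsify(model, density_level=0.2)

optimizer = AdaBreg(model.parameters(), reg=L12(rc=0.1), lr=1e-3)

for _ in range(100):  # illustrative loop on random data
    inputs, targets = torch.randn(64, 10), torch.randn(64, 1)
    optimizer.zero_grad()
    loss_fn(model(inputs), targets).backward()
    optimizer.step()

# Postprocessing: prune the trained network to its remaining nonzero structure.
pruned_model = simplify(model)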

Citing

If you use this code, please use the citation information in the CITATION.cff file or click the "Cite this repository" button in the sidebar.
