> **Warning**
> This repository is deprecated in favour of the cirkit framework. Please consider using it for your project.



# DeeProb-kit

DeeProb-kit is a unified library written in Python consisting of a collection of deep probabilistic models (DPMs) that are tractable and exact representations for the modelled probability distributions. The availability of a representative selection of DPMs in a single library makes it possible to combine them in a straightforward manner, a common practice in deep learning research nowadays. In addition, it includes efficiently implemented learning techniques, inference routines, statistical algorithms, and provides high-quality fully-documented APIs. The development of DeeProb-kit will help the community to accelerate research on DPMs as well as to standardise their evaluation and better understand how they are related based on their expressivity.
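To make the "tractable and exact" property concrete, the sketch below builds a miniature sum-product network by hand and computes both a joint probability and an exact marginal in a single bottom-up pass. The structure and closures are a from-scratch illustration of the model family, not DeeProb-kit's API.

```python
import math

# Leaves, products, and weighted sums as plain closures. Passing None
# for a variable marginalises it out exactly -- the tractability
# property shared by the models in this library.

def bernoulli(p, idx):
    # Leaf over binary variable X_idx; None means "marginalised".
    def leaf(x):
        if x[idx] is None:
            return 1.0
        return p if x[idx] == 1 else 1.0 - p
    return leaf

def product(*children):
    return lambda x: math.prod(c(x) for c in children)

def weighted_sum(weights, *children):
    return lambda x: sum(w * c(x) for w, c in zip(weights, children))

# S(X0, X1) = 0.3 * [Bern(0.8) * Bern(0.4)] + 0.7 * [Bern(0.1) * Bern(0.6)]
spn = weighted_sum(
    [0.3, 0.7],
    product(bernoulli(0.8, 0), bernoulli(0.4, 1)),
    product(bernoulli(0.1, 0), bernoulli(0.6, 1)),
)

joint = spn([1, 1])        # P(X0=1, X1=1) = 0.138
marginal = spn([1, None])  # P(X0=1) = 0.31, no explicit summation needed
```

Because the sum weights are normalized and the leaves are proper distributions, the network sums to one over all complete states, and any marginal query costs one evaluation of the circuit.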

## Features

- Inference algorithms for SPNs. [1] [2]
- Learning algorithms for SPN structure. [1] [3] [4] [2] [5]
- Chow-Liu Trees (CLTs) as SPN leaves. [6]
- Cutset Networks (CNets) with various learning criteria. [7]
- Batch Expectation-Maximization (EM) for SPNs with arbitrary leaves. [8] [9]
- Structural marginalization and pruning algorithms for SPNs.
- High-order moments computation for SPNs.
- JSON I/O operations for SPNs and CLTs. [2]
- Plotting operations based on NetworkX for SPNs and CLTs. [2]
- Randomized and Tensorized SPNs (RAT-SPNs). [10]
- Deep Generalized Convolutional SPNs (DGC-SPNs). [11]
- Masked Autoregressive Flows (MAFs). [12]
- Real Non-Volume-Preserving (RealNVP) flows. [13]
- Non-linear Independent Components Estimation (NICE) flows. [14]
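The Chow-Liu Trees listed above can be sketched from first principles: estimate pairwise mutual information from data, then take a maximum spanning tree over the complete MI graph. The snippet below is a minimal from-scratch version of that classic procedure, not the library's Binary-CLT implementation.

```python
import math
from collections import Counter

def mutual_information(data, i, j):
    # Empirical mutual information between discrete columns i and j.
    n = len(data)
    ci = Counter(row[i] for row in data)
    cj = Counter(row[j] for row in data)
    cij = Counter((row[i], row[j]) for row in data)
    return sum(
        (c / n) * math.log(c * n / (ci[a] * cj[b]))
        for (a, b), c in cij.items()
    )

def chow_liu_tree(data, n_vars):
    # Prim's algorithm on the complete pairwise-MI graph: greedily grow
    # a maximum spanning tree rooted at variable 0, returned as a list
    # of (parent, child) edges.
    in_tree = {0}
    edges = []
    while len(in_tree) < n_vars:
        u, v, _ = max(
            ((u, v, mutual_information(data, u, v))
             for u in in_tree for v in range(n_vars) if v not in in_tree),
            key=lambda t: t[2],
        )
        edges.append((u, v))
        in_tree.add(v)
    return edges

# X1 always copies X0 while X2 varies independently, so the learned
# tree attaches X1 directly to X0.
data = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
tree = chow_liu_tree(data, 3)
```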

The collection of implemented models is summarized in the following table.

| Model       | Description                                         |
|-------------|-----------------------------------------------------|
| Binary-CLT  | Binary Chow-Liu Tree (CLT)                          |
| Binary-CNet | Binary Cutset Network (CNet)                        |
| SPN         | Vanilla Sum-Product Network                         |
| MSPN        | Mixed Sum-Product Network                           |
| XPC         | Random Probabilistic Circuit                        |
| RAT-SPN     | Randomized and Tensorized Sum-Product Network       |
| DGC-SPN     | Deep Generalized Convolutional Sum-Product Network  |
| MAF         | Masked Autoregressive Flow                          |
| NICE        | Non-linear Independent Components Estimation Flow   |
| RealNVP     | Real-valued Non-Volume-Preserving Flow              |
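The flow models in the table (MAF, NICE, RealNVP) are all built from invertible coupling layers. Below is a minimal additive coupling step in the spirit of NICE, with a fixed toy function standing in for the learned coupling network; it is an illustrative sketch only, not the library's implementation.

```python
# Additive coupling in the spirit of NICE: the first half of the input
# passes through unchanged; the second half is shifted by a function of
# the first. `shift` is a toy stand-in for a learned neural network.

def shift(x1):
    # Stand-in for the learned coupling network m(x1).
    return 2.0 * x1 + 1.0

def forward(x1, x2):
    # y1 = x1, y2 = x2 + m(x1). The Jacobian is unit triangular, so
    # log|det J| = 0: the transform is volume-preserving by design.
    return x1, x2 + shift(x1)

def inverse(y1, y2):
    # Exact inverse: recompute the same shift from the unchanged half.
    return y1, y2 - shift(y1)

# Round trip: inverting the forward pass recovers the input exactly.
y = forward(0.5, -1.3)
x = inverse(*y)
```

Invertibility and a cheap log-determinant are exactly what make exact density evaluation tractable for these flows; RealNVP adds a scaling term to the coupling, and MAF replaces the split with an autoregressive ordering.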

## Installation

The library can be installed either from the PyPI repository or from the source code.

```shell
# Install from the PyPI repository
pip install deeprob-kit

# Install from the `main` git branch
pip install -e git+https://github.com/deeprob-org/deeprob-kit.git@main#egg=deeprob-kit
```

## Project Directories

The documentation is generated automatically by Sphinx, using the sources stored in the `docs` directory.

A collection of code examples and experiments can be found in the `examples` and `experiments` directories respectively. Moreover, benchmark code can be found in the `benchmark` directory.

## Cite

```bibtex
@misc{loconte2022deeprob,
  doi       = {10.48550/ARXIV.2212.04403},
  url       = {https://arxiv.org/abs/2212.04403},
  author    = {Loconte, Lorenzo and Gala, Gennaro},
  title     = {{DeeProb-kit}: a Python Library for Deep Probabilistic Modelling},
  publisher = {arXiv},
  year      = {2022}
}
```

## Related Repositories

## References

1. Peharz et al. On Theoretical Properties of Sum-Product Networks. AISTATS (2015).
2. Molina, Vergari et al. SPFlow: An easy and extensible library for deep probabilistic learning using Sum-Product Networks. CoRR (2019).
3. Poon and Domingos. Sum-Product Networks: A New Deep Architecture. UAI (2011).
4. Molina, Vergari et al. Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains. AAAI (2018).
5. Di Mauro et al. Sum-Product Network structure learning by efficient product nodes discovery. AIxIA (2018).
6. Di Mauro, Gala et al. Random Probabilistic Circuits. UAI (2021).
7. Rahman et al. Cutset Networks: A Simple, Tractable, and Scalable Approach for Improving the Accuracy of Chow-Liu Trees. ECML-PKDD (2014).
8. Desana and Schnörr. Learning Arbitrary Sum-Product Network Leaves with Expectation-Maximization. CoRR (2016).
9. Peharz et al. Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits. ICML (2020).
10. Peharz et al. Probabilistic Deep Learning using Random Sum-Product Networks. UAI (2020).
11. Van de Wolfshaar and Pronobis. Deep Generalized Convolutional Sum-Product Networks for Probabilistic Image Representations. PGM (2020).
12. Papamakarios et al. Masked Autoregressive Flow for Density Estimation. NeurIPS (2017).
13. Dinh et al. Density Estimation using RealNVP. ICLR (2017).
14. Dinh et al. NICE: Non-linear Independent Components Estimation. ICLR (2015).