Probabilistic Programming and Statistical Inference in PyTorch.
This project is being developed during my time at Cogent Labs.
The documentation is still a work in progress; a brief API description is given below. The tests may also be helpful.
The API might change quickly during this initial development period.
The first dimension of every tensor is the batch dimension, over which samples are assumed to be independent.
# Random variables interface:
class RandomVariable:
    def size(self)         # --> (batch_size, rv_dimension)
    def log_pdf(self, x)   # --> [batch_size]
    def sample(self)       # --> [batch_size, rv_dimension]
    def entropy(self)      # --> [batch_size]
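As a usage sketch of these shape conventions (the import path ptstat is an assumption, and Normal(mu, sd) is one of the constructors listed below):

import torch
from ptstat import Normal  # import path is an assumption

batch_size, rv_dimension = 16, 3
mu = torch.zeros(batch_size, rv_dimension)
sd = torch.ones(batch_size, rv_dimension)
rv = Normal(mu, sd)        # parameterized constructor, see the list below

x = rv.sample()            # tensor of shape [batch_size, rv_dimension]
log_p = rv.log_pdf(x)      # tensor of shape [batch_size], one value per batch element
h = rv.entropy()           # tensor of shape [batch_size]
print(rv.size())           # (batch_size, rv_dimension)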
# Implemented random variables:
Normal(size=(batch_size, rv_dimension), cuda=cuda)
Normal(mu, sd)
Categorical(size=(batch_size, rv_dimension), cuda=cuda)
Categorical(p)
Bernoulli(size=(batch_size, rv_dimension), cuda=cuda)
Bernoulli(p)
Uniform(size=(batch_size, rv_dimension), cuda=cuda)
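Each variable thus supports two construction modes: a size-based form (with cuda selecting GPU tensors and, presumably, default parameters) and a fully parameterized form. A minimal sketch follows; the parameter conventions in it are assumptions rather than documented behavior:

import torch
import torch.nn.functional as F
from ptstat import Categorical, Bernoulli  # import path is an assumption

cuda = False
batch_size, num_classes = 16, 10

# Size-based form: default parameters of the given shape (assumed uniform here).
rv_default = Categorical(size=(batch_size, num_classes), cuda=cuda)

# Parameterized form: each row of p is assumed to be a probability vector.
p = F.softmax(torch.randn(batch_size, num_classes), dim=1)
rv_cat = Categorical(p)

# Bernoulli with explicit per-element success probabilities.
rv_bern = Bernoulli(torch.rand(batch_size, 1))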
# KL-Divergence:
def kld(rv_from, rv_to) # --> [batch_size]
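A sketch of how the KL divergence could enter a variational objective, again assuming the ptstat import path and that a closed form is available for this pair of Normals:

import torch
from ptstat import Normal, kld  # import path is an assumption

batch_size, rv_dimension = 16, 3

# Approximate posterior (random parameters here, for brevity).
q = Normal(torch.randn(batch_size, rv_dimension),
           0.5 * torch.ones(batch_size, rv_dimension))

# Prior: size-based constructor, assumed to be a standard Normal of that shape.
p = Normal(size=(batch_size, rv_dimension), cuda=False)

kl = kld(q, p)       # tensor of shape [batch_size]
loss_kl = kl.mean()  # scalar KL term, averaged over the batch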
Changelog:
- removed specialized distributions in favor of more flexible constructors
- refactoring: split distributions across multiple files
- initial commit
The code is released under the MIT license.