
# RMwGGIS

A PyTorch (1.4.0) implementation of "Gradient-Guided Importance Sampling for Learning Discrete Energy-Based Models" (ICLR 2022, under review).

The official implementation is here.

## Bio

In July this year, I gave a talk on energy-based models in our reading group (slides). In that talk, I proposed a simple way to enhance ratio matching by introducing a gradient-based relaxation, and the experimental results on learning a Boltzmann machine looked quite good compared with the original ratio matching. Recently, while skimming the ICLR 2022 submissions, I found a paper, "Gradient-Guided Importance Sampling for Learning Discrete Energy-Based Models", which applies a similar idea to reduce the time and space complexity of ratio matching. Overall, the method is simple but works well for learning discrete energy-based models. I reproduce the experiments on synthetic data here. For more details, please refer to this note and the paper.
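The core trick described above can be sketched briefly. For a D-dimensional binary energy-based model, ratio matching needs the energy of every single-bit flip of x, i.e. D extra energy evaluations per data point. The gradient-guided idea is to approximate each flip difference with a first-order Taylor expansion around x, so a single gradient evaluation replaces all D energy calls. The sketch below (hypothetical names, not the official code, and dependency-free rather than PyTorch for self-containedness) illustrates this on a tiny Boltzmann-machine energy E(x) = -xᵀWx:

```python
import math

def energy(x, W):
    """E(x) = -x^T W x for a 0/1 vector x."""
    n = len(x)
    return -sum(W[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def grad_energy(x, W):
    """dE/dx_i = -sum_j (W[i][j] + W[j][i]) * x[j]."""
    n = len(x)
    return [-sum((W[i][j] + W[j][i]) * x[j] for j in range(n)) for i in range(n)]

def flip_diffs_exact(x, W):
    """Brute force: evaluate the energy of all D single-bit flips."""
    base = energy(x, W)
    diffs = []
    for i in range(len(x)):
        y = list(x)
        y[i] = 1 - y[i]  # flip bit i
        diffs.append(energy(y, W) - base)
    return diffs

def flip_diffs_taylor(x, W):
    """Gradient-guided approximation: one gradient gives all D differences,
    E(flip_i(x)) - E(x) ~= (1 - 2*x_i) * dE/dx_i."""
    g = grad_energy(x, W)
    return [(1 - 2 * x[i]) * g[i] for i in range(len(x))]

def flip_proposal(x, W, temp=2.0):
    """Importance-sampling proposal over which bit to flip: a softmax of the
    negated approximate energy changes. The temperature 2.0 mirrors the
    Gibbs-with-Gradients-style choice and is an assumption here."""
    d = flip_diffs_taylor(x, W)
    w = [math.exp(-di / temp) for di in d]
    z = sum(w)
    return [wi / z for wi in w]
```

One design point worth noting: when W has a zero diagonal, E is multilinear in each coordinate, so the first-order expansion is exact rather than approximate; importance weighting then corrects whatever bias the proposal introduces in the general case.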

## Run

```
python main.py -data <dataset>
```

where `<dataset>` is one of `2spirals`, `8gaussians`, `circles`, `moons`, ...

## Results

## Acknowledgement

The code structure follows the project organization of ALOE.