Automatic differentiation support in volesti

Apostolos Chalkis edited this page Feb 22, 2022 · 4 revisions

Overview

The most efficient algorithm that volesti supports for sampling from a log-concave distribution is Hamiltonian Monte Carlo (HMC) with leapfrog integration. The leapfrog method is a symplectic numerical integrator that, at each step, evaluates the gradient of the log-probability density function of the target distribution. To sample from a log-concave distribution, volesti therefore currently requires as input a function that evaluates this gradient.
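To make the gradient dependence concrete, here is a minimal sketch of a single leapfrog step: a half-step momentum update, a full position update, and another half-step momentum update. The function name `leapfrog_step` and the gradient callback `grad_U` are illustrative, not volesti's actual interface.

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

// One leapfrog step for HMC. `grad_U` returns the gradient of the
// negative log-density (the "potential energy") at x; eps is the step size.
// Names are a sketch, not volesti's actual API.
void leapfrog_step(std::vector<double>& x, std::vector<double>& p,
                   const std::function<std::vector<double>(const std::vector<double>&)>& grad_U,
                   double eps) {
    std::vector<double> g = grad_U(x);
    for (std::size_t i = 0; i < x.size(); ++i) p[i] -= 0.5 * eps * g[i];  // half momentum step
    for (std::size_t i = 0; i < x.size(); ++i) x[i] += eps * p[i];        // full position step
    g = grad_U(x);
    for (std::size_t i = 0; i < x.size(); ++i) p[i] -= 0.5 * eps * g[i];  // half momentum step
}
```

Because the integrator is symplectic, the Hamiltonian (potential plus kinetic energy) is approximately conserved along the trajectory, which is what makes the HMC acceptance rate high.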

However, many applications require sampling from a distribution for which the gradient of the log-probability density function is not available in closed form. This project addresses that problem by adding routines for automatic differentiation of multivariate functions to volesti. A function that evaluates the log-density will then suffice to sample from the target distribution.
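The core idea behind forward-mode automatic differentiation can be illustrated with dual numbers: propagate a value and its derivative together through every arithmetic operation. This is a toy sketch of the technique only; the project would integrate an existing AD library into volesti rather than implement one from scratch.

```cpp
#include <cassert>
#include <cmath>

// A dual number carries a value v and a derivative d. Seeding the input
// with d = 1 makes the output's d field equal to the exact derivative.
struct Dual {
    double v;  // value of the expression
    double d;  // derivative of the expression w.r.t. the seeded input
};

// Sum rule: (f + g)' = f' + g'
Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
// Product rule: (f * g)' = f' g + f g'
Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
// Chain rule through exp: (exp f)' = f' exp(f)
Dual exp(Dual a) { return {std::exp(a.v), a.d * std::exp(a.v)}; }
```

For a multivariate log-density, one forward pass per coordinate (seeding each input in turn) yields the full gradient; reverse-mode AD obtains it in a single pass and is what mature libraries use.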

Related work

Automatic differentiation references and libraries:

Details of your coding project

The student will integrate the three automatic differentiation packages listed above into volesti and perform extensive comparison experiments to determine which option is the most efficient.

Expected impact

The expected impact is very high: this project will allow many researchers and practitioners to use volesti's efficient implementations for a much broader class of distributions.

Mentors

  • Marios Papachristou <papachristoumarios at gmail.com> is a PhD student in the Computer Science Department at Cornell University. His primary research interests lie in data science. He participated in GSoC 2018 and 2020 as a student under the organizations FOSS and GeomScale, and was a GSoC mentor in 2019.

  • Elias Tsigaridas <elias.tsigaridas at inria.fr> is an expert in computational nonlinear algebra and geometry with experience in mathematical software. He has contributed implementations, in C and C++, of several solving algorithms to various open-source computer algebra libraries and has previous GSoC mentoring experience with the R project (2019).

Students, please contact the mentors above after completing at least one of the tests below.

Tests

Students, please do one or more of the following tests before contacting the mentors above.

  • Easy: Compile and run volesti. Use the C++ code to sample from the Gaussian distribution using Hit-and-Run.
  • Medium: Use the C++ code to sample from the Gaussian distribution using Hamiltonian Monte Carlo with the leapfrog integrator.
  • Hard: Implement an automatic differentiation function in C++ that uses a first-order Taylor expansion.
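For the Easy and Medium tests, the target density must be supplied to the sampler. For a standard Gaussian the negative log-density is U(x) = ||x||²/2 (up to an additive constant), so its gradient is simply x. A minimal sketch of the two callbacks an HMC run typically needs (function names are illustrative):

```cpp
#include <cassert>
#include <vector>

// Negative log-density of the standard Gaussian, up to a constant:
// U(x) = ||x||^2 / 2.
double neg_log_density(const std::vector<double>& x) {
    double s = 0.0;
    for (double xi : x) s += xi * xi;
    return 0.5 * s;
}

// Gradient of ||x||^2 / 2 is x itself.
std::vector<double> neg_log_density_grad(const std::vector<double>& x) {
    return x;
}
```

The Hard test asks for exactly the kind of machinery that makes the second callback unnecessary: given only `neg_log_density`, automatic differentiation would produce the gradient.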

Solutions of tests

Students, please post a link to your test results here.

  • EXAMPLE STUDENT 1 NAME, LINK TO GITHUB PROFILE, LINK TO TEST RESULTS.