Automatic differentiation support in volesti
The most efficient algorithm that volesti supports for sampling from a log-concave distribution is Hamiltonian Monte Carlo (HMC) with leapfrog integration. The leapfrog method is a symplectic integrator that, in each step, evaluates the gradient of the log-probability density function of the target distribution. To sample from a log-concave distribution, volesti therefore currently requires as input a function that evaluates this gradient.
However, many applications require sampling from a distribution whose log-probability density gradient is unknown or tedious to derive by hand. This project will address that problem by providing volesti with routines for automatic differentiation of multivariate functions. A function that evaluates the log-density will then suffice to sample from the target distribution.
Automatic differentiation references and libraries:
The student will have to integrate the three automatic differentiation packages listed above into volesti and perform extensive comparison experiments to decide which option is the most efficient.
Difficulty: Hard
- Required: C++, Basic applied math background
- Preferred: Experience with statistical or other mathematical software is a plus
The expected impact is very high: the project will allow many researchers and practitioners to use volesti's efficient implementations for a broader class of distributions.
-
Marios Papachristou < papachristoumarios at gmail.com > is a PhD student in the Computer Science Department at Cornell University. His primary research interests lie within the field of Data Science. He has previous experience in GSoC 2018 and 2020 as a student under Org. FOSS and GeomScale. He was GSoC mentor in GSoC 2019.
-
Elias Tsigaridas <elias.tsigaridas at inria.fr> is an expert in computational nonlinear algebra and geometry with experience in mathematical software. He has contributed to the implementation, in C and C++, of several solving algorithms for various open-source computer algebra libraries and has previous GSOC mentoring experience with the R-project (2019).
Students, please contact the first and the third mentor only after completing at least one of the tests below.
- Easy: Compile and run volesti. Use the C++ code to sample from the Gaussian distribution using Hit-and-Run.
- Medium: Use the C++ code to sample from the Gaussian distribution using Hamiltonian Monte Carlo with the leapfrog integrator.
- Hard: Implement an automatic differentiation function in C++ based on first-order Taylor expansion.
Students, please post a link to your test results here.
- EXAMPLE STUDENT 1 NAME, LINK TO GITHUB PROFILE, LINK TO TEST RESULTS.
- Agilan S, https://github.com/Agi7an, https://github.com/Agi7an/VolEsti/blob/main/setup.r
- Ioannis Iakovidis, https://github.com/iakoviid/, https://github.com/iakoviid/volesti/tree/gsoc2022
- Hussain Lohawala, https://github.com/H9660/Volesti-/blob/main/setup.r
- Huu Phuoc Le, https://github.com/huuphuocle, https://github.com/huuphuocle/sampling_correlation_matrices