
Maximum likelihood estimation of a multidimensional log concave density

Apostolos Chalkis edited this page Feb 25, 2022 · 12 revisions

Background

Modern non-parametric density estimation is a crucial problem that appears in many applications in computational statistics, machine learning and Bayesian inference. In [1], Cule, Samworth and Stewart provide the maximum likelihood estimator of a multidimensional log-concave density given a set of n i.i.d. observations, combining techniques from computational geometry and from non-differentiable convex optimization. The algorithm is implemented in the CRAN package LogConcDEAD. However, it requires the computation of a triangulation of the point set, so its run-time grows exponentially with the dimension and its efficiency is limited for dimensions larger than, say, ~20. The same holds for the methods in [3, 4]. The student will implement an approximation algorithm based on [1] that uses Markov Chain Monte Carlo integration to solve the optimization problem arising in [1].
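To fix ideas, the convex program solved in [1] can be sketched as follows (this is a summary from memory of the formulation; see [1] for the precise statement). Given observations \(X_1,\dots,X_n \in \mathbb{R}^d\), the log-concave MLE is \(\exp(\bar h_{y^*})\), where \(y^*\) minimizes the convex but non-differentiable objective

```latex
\sigma(y) \;=\; -\frac{1}{n}\sum_{i=1}^{n} y_i
\;+\; \int_{\mathbb{R}^d} \exp\!\bigl(\bar{h}_y(x)\bigr)\,\mathrm{d}x ,
```

where \(\bar h_y\) denotes the least concave function with \(\bar h_y(X_i) \ge y_i\) for all \(i\). Evaluating the integral exactly is what requires the triangulation of the point set; replacing that exact evaluation by MCMC integration is the source of the hoped-for speed-up in high dimensions.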

Details of your coding project

The student will implement (a) the MCMC integration algorithm of [2], (b) an approximate Shor's r-algorithm based on (a), and (c) an approximation algorithm for maximum likelihood estimation of a multidimensional log-concave density given a set of n i.i.d. observations.

Difficulty: Hard

Expected impact

The aim is for the new implementation to be the most efficient among existing open-source software packages for non-parametric density estimation.

[1] M. Cule, R. Samworth, M. Stewart, Maximum likelihood estimation of a multi-dimensional log-concave density, 2010.
[2] L. Lovász, S. Vempala, Fast Algorithms for Logconcave Functions: Sampling, Rounding, Integration and Optimization, 2006.
[3] E. Robeva, B. Sturmfels, N. Tran, C. Uhler, Maximum Likelihood Estimation for Totally Positive Log-Concave Densities, 2018.
[4] A. Grosdos, A. Heaton, K. Kubjas, O. Kuznetsova, G. Scholten, M. S. Sorea, Exact Solutions in Log-Concave Maximum Likelihood Estimation, 2020.

Mentors

  • Elias Tsigaridas <elias.tsigaridas at inria.fr> is an expert in computational nonlinear algebra and geometry with experience in mathematical software. He has contributed to the implementation, in C and C++, of several solving algorithms for various open source computer algebra libraries and has previous GSoC mentoring experience with the R project (2019).

  • Apostolos Chalkis <tolis.chal at gmail.com> is a PhD student in Computer Science. His research focuses on mathematical computing, optimization and computational finance. He has previous experience in GSoC 2018 and 2019 as a student with the R project organization, implementing state-of-the-art algorithms for sampling from high-dimensional multivariate distributions. He was a GSoC mentor in three projects with Geomscale (2020). He is one of the authors of volesti.

  • Ioannis Psarros <ipsarros at di.uoa.gr> is a postdoctoral researcher at the University of Bonn. He is an expert in geometric approximation algorithms and mathematical computing.

  • Christos Konaxis <ckonaxis at gmail.com> is a postdoctoral researcher at the ERGA group, University of Athens. He is an expert in algebraic geometry and mathematical software.

Tests

  • Easy: Download, compile and run a simple sampling example with both the C++ and R interfaces of volesti. For example, you can sample uniformly distributed points from a 100-dimensional cube using all the random walks implemented in volesti and project the points onto a plane to demonstrate the mixing of the random walks.

  • Medium: Given an evaluation oracle of a strongly convex function, implement the ball walk to sample from the corresponding log-concave distribution truncated to a polytope. You are free to choose whether the oracle is written in C++.

  • Hard: Implement the gradient descent algorithm when, additionally, an evaluation oracle for the gradient of the strongly convex function is given. Use the step size of the Barzilai–Borwein method. Again, you are free to choose whether the gradient oracle is written in C++.

Solution of Tests