This talk was part of Bayesian Statistics and Statistical Learning.

Homotopy Continuation Techniques for Optimization in Variational Inference

Emma Cobian, University of Notre Dame

Friday, December 15, 2023



Abstract: Approximating probability distributions is an important task in statistics and machine learning. Optimization-based methods in variational inference, such as normalizing flows, have recently gained popularity because they yield approximations that support both sampling and density estimation. Normalizing flows are invertible mappings that transform a simple base distribution into a more complex one by optimizing the parameters of those mappings. When a distribution has complicated geometric structure or requires expensive model evaluations, producing an accurate approximation can be challenging. I will present optimization techniques, such as homotopy continuation, that improve computational efficiency and accurate convergence to the target distribution.
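
The abstract does not include any implementation details, so the following is only a minimal, self-contained sketch of the homotopy-continuation idea in a variational setting. It tempers between an easy base log-density and a harder target log-density, refitting a toy Gaussian "flow" at each step of the schedule while warm-starting from the previous solution. The target density, the affine variational family, the schedule, and all function names are illustrative assumptions, not the speaker's method.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: an asymmetric bimodal 1-D log-density
# (a stand-in for a geometrically complicated or expensive model).
def log_target(x):
    return np.logaddexp(np.log(0.7) - 0.5 * (x - 3.0) ** 2,
                        np.log(0.3) - 0.5 * (x + 3.0) ** 2)

# Simple base log-density: standard normal.
def log_base(x):
    return -0.5 * x ** 2

# Homotopy between base and target: t = 0 is the easy problem,
# t = 1 is the full target; intermediate t gives easier subproblems.
def log_homotopy(x, t):
    return (1.0 - t) * log_base(x) + t * log_target(x)

# Toy "flow": an affine map x = mu + sigma * z with z ~ N(0, 1).
mu, log_sigma = 0.0, 0.0

def elbo_grad(mu, log_sigma, t, n=256):
    # Monte Carlo ELBO gradient via reparameterization; finite differences
    # keep the sketch dependency-free (autodiff is the usual choice).
    z = rng.standard_normal(n)
    eps = 1e-4
    def elbo(m, ls):
        x = m + np.exp(ls) * z
        entropy = ls + 0.5 * np.log(2 * np.pi * np.e)  # entropy of N(m, e^{2 ls})
        return np.mean(log_homotopy(x, t)) + entropy
    g_mu = (elbo(mu + eps, log_sigma) - elbo(mu - eps, log_sigma)) / (2 * eps)
    g_ls = (elbo(mu, log_sigma + eps) - elbo(mu, log_sigma - eps)) / (2 * eps)
    return g_mu, g_ls

# Continuation: sweep t from 0 to 1, warm-starting each subproblem
# from the parameters fitted at the previous value of t.
for t in np.linspace(0.0, 1.0, 11):
    for _ in range(200):
        g_mu, g_ls = elbo_grad(mu, log_sigma, t)
        mu += 0.05 * g_mu
        log_sigma += 0.05 * g_ls
    print(f"t={t:.1f}  mu={mu:.3f}  sigma={np.exp(log_sigma):.3f}")

In an actual normalizing-flow setting, the affine map would be replaced by a parameterized invertible network and the gradients would come from automatic differentiation; the continuation schedule over t is the part intended to mirror the homotopy idea described in the abstract.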