This talk was part of the workshop Data Sciences for Mesoscale and Macroscale Materials Models.
Optimization, Sampling and Generative Modeling in Non-Euclidean Spaces
Molei Tao, Georgia Institute of Technology
Wednesday, May 15, 2024
Abstract: Machine learning in non-Euclidean spaces has been rapidly attracting attention in recent years, and this talk will give some examples of progress on its mathematical and algorithmic foundations. A sequence of developments that eventually leads to non-Euclidean generative modeling will be reported.
More precisely, I will begin with variational optimization, which, together with a delicate interplay between continuous- and discrete-time dynamics, enables the construction of momentum-accelerated algorithms that optimize functions defined on manifolds. I will then turn the optimization dynamics into an algorithm that samples probability distributions on Lie groups. If time permits, the efficiency and accuracy of the sampler will also be quantified via a new, non-asymptotic error analysis. Finally, I will describe how this sampler leads to a structurally pleasant diffusion generative model: given training data that follow any latent statistical distribution on a Lie group, it allows users to generate more data on exactly the same manifold, following the same distribution. Hopefully this will draw connections to important potential applications such as molecule and material design.
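To give a flavor of the first ingredient, here is a minimal toy sketch of momentum-style gradient descent on a manifold. It is not the variational algorithm of the talk; it only illustrates the generic ingredients (tangent-space gradients, a retraction back onto the manifold) on the simplest example, minimizing a Rayleigh quotient over the unit sphere. All names and parameter values below are illustrative choices, not from the talk.

```python
import numpy as np

# Illustrative sketch only: heavy-ball momentum descent on the unit sphere
# S^{n-1}, minimizing the Rayleigh quotient f(x) = x^T A x. The global
# minimizers are the eigenvectors of the smallest eigenvalue of A.

def sphere_momentum_descent(A, x0, lr=0.05, beta=0.8, steps=1000):
    x = x0 / np.linalg.norm(x0)
    v = np.zeros_like(x)
    for _ in range(steps):
        g = 2 * A @ x                 # Euclidean gradient of x^T A x
        g = g - (g @ x) * x           # project onto the tangent space at x
        v = beta * v + g              # heavy-ball momentum update
        v = v - (v @ x) * x           # keep momentum tangent (crude transport)
        x = x - lr * v
        x = x / np.linalg.norm(x)     # retraction: renormalize onto the sphere
    return x

rng = np.random.default_rng(0)
A = np.diag([3.0, 2.0, 1.0])          # smallest eigenvalue 1.0, eigenvector e_3
x = sphere_momentum_descent(A, rng.standard_normal(3))
print(float(x @ A @ x))               # converges toward the smallest eigenvalue, 1.0
```

The projection and renormalization steps are the manifold-specific replacements for the plain vector updates of Euclidean momentum methods; the talk's contribution concerns designing such updates in a principled, variational way.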
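For the sampling ingredient, a toy stand-in on the simplest Lie group, the circle U(1), may help fix ideas. The sketch below runs Euler–Maruyama Langevin dynamics targeting a von Mises density, wrapping the angle via the group's exponential map (addition mod 2π). It is a generic illustration under assumed parameters, not the sampler or the error analysis presented in the talk.

```python
import numpy as np

# Illustrative sketch only: overdamped Langevin sampling on the circle U(1),
# targeting the von Mises density p(theta) ∝ exp(kappa * cos(theta)).

def circle_langevin(kappa=2.0, h=0.01, steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = np.empty(steps)
    for i in range(steps):
        drift = -kappa * np.sin(theta)  # -dU/dtheta for U(theta) = -kappa*cos(theta)
        theta += h * drift + np.sqrt(2 * h) * rng.standard_normal()
        # wrap to (-pi, pi]: the group structure keeps iterates on the manifold
        theta = (theta + np.pi) % (2 * np.pi) - np.pi
        samples[i] = theta
    return samples

s = circle_langevin()
# the samples concentrate around the mode theta = 0; the circular mean
# E[cos(theta)] under von Mises(kappa=2) is I_1(2)/I_0(2) ≈ 0.70
print(np.mean(np.cos(s)))
```

On a general Lie group the scalar addition above is replaced by the group exponential of a Lie-algebra increment; quantifying the discretization bias of such schemes non-asymptotically is exactly the kind of question the talk's error analysis addresses.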