This talk was part of the Applied Optimal Transport program.

Inverse bounds for learning latent structures

Long Nguyen, University of Michigan

Monday, May 16, 2022



Abstract: Inverse bounds are inequalities that provide upper bounds on the distance between latent structures of interest in a suitable Wasserstein space in terms of a statistical distance (e.g., total variation, Hellinger, KL divergence) in the space of data populations. This type of bound is useful for deriving rates of parameter estimation (via either M-estimators or Bayesian methods) in latent structured models, including convolution models, mixture models, and hierarchical models. In this talk I will present several such inverse bounds for (i) mixing measures in mixture and convolution models, (ii) de Finetti's mixing measure in mixtures of product distributions, and (iii) mixing measures in the settings of contaminated models and regression with heterogeneous responses. The derivation of such inverse bounds requires a deeper investigation into conditions for the latent structure's identifiability, which sheds some light on the optimal transport geometry of latent structured models.
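As an illustration of the kind of inequality the abstract refers to, a representative inverse bound in the finite mixture setting can be sketched as follows. This is an illustrative form only: the exact exponent and constant depend on the specific model and on the strength of the identifiability conditions assumed.

```latex
% Illustrative inverse bound for finite mixtures (sketch; exponents and
% constants depend on the model and identifiability conditions assumed).
% G, G' denote mixing measures; p_G, p_{G'} the induced mixture densities;
% W_2 is the second-order Wasserstein distance and V the total variation
% distance between data distributions.
W_2^2(G, G') \;\le\; C \, V\bigl(p_G, p_{G'}\bigr)
% Reading: estimating the data density p_G at rate \epsilon_n in total
% variation yields an estimate of the mixing measure G at rate
% \epsilon_n^{1/2} in W_2 -- the statistical distance on data populations
% controls the transport distance on the latent structure.
```

The direction of the inequality is what makes it an "inverse" bound: closeness of the observable data distributions forces closeness of the unobserved mixing measures, which is exactly what is needed to transfer density estimation rates to parameter estimation rates.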