This talk was part of: Reduced Order and Surrogate Modeling for Digital Twins
Short Talk: Provable in-context learning of PDEs
Yulong Lu, University of Minnesota, Twin Cities
Monday, November 10, 2025
Abstract: Transformer-based foundation models, pre-trained on a wide range of tasks with large datasets, demonstrate remarkable adaptability to diverse downstream applications, even with limited data. One of the most striking features of these models is their in-context learning (ICL) capability: when presented with a prompt containing examples from a new task alongside a query, they can make accurate predictions without any parameter updates. This emergent behavior has been recognized as a paradigm shift enabled by transformers, yet its theoretical underpinnings remain underexplored. In this talk, I will discuss recent theoretical results on ICL for PDEs, with an emphasis on its approximation power and generalization capabilities. The analysis will focus on two scientific problems: elliptic PDEs and stochastic dynamical systems.
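To make the ICL setup described above concrete, here is a minimal, hypothetical sketch (not the speaker's code) of how a prompt might be assembled for a simple 1D elliptic problem: several (forcing, solution) pairs from a new task are concatenated with a query forcing, and a pretrained transformer would then be asked to predict the query's solution with no parameter updates. All names and choices below (grid size, forcing distribution, number of context examples) are illustrative assumptions.

```python
# Hypothetical sketch: building an in-context learning prompt for the
# 1D elliptic problem -u'' = f on (0, 1) with u(0) = u(1) = 0.
import numpy as np

n = 64                                   # interior grid points
x = np.linspace(0.0, 1.0, n + 2)[1:-1]   # interior nodes
h = 1.0 / (n + 1)

# Second-difference matrix discretizing -u'' with Dirichlet boundary conditions.
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

def solve_poisson(f):
    """Finite-difference solve of -u'' = f, u(0) = u(1) = 0."""
    return np.linalg.solve(A, f)

rng = np.random.default_rng(0)

def random_forcing():
    """Random smooth forcing: a short sine series with Gaussian coefficients."""
    coeffs = rng.normal(size=4)
    return sum(c * np.sin((k + 1) * np.pi * x) for k, c in enumerate(coeffs))

# Context: k demonstration pairs (f_i, u_i) drawn from the task distribution.
k = 8
context = [(f, solve_poisson(f)) for f in (random_forcing() for _ in range(k))]

# Query forcing whose solution the model must infer purely in context.
f_query = random_forcing()

# Prompt: interleaved (f_i, u_i) pairs followed by the query forcing.
# A pretrained transformer would map this sequence to a prediction of the
# query solution without any gradient updates to its parameters.
prompt = np.stack([v for pair in context for v in pair] + [f_query])
print(prompt.shape)  # (2*k + 1, n) = (17, 64)
```

The theoretical questions mentioned in the abstract, approximation power and generalization, ask how well a transformer reading such prompts can recover the underlying solution operator and how the prediction error scales with the number of context examples.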