Themes

Scientific Themes

Scientific activity at IMSI is organized around a set of themes which have been chosen as focal points for sustained engagement over many years.  These themes typically represent areas which we expect will provide fertile ground for interdisciplinary work involving the mathematical sciences, and where progress has the potential for significant benefits to society.  Long programs, interdisciplinary research clusters, and research collaboration workshops are generally expected to relate strongly to one or more themes.  Standalone workshops will often relate to themes, but may range further afield. The set of themes will evolve over time.  Here, we give a brief description of the Institute’s current themes.

Climate change is one of the most pressing issues facing the modern world, with growing and important impacts on natural systems and human life. The increasing frequency and intensity of extreme weather and climate events have powerful consequences. Hence, it is important to have models that can be used to make effective predictions on a variety of geographical and time scales. 

The impact of these phenomena on human life is enormous.  Intense and frequent heat waves caused by climate change can be fatal. Climate change is also expected to alter the geographic and seasonal distributions of existing vectors and vector-borne diseases. Rising temperature, changing precipitation patterns, and a higher frequency of some extreme weather events associated with climate change will influence the distribution, abundance, and prevalence of infection in vectors and pathogens. 

Climate is a complex coupled system involving significant physical processes for the atmosphere, ocean, and land over a wide range of spatial scales (from millimeters to thousands of kilometers) and time scales (minutes to decades or centuries). Climate research has traditionally relied upon comprehensive computational atmospheric and oceanic simulation models designed to mimic complex physical spatio-temporal patterns in nature. Whether through lack of resolution due to limitations in computing power, or through an inability to model nature from first principles, such simulations require parameterized models of many features of the climate system. There are, however, intrinsic model errors that are not readily ascertained. Climate change science must cope with predicting the coarse-grained dynamic changes of an extremely complex system by using a combination of first principles and models for which uncertainty quantification is limited at best.

A highly nontrivial interplay of partial differential equations, probability theory, numerics and large-scale computing, and mathematical and statistical modeling of high-dimensional turbulent dynamical systems will need to come into play to make progress in this very important area.
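
As a purely illustrative sketch of the parameterization issue described above (a toy two-scale system written for this description; the variables, closure, and coefficients are invented rather than drawn from any climate model), a coarse model that replaces an unresolved fast variable with a fitted closure inherits a structural error that tuning the resolved dynamics alone cannot remove:

```python
# A toy illustration, not a climate model: the "truth" couples a slow variable x to a
# fast unresolved variable y; the coarse model replaces the coupling term with a
# simple closure. The residual gap between the two is structural model error of the
# kind parameterized climate simulations must contend with.
import numpy as np

def truth_rhs(state, F=8.0, eps=0.05):
    """Full system: slow x, forced and damped, coupled to a fast variable y."""
    x, y = state
    dx = -x + F - 0.5 * y            # slow dynamics influenced by the unresolved y
    dy = (-y + x**2 / 10.0) / eps    # fast dynamics relaxing toward a function of x
    return np.array([dx, dy])

def coarse_rhs(x, F=8.0, a=0.04):
    """Coarse model: the unresolved term 0.5*y is replaced by the closure a*x**2."""
    return -x + F - a * x**2

def rk4(f, u, dt):
    k1 = f(u); k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2); k4 = f(u + dt * k3)
    return u + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt, nsteps = 0.001, 20000
full, coarse = np.array([1.0, 0.0]), 1.0
for _ in range(nsteps):
    full = rk4(truth_rhs, full, dt)
    coarse = rk4(coarse_rhs, coarse, dt)

print(f"slow variable, resolved truth: {full[0]:.3f}")
print(f"slow variable, coarse model:   {coarse:.3f}  (gap = structural model error)")
```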

The surging data generation capabilities of modern sensors and networked systems and the vastly increased data processing power of computers and storage media have led to the accumulation of enormous volumes of disparate data. The nascent field of data science focuses on developing scalable and robust algorithms for extracting knowledge from these stores of information. The growing need for powerful and novel methods to extract information from data, in a form that is useful to individuals, society, researchers, and industry, has led to a groundswell in machine learning. Recent progress has been remarkable. This success has been in large part driven by the increasing availability of large-scale training data sets, more powerful computers, and sophisticated algorithms for analyzing extremely large data sets. There is now intense interest in leveraging machine learning in many fields: automatic recognition of image content, identification of best practices in health care, improvement of agricultural yields, understanding how the human brain encodes information, and more.

However, many modern machine-learning algorithms lack interpretability, and can also be surprisingly fragile. Furthermore, training data can be skewed, resulting in algorithms whose outputs are unexpectedly “unfair” or biased. Although the developments to date, driven primarily by phenomenological considerations, have been remarkably successful, substantial work remains to be done in order to reach a fundamental understanding of why these methodologies actually succeed.
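
The fragility mentioned above has a simple geometric core. In the hedged sketch below, a linear classifier trained on synthetic data stands in for a modern network (every data set, value, and helper here is invented for illustration): the smallest step across the decision boundary, taken along the weight vector, flips the prediction.

```python
# A minimal, illustrative sketch (not any specific production system): for a linear
# classifier, moving a test point just across the decision boundary, along the
# weight vector, flips the predicted label. This is the geometric idea behind
# adversarial examples; deep networks exhibit the same sensitivity in subtler ways.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

x = X[:1]                                   # one test point
w, b = clf.coef_[0], clf.intercept_[0]
margin = float(x @ w + b)                   # signed score at x

# smallest step (plus a tiny overshoot) that crosses the decision boundary
delta = -(margin + np.sign(margin) * 1e-6) * w / (w @ w)
x_adv = x + delta

print("original prediction: ", clf.predict(x)[0])
print("perturbed prediction:", clf.predict(x_adv)[0])
print("size of perturbation:", np.linalg.norm(delta), "vs. ||x|| =", np.linalg.norm(x))
```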

Progress can only come from the development of new and sophisticated mathematics and statistics. The study of data and information cuts across a myriad of disciplines, including computer science, statistics, optimization, and signal processing, and reaches into classical areas of mathematics. Furthermore, application-specific models and constraints in fields such as astrophysics, particle physics, biology, economics, and sociology present additional exciting opportunities for the mathematical and statistical  analysis of data.

Historically,  medicine has seen many applications of mathematics and statistics, with examples including the validation of the effectiveness of new drugs, estimation of survival rates for patients undergoing treatments, and medical imaging (CT scans and MRIs). Going forward, health care data analytics, which bridges the fields of computer science, engineering, statistics and medicine, will improve the delivery of health care, increase the precision of medicine, improve the quality of life, and extend the human lifespan. For example, mining of genetic data could inform the creation of an early warning system to predict the onset of diseases, such as cancer or diabetes, in at-risk patients.

There is a dire need to develop new methods to improve health and fight disease using data science, computational modeling, and machine learning. Some of the most pressing issues are the identification of medical procedures that are most likely to be effective given an individual’s specific genetic makeup, the understanding of how the changing environment will affect the emergence and spread of infectious diseases, and the streamlining of health care delivery to improve both cost and patient outcomes.

Currently, a great deal of effort is being put into collecting and interpreting data on health and medical care and, in turn, building better models for diseases, responses to treatment, prognostic plans, diagnostic regimes, vaccinations, and mental health. However, as better and more complex models are developed and back-tested, there is a pressing need to develop equally sophisticated decision-making models for their subsequent use. This disparity both requires and motivates the development of richer models for complex health and medical care settings, allowing, for example, for high-dimensional state spaces (multi-attribute clinical profiles), path dependence (medical history and response to recent treatments), state and control constraints (morbidity factors, coexisting medical conditions), optimal stopping (best time to induce labor, stop treatment, start prophylactic medication, etc.), optimal pasting across random horizons (corresponding to conditional responses to sequential treatments), and more.
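
To make the optimal stopping ingredient concrete, here is a minimal sketch by backward induction. The health score, transition probabilities, rewards, and per-period treatment burden are all invented placeholders, not a validated clinical model: the decision maker continues treatment while the score has room to improve and stops once continuing is no longer worth its cost.

```python
# A minimal sketch of finite-horizon optimal stopping by backward induction.
# All quantities are illustrative: the "state" is a discretized health score,
# stopping yields a reward depending on the score, and continuing incurs a
# per-period burden while the score evolves randomly.
import numpy as np

T = 12                      # decision epochs (e.g., monthly reviews)
scores = np.arange(0, 11)   # discretized health score 0..10
p_up, p_down = 0.6, 0.2     # chance the score improves / worsens each period
cost = 0.1                  # per-period burden of continuing treatment

def stop_reward(s):
    return float(s)         # value of stopping at score s (placeholder)

V = np.array([stop_reward(s) for s in scores])   # value at the final horizon T
policy = []                                      # stop region per epoch

for t in reversed(range(T)):
    V_next = np.empty_like(V)
    stop_here = np.zeros_like(scores, dtype=bool)
    for i, s in enumerate(scores):
        up, down = min(i + 1, len(scores) - 1), max(i - 1, 0)
        continue_value = -cost + (p_up * V[up] + p_down * V[down]
                                  + (1 - p_up - p_down) * V[i])
        V_next[i] = max(stop_reward(s), continue_value)
        stop_here[i] = stop_reward(s) >= continue_value
    V = V_next
    policy.append((t, scores[stop_here]))

t0, stop_states = policy[-1]
print(f"scores where stopping is optimal at t={t0}:", stop_states)
```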

There is also a need to develop sophisticated and reliable models to quantify and assess risk related to various decisions in health and medical care. Methods from financial mathematics and quantitative finance can play a major role here, as they are directly related to the concepts of risk quantification, risk management, project valuation, contract design, and more. Indeed, there are direct analogies between valuing financial derivatives and contracts, on the one hand, and pricing a medical treatment, assessing the risk of vaccines, optimally funding a health-care activity, or designing contracts between stakeholders, on the other.
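
The valuation analogy can be sketched in a few lines. The Monte Carlo example below treats an intervention exactly as one would treat a financial contract, as a discounted expected payoff over uncertain outcomes; every probability, payoff, cost, and horizon in it is an invented illustration rather than clinical or actuarial data.

```python
# A deliberately simplified sketch of pricing by simulation: the value of an
# intervention as a discounted expected net benefit over uncertain outcomes,
# computed by Monte Carlo just as a financial contract would be valued.
import numpy as np

rng = np.random.default_rng(0)
n_paths, horizon, discount = 100_000, 5, 0.97   # simulated patients, years, discount

responded = np.zeros(n_paths, dtype=bool)       # whether each patient has responded
value = np.zeros(n_paths)                       # discounted net benefit per patient
for year in range(1, horizon + 1):
    responded |= rng.random(n_paths) < 0.6      # 60%/year chance of a lasting response
    payoff = np.where(responded, 3.0, 0.0) - 1.0  # annual benefit minus annual cost
    value += discount**year * payoff

stderr = value.std(ddof=1) / np.sqrt(n_paths)
print(f"estimated value per patient: {value.mean():.2f}  (Monte Carlo s.e. {stderr:.3f})")
```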

Another direction, currently very much overlooked, concerns how individuals think about their health: the extent to which they fear disease and its treatment, how they react to information and uncertainty, how they perceive risks, whether they follow regimes, and more. Little work exists in this direction, but it would be possible to borrow fundamental concepts from decision analysis and economics to build meaningful models of how patients react to uncertainty while making decisions about their medical treatment, lifestyle, insurance plan, choice of medical facilities and specialized doctors, and so on.

The mathematical sciences have also played a significant role in studying a wide range of issues including, for example, oncology (tumor growth), cardiology (blood flow, the cardiovascular system, and heart conditions), and neurology (brain disorders). There remains room for progress in these directions as well.

The topics indicated above are by no means exhaustive. Rather, they provide an indication of the wide range of health- and medicine-related issues that can be tackled by mathematical and statistical techniques, and that have the potential to inspire the development of new tools and techniques in the mathematical sciences. With ever-increasing advances in medicine and health care, on the one hand, and in mathematical and statistical modeling and computational methods, on the other, this role may be strengthened significantly further through fruitful and multifaceted scientific interactions.

Materials science is about the discovery, design, and development of new materials in areas such as nanotechnology, biomedicine (biomaterials), metallurgy (composite materials in airplane design, nuclear reactors), forensic science (textiles), quantum computing (qubits), and the development of more efficient energy resources (new oxides), as well as in everyday life (“smart” cement). Breakthroughs in this area will have significant effects on technology and society.

The modeling of materials is done at different levels (microscopic, mesoscopic, and macroscopic) and involves many widely separated scales (think, for example, of the electric wires in a jumbo jet relative to the aircraft itself). This presence of numerous length scales and multiple levels of physical modeling in materials science problems represents a challenge for mathematics and numerical simulation, and it can take various forms. It begins with the inclusion of atomic-scale information, like that coming from electronic structure theory, and continues at coarser scales with, for example, the simulation of chemical reactions and detailed biological phenomena at the microscopic scale.

Combining scales is the purpose of Atomistic-to-Continuum models, which couple atomistic and continuum mechanics simulations for phenomena such as nano-indentation and micro-cracks. Alternatively, defects in crystalline materials may also be included as a first step toward modeling the nucleation, and then the motion, of dislocations, as well as disordered phases in amorphous materials, thermal fluctuations, and impurities or heterogeneities in a large variety of continuous media.

Standard methods available in the literature to address such problems often lead to very (sometimes prohibitively) expensive computations, even when implemented on large-scale computers. Practically relevant and affordable numerical methods are being developed, but there is still substantial room for improvement. In sharp contrast to existing well-established mathematical theories, it remains an open challenge to understand, model, simulate, and control real materials with all their inevitable imperfections. Issues such as the modeling of non-periodic materials and materials with defects, and how fatigue and aging affect the characteristics of materials, are not well understood quantitatively, or sometimes even qualitatively.

A related but distinct issue arising in the modeling of physical phenomena is coupling. The essence of such “multi-physics” modeling is the coupling between distinct physical processes whose mathematical descriptions differ substantially in (for example) the nature of the governing time evolution operators and boundary conditions. Examples include radiation hydrodynamics, coupling between fluid flows and bounding mechanical structures (like flows over dynamic airfoils), and coupling between physics at the atomic and mesoscales (like crack initiation and propagation). Correctly reconciling the modeling of such disparate, but coupled, physical processes is an extremely challenging problem that clearly is in the domain of mathematics and statistics.

The development of the sophisticated theoretical and computational tools that are needed in materials science will undoubtedly have significant ramifications for the future.

There are many challenges, both practical and theoretical, in the emerging and exciting areas of quantum information and computing, which seek to make effective use of the information embedded in the state of a quantum system, and promise to solve previously intractable computational problems and revolutionize simulation.

The greatest challenges are the development of hardware and algorithms.

The first challenge is, of course, an engineering problem and relates to the need for large numbers of qubits, greater connectivity, lower error rates, longer coherence times, and lower cost, to name a few.

The second challenge is where the mathematical sciences will play a fundamental role, and relates to the need to design algorithms that will make quantum computing work. Algorithms developed for classical computing, which operate on bits through deterministic logic, do not carry over directly; quantum algorithms must instead be built around the physics of quantum mechanics, exploiting superposition, entanglement, and interference.
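
As a minimal, hedged illustration of the kind of object such algorithms manipulate (plain numpy rather than any particular quantum SDK, with every value chosen purely for exposition), the following sketch builds the entangled Bell state that underlies many quantum protocols:

```python
# A minimal sketch of the state-vector arithmetic behind quantum algorithms:
# a Hadamard gate followed by a controlled-NOT turns |00> into the entangled
# Bell state (|00> + |11>)/sqrt(2).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # controlled-NOT on two qubits

state = np.array([1, 0, 0, 0], dtype=complex)   # start in |00>
state = np.kron(H, I) @ state                   # Hadamard on the first qubit
state = CNOT @ state                            # entangle the two qubits

probs = np.abs(state) ** 2
for basis, p in zip(["00", "01", "10", "11"], probs):
    print(f"P(|{basis}>) = {p:.2f}")            # 0.5 for 00 and 11, 0 otherwise
```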

New theory is required to understand the advantages and limitations of quantum media. This includes the development of quantum cryptography, promising “unhackable” communications; new quantum algorithms that could solve previously intractable computational problems and revolutionize simulation; remote verification; and the study of random circuits for benchmarking. In other words, there is a pressing need for the quantum analogue of the Turing machine, a rigorous foundation on which quantum computation can be built and analyzed.

The fundamental challenges are to advance the understanding and prediction of qubit materials that are needed for the next stage of quantum computing, and to uncover the possibility of delegating a quantum computation to a quantum device in such a way that the final outcome can be verified on a classical computer. New directions that need to be pursued as the theory advances include the security of proposed post-quantum cryptosystems, which are based on ideal lattices and on elliptic curve isogenies; the question is whether there exists a deep mathematical structure that can be exploited by quantum algorithms to break these cryptosystems. Another important direction moving forward is the development of highly efficient quantum error-correcting codes and fault-tolerant quantum computing. This area is due for a major new push in view of more physically relevant error models derived from experiments, as well as an expanded focus on efficiency, and it may be intimately related to coding theory.

At the moment, quantum information theory draws inspiration from many different areas of mathematics and statistics, including mathematical physics, representation theory, operator and random matrix theory, algebraic geometry and general quantum algebra, statistical inference and Monte Carlo methods, high-dimensional statistics, and compressed sensing algorithms.

This general subject is highly cross-disciplinary, with critical inputs from mathematics, statistics, computer science, information theory, physics, engineering, and other fields. Given the significance of information processing and communication in modern society, the potential of quantum computing and quantum information cannot be overstated.

Uncertainty is ubiquitous in the modern world. This raises profound challenges in any effort to model massively complex phenomena.

The construction of verifiable models is limited due to error and uncertainty that enter through many channels. When it comes to the modeling of highly complex systems, there are two potential sources for predictive inaccuracy. The first can be due to misestimation of the system’s initial, and possibly stochastic, conditions, or to unavoidable natural variation in the target phenomena. This is the random error, which is, at least in principle, detectable and correctable. The second (systematic error) may arise from systematic misspecification of the theoretical characteristics and dynamics of the system itself and  is difficult to detect empirically. However, advances in the interrogability of models could lead to stable mathematical moorings for the parametrizations and attendant weightings of the models, and thus to sound anchoring of the basic science. 
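
To make the distinction concrete, here is a minimal sketch (the models and numbers are invented for illustration): fitting a deliberately misspecified linear model to data generated by a quadratic truth shows the random error shrinking as observations accumulate, while the systematic error persists no matter how much data is collected.

```python
# A small sketch of random vs. systematic error: the "truth" is quadratic while the
# fitted model is (deliberately) linear. Run-to-run scatter in the estimate shrinks
# with sample size, but the bias from misspecification does not.
import numpy as np

rng = np.random.default_rng(1)

def fitted_intercept(n):
    x = rng.uniform(-1, 1, n)
    y = 1.0 + 2.0 * x + 1.5 * x**2 + rng.normal(0, 0.3, n)   # quadratic truth + noise
    A = np.column_stack([np.ones(n), x])                     # misspecified linear fit
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0]

for n in [100, 10_000, 1_000_000]:
    estimates = np.array([fitted_intercept(n) for _ in range(20)])
    print(f"n={n:>9}: intercept = {estimates.mean():.3f} +/- {estimates.std():.4f} "
          f"(true value at x=0 is 1.000; the persistent gap is systematic error)")
```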

Model uncertainty has considerably more complex effects on human decision making, as it directly affects attitudes and behaviors not only towards uncertainty itself but also towards the lack of accurate models for it. This creates a stochastic interplay between human beings and the environment which can be extremely difficult to analyze, and is seen in all aspects of everyday life, at both personal and collective levels. The actions of a decision maker (parent, patient, voter, consumer, traveler, etc.) also depend crucially on the perceptual horizon of the situation they encounter, which creates additional layers of difficulty as model decay and misspecification themselves change with time. Constructing informed beliefs and forecasts presents serious conceptual and computational challenges to both decision scientists and statisticians.

Dynamic models of single or multiple agents and learning require tractable solutions to learning and filtering problems. Although quasi-analytic solutions are available for only a limited number of such problems, there have been important advances in numerical solutions. The numerical methods involved are of interest to statisticians as a way to learn about parameters and, more generally, about the models themselves. Depending on the sector in which the applications arise, bespoke methods of analysis are needed. For example, in building economic models, these methods are applied to the people (economic agents) who seek to learn from historical data when making decisions. The objective functions of the decision makers should influence how the numerical learning algorithms are designed. For instance, economists care about how people respond to extreme events that occur infrequently but may have important, even detrimental, consequences for their well-being. On the other hand, building models for patients facing a potentially harsh treatment would need a different analysis of preferences towards uncertainty, based on more personalized profiles and much shorter horizons.
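
A hedged sketch of the filtering problems referred to above appears below: a scalar Kalman filter, the simplest case in which learning from sequential noisy observations has a closed recursive form. The state, noise levels, and horizon are all illustrative rather than drawn from any particular application.

```python
# A minimal sketch of recursive learning/filtering: a scalar Kalman filter tracking a
# slowly drifting hidden state from noisy observations, updating a mean and variance
# one observation at a time.
import numpy as np

rng = np.random.default_rng(2)
T = 200
q, r = 0.01, 0.5            # process and observation noise variances (illustrative)

# simulate a hidden random-walk state and noisy observations of it
x = np.cumsum(rng.normal(0, np.sqrt(q), T)) + 1.0
y = x + rng.normal(0, np.sqrt(r), T)

m, P = 0.0, 10.0            # prior mean and variance for the state estimate
for t in range(T):
    P += q                  # predict: the state may have drifted
    K = P / (P + r)         # Kalman gain
    m += K * (y[t] - m)     # update the mean with the new observation
    P *= (1 - K)            # update the posterior variance

print(f"final estimate {m:.2f} vs. true state {x[-1]:.2f} (posterior sd {np.sqrt(P):.2f})")
```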

Although identifying and quantifying uncertainty is of paramount importance, it is equally critical to assess, manage, and mitigate the emerging risks. This is where mathematics and statistics can provide the effective, rigorous models and tools needed to deal with these issues.