Description

Modern machine learning (ML) methods, coupled with new optimization and statistical inference strategies, have demonstrated an unprecedented potential to solve challenging problems in computer vision, natural language processing, healthcare, agriculture, and other application areas. However, a foundational understanding of how and when particular methods are appropriate and most effective for a given task is still emerging. Central to this endeavor is understanding the different facets of the complexity of machine learning tasks, including sample complexity, computational complexity, Kolmogorov complexity, oracle complexity, memory complexity, model complexity, and the stationarity of the learning problem. This workshop will focus on developing a better understanding of these different types of complexity, how they can be jointly leveraged to understand the solvability of learning problems, and the fundamental trade-offs among them.
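To make one of these notions concrete, consider a textbook illustration (not taken from the workshop materials): in the PAC model, the sample complexity of learning a finite hypothesis class H to accuracy \epsilon with confidence 1 - \delta in the realizable setting is

\[ m(\epsilon, \delta) = O\!\left( \frac{1}{\epsilon} \left( \ln |H| + \ln \frac{1}{\delta} \right) \right), \]

while the agnostic setting requires on the order of \frac{1}{\epsilon^2} \left( \ln |H| + \ln \frac{1}{\delta} \right) samples. Trade-offs of the kind studied at this workshop arise when such statistical guarantees must be balanced against computational, memory, or communication constraints.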

Organizers

Avrim Blum, Toyota Technological Institute at Chicago
Olgica Milenkovic, Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
Lev Reyzin, Mathematics, University of Illinois at Chicago
Matus Telgarsky, Computer Science, University of Illinois at Urbana-Champaign
Rebecca Willett, Statistics and Computer Science, University of Chicago

Speakers

Jayadev Acharya, Cornell University
Peter Bartlett, University of California, Berkeley
Kamalika Chaudhuri, University of California, San Diego
Jelena Diakonikolas, University of Wisconsin-Madison
Vitaly Feldman, Apple
Surbhi Goel, Microsoft Research NYC
Moritz Hardt, University of California, Berkeley
Daniel Hsu, Columbia University
Stefanie Jegelka, MIT
Adam Klivans, University of Texas at Austin
Andreas Krause, ETH Zurich
Aryeh Kontorovich, Ben-Gurion University
Samory Kpotufe, Columbia University
Po-Ling Loh, University of Wisconsin-Madison
Andrej Risteski, Carnegie Mellon University
Tselil Schramm, Stanford University
Gregory Valiant, Stanford University
Rachel Ward, University of Texas at Austin

Schedule

Monday, April 12, 2021
9:00-9:45 CDT
Contrastive Learning, multi-view redundancy, and linear models

Speaker: Daniel Hsu (Columbia University)

10:30-11:15 CDT
Functions with average smoothness: structure, algorithms, and learning

Speaker: Aryeh Kontorovich (Ben-Gurion University)

13:00-13:45 CDT
Benign overfitting

Speaker: Peter Bartlett (University of California, Berkeley)

15:15-16:00 CDT
Robust regression with covariate filtering: Heavy tails and adversarial contamination

Speaker: Po-Ling Loh (University of Wisconsin-Madison)

Tuesday, April 13, 2021
9:00-9:45 CDT
Sparse Random Features for Function Approximation

Speaker: Rachel Ward (University of Texas at Austin)

10:30-11:15 CDT
New Problems and Perspectives on Sampling, Memory, and Learning

Speaker: Gregory Valiant (Stanford University)

13:00-13:45 CDT
Computational/Statistical Gaps for Learning Neural Networks

Speaker: Adam Klivans (University of Texas at Austin)

14:30-15:15 CDT
Performative Prediction

Speaker: Moritz Hardt (University of California, Berkeley)

Wednesday, April 14, 2021
9:00-9:45 CDT
Learning from data under information constraints: Fundamental limits and applications

Speaker: Jayadev Acharya (Cornell University)

10:30-11:15 CDT
On Min-Max Optimization and Halpern Iteration

Speaker: Jelena Diakonikolas (University of Wisconsin-Madison)

13:00-13:45 CDT
Chasing the Long Tail: What Neural Networks Memorize and Why

Speaker: Vitaly Feldman (Apple)

Thursday, April 15, 2021
9:00-9:45 CDT
Learning in Graph Neural Networks

Speaker: Stefanie Jegelka (Massachusetts Institute of Technology)

10:30-11:15 CDT
Sampling Beyond Log-concavity

Speaker: Andrej Risteski (Carnegie Mellon University)

13:00-13:45 CDT
Statistical Query Algorithms and Low-Degree Tests Are Almost Equivalent

Speaker: Tselil Schramm (Stanford University)

14:30-15:15 CDT
Computational Complexity of Learning ReLUs

Speaker: Surbhi Goel (Microsoft Research NYC)

Friday, April 16, 2021
9:00-9:45 CDT
Some Recent Insights on Transfer-Learning

Speaker: Samory Kpotufe (Columbia University)

10:30-11:15 CDT
Information-directed Exploration in Bandits and Reinforcement Learning

Speaker: Andreas Krause (ETH Zurich)

13:00-13:45 CDT
The Mysteries of Adversarial Robustness for Non-parametric Methods and Neural Networks

Speaker: Kamalika Chaudhuri (University of California, San Diego)


Videos

Contrastive Learning, multi-view redundancy, and linear models

Daniel Hsu
April 12, 2021

Functions with average smoothness: structure, algorithms, and learning

Aryeh Kontorovich
April 12, 2021

Benign overfitting

Peter Bartlett
April 12, 2021

Robust regression with covariate filtering: Heavy tails and adversarial contamination

Po-Ling Loh
April 12, 2021

Sparse Random Features for Function Approximation

Rachel Ward
April 13, 2021

New Problems and Perspectives on Sampling, Memory, and Learning

Gregory Valiant
April 13, 2021

Computational/Statistical Gaps for Learning Neural Networks

Adam Klivans
April 13, 2021

Performative Prediction

Moritz Hardt
April 13, 2021

Learning from data under information constraints: Fundamental limits and applications

Jayadev Acharya
April 14, 2021

On Min-Max Optimization and Halpern Iteration

Jelena Diakonikolas
April 14, 2021

Chasing the Long Tail: What Neural Networks Memorize and Why

Vitaly Feldman
April 14, 2021

Learning in Graph Neural Networks

Stefanie Jegelka
April 15, 2021

Sampling Beyond Log-concavity

Andrej Risteski
April 15, 2021

Statistical Query Algorithms and Low-Degree Tests Are Almost Equivalent

Tselil Schramm
April 15, 2021

Computational Complexity of Learning ReLUs

Surbhi Goel
April 15, 2021

Some Recent Insights on Transfer-Learning

Samory Kpotufe
April 16, 2021

Information-directed Exploration in Bandits and Reinforcement Learning

Andreas Krause
April 16, 2021

The Mysteries of Adversarial Robustness for Non-parametric Methods and Neural Networks

Kamalika Chaudhuri
April 16, 2021