The Multifaceted Complexity of Machine Learning
April 12-16, 2021
- Avrim Blum (Toyota Technological Institute at Chicago)
- Olgica Milenkovic (Electrical and Computer Engineering, UIUC)
- Lev Reyzin (Mathematics, UIC)
- Matus Telgarsky (Computer Science, UIUC)
- Rebecca Willett (Statistics and Computer Science, Chicago)
Modern machine learning (ML) methods, coupled with new optimization and statistical inference strategies, have demonstrated an unprecedented potential to solve challenging problems in computer vision, natural language processing, healthcare, agriculture, and other application areas. However, a foundational understanding of how and when particular methods are appropriate and most effective for tasks of interest is still emerging. A question at the heart of this endeavor is understanding the different facets of the complexity of machine learning tasks. These include sample complexity, computational complexity, Kolmogorov complexity, oracle complexity, memory complexity, model complexity, and the stationarity of the learning problem. This workshop will focus on developing a better understanding of these different types of complexity within machine learning, how they can be jointly leveraged to understand the solvability of learning problems, and the fundamental trade-offs among them.
Registration will open soon.