AI Seminar: The mathematics of neural networks: recent advances, thoughts, and the path forward

Event Speaker
Mikhail Belkin, Professor
University of California, San Diego
Event Type
Artificial Intelligence
Date
Event Location
Rogers 230
Event Description

The recent remarkable practical achievements of neural networks have far outpaced our theoretical understanding of their properties. Yet it is hard to imagine that progress can continue indefinitely without a deeper understanding of their fundamental principles and limitations. In this talk I will discuss some recent advances in the mathematics of neural networks, including some of our recent work, and outline what, in my opinion, are some promising directions for future research.

Speaker Biography

Mikhail Belkin received his Ph.D. in 2003 from the Department of Mathematics at the University of Chicago. His research interests are in the theory and applications of machine learning and data analysis. Some of his well-known work includes the widely used Laplacian Eigenmaps, Graph Regularization, and Manifold Regularization algorithms, which brought ideas from classical differential geometry and spectral analysis to data science. His recent work has been concerned with understanding the remarkable mathematical and statistical phenomena observed in deep learning. This empirical evidence has necessitated revisiting some of the basic concepts in statistics and optimization. One of his key recent findings is the "double descent" risk curve, which extends the textbook U-shaped bias-variance trade-off curve beyond the point of interpolation. He has served on the editorial boards of the Journal of Machine Learning Research, IEEE Transactions on Pattern Analysis and Machine Intelligence, and the SIAM Journal on Mathematics of Data Science.
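The double descent phenomenon mentioned above can be reproduced in a toy experiment: fit a min-norm least-squares model on random ReLU features and watch the averaged test error rise as the number of features approaches the number of training points (the interpolation threshold), then fall again as the model grows far beyond it. The setup below is a hedged illustration, not the speaker's experiment: the target function, noise level, and widths are arbitrary choices made for this sketch.

```python
import numpy as np

def double_descent_errors(n_train=20, widths=(2, 5, 10, 15, 20, 25, 40, 80, 160),
                          n_trials=30, seed=0):
    """Average test MSE of min-norm least squares on random ReLU features.

    Toy setup (assumption): 1-D target y = sin(2*pi*x) with Gaussian noise.
    Returns a dict mapping each width p to its trial-averaged test error.
    """
    rng = np.random.default_rng(seed)
    errs = np.zeros(len(widths))
    for _ in range(n_trials):
        # Small noisy training set and a clean test set.
        x_tr = rng.uniform(-1, 1, n_train)
        y_tr = np.sin(2 * np.pi * x_tr) + 0.3 * rng.standard_normal(n_train)
        x_te = rng.uniform(-1, 1, 200)
        y_te = np.sin(2 * np.pi * x_te)
        for i, p in enumerate(widths):
            # Random ReLU feature map with p features.
            W = rng.standard_normal(p)
            b = rng.uniform(-1, 1, p)
            F_tr = np.maximum(0.0, np.outer(x_tr, W) + b)
            F_te = np.maximum(0.0, np.outer(x_te, W) + b)
            # lstsq returns the minimum-norm solution when p > n_train,
            # which is what produces the second descent.
            coef, *_ = np.linalg.lstsq(F_tr, y_tr, rcond=None)
            errs[i] += np.mean((F_te @ coef - y_te) ** 2)
    return dict(zip(widths, errs / n_trials))

if __name__ == "__main__":
    errors = double_descent_errors()
    for p, e in errors.items():
        print(f"width {p:4d}: test MSE {e:.3f}")
```

The error typically peaks near width p = n_train, where the feature matrix is square and ill-conditioned, and decreases again in the heavily over-parameterized regime, in line with the double descent curve described in the biography.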