This talk presents a scalable and simple approach to bilevel optimization problems in which we only have access to noisy evaluations of the objective functions, and no access at all to their gradients or Hessians. The talk begins with some motivating applications of bilevel programming in machine learning. We then briefly survey zeroth-order methods for general optimization problems, focusing on a more recent technique called Gaussian smoothing, which estimates the first- and second-order derivatives of an objective function solely through random queries to the function. Finally, we propose a computational algorithm for general bilevel programs that requires no access to the Jacobians or Hessians of the underlying objectives. We present theoretical guarantees on the performance of the algorithm, and conclude with iteration-complexity (sample-complexity) results. To the best of our knowledge, the proposed algorithm is the first fully zeroth-order scheme in the literature for addressing general bilevel programs.
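To make the Gaussian-smoothing idea mentioned above concrete, here is a minimal sketch (not the speaker's actual algorithm) of the standard single-point-difference gradient estimator: for a smoothing radius mu, the quantity (f(x + mu*u) - f(x)) / mu * u, averaged over Gaussian directions u, approximates the gradient of a smoothed version of f using only function queries. The function name and parameters below are illustrative assumptions.

```python
import numpy as np

def gaussian_smoothing_grad(f, x, mu=1e-3, n_samples=20000, seed=0):
    """Zeroth-order gradient estimate via Gaussian smoothing.

    Averages (f(x + mu*u) - f(x)) / mu * u over u ~ N(0, I),
    which approximates grad f(x) for small mu using only
    function evaluations (no derivatives).
    """
    rng = np.random.default_rng(seed)
    d = x.shape[0]
    fx = f(x)  # reuse the base evaluation across all samples
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - fx) / mu * u
    return g / n_samples

# Sanity check on a quadratic f(x) = ||x||^2, whose true gradient is 2x
f = lambda x: np.dot(x, x)
x = np.array([1.0, -2.0, 0.5])
g_est = gaussian_smoothing_grad(f, x)
```

For a quadratic objective the estimator is unbiased up to O(mu) terms, so with a small mu and enough samples, g_est should closely match 2*x; in practice the sample count trades off against the estimator's variance, which is what the sample-complexity analysis in the talk quantifies.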
Alireza Aghasi joined the School of Electrical Engineering and Computer Science at Oregon State University in Fall 2022. From 2017 to 2022, he was an assistant professor in the Department of Data Science and Analytics at the School of Business, Georgia State University. Before that, he was a research scientist with the Department of Mathematical Sciences at the IBM T.J. Watson Research Center, Yorktown Heights. From 2015 to 2016 he was a postdoctoral associate with the computational imaging group at the Massachusetts Institute of Technology, and from 2012 to 2015 he was a postdoctoral research scientist with the compressed sensing group at Georgia Tech. His research focuses on optimization theory, probability theory, and statistical analysis, with applications to various areas of data science, artificial intelligence, and modern signal processing.