CMSA Computer Science for Mathematicians: Optimization Methods in AI and Machine Learning: Submodularity and Beyond

May 18, 2021 11:30 am - 12:30 pm
via Zoom Video Conferencing
Speaker: Francesco Quinzan - HPI, Germany

Several optimization problems in AI and Machine Learning can be solved by maximizing functions that exhibit natural diminishing returns. Examples include feature selection for Generalized Linear Models, data summarization, and Bayesian experimental design. By leveraging diminishing returns, it is possible to design efficient approximation algorithms for these problems.

One of the simplest notions of diminishing returns is submodularity. Submodular functions are particularly interesting because they admit simple, yet non-trivial, polynomial-time approximation algorithms. In recent years, several definitions have been proposed to generalize the notion of submodularity. The study of these generalized functions has led to the design of efficient approximation algorithms for non-convex problems.

In this talk, I will discuss the notion of submodularity and illustrate relevant results on this topic, including interesting new combinatorial algorithms. I will also talk about generalizations of this notion to continuous domains, and how they translate into first- and second-order conditions. I will discuss how these notions pertain to interesting problems in AI and Machine Learning.
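As a concrete illustration (not part of the talk itself): a set function f is submodular if its marginal gains diminish, i.e. f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B) whenever A ⊆ B and x ∉ B. The sketch below shows one of the simple polynomial-time approximation algorithms the abstract alludes to, the classic greedy algorithm of Nemhauser, Wolsey, and Fisher, which achieves a (1 − 1/e) approximation for monotone submodular maximization under a cardinality constraint. The coverage function in the usage example is a hypothetical toy stand-in for the data summarization application mentioned above.

```python
# A minimal sketch, assuming only the standard greedy algorithm for
# maximizing a monotone submodular function f subject to |S| <= k.
def greedy_submodular_max(f, ground_set, k):
    """Repeatedly add the element with the largest marginal gain
    f(S | {x}) - f(S); uses O(n * k) evaluations of f."""
    selected = set()
    for _ in range(k):
        current = f(selected)
        best_gain, best_elem = 0.0, None
        for x in ground_set - selected:
            gain = f(selected | {x}) - current
            if gain > best_gain:
                best_gain, best_elem = gain, x
        if best_elem is None:  # no remaining element improves f
            break
        selected.add(best_elem)
    return selected

# Usage: coverage functions are submodular, so greedily choosing which
# sets to keep inherits the (1 - 1/e) guarantee.
if __name__ == "__main__":
    sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}, 3: {1, 6}}
    cover = lambda S: len(set().union(*(sets[i] for i in S)))
    print(greedy_submodular_max(cover, set(sets), k=2))  # e.g. {0, 2}
```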

Zoom: https://harvard.zoom.us/j/98231541450