Local complexity measures in modern parameterized function classes for supervised learning

CMSA EVENTS: CMSA COLLOQUIUM

When: October 7, 2024
4:30 pm - 5:30 pm
Where: CMSA, 20 Garden St, G10
Address: 20 Garden Street, Cambridge, MA 02138, United States
Speaker: Elisenda Grigsby (Boston College)

The parameter space for any fixed architecture of neural networks serves as a proxy during training for the associated class of functions – but how faithful is this representation? For any fixed feedforward ReLU network architecture, it is well known that many different parameter settings can determine the same function. It is less well known that the degree of this redundancy is inhomogeneous across parameter space. I'll discuss two locally applicable complexity measures for ReLU network classes and what we know about the relationship between them: (1) the local functional dimension, and (2) a local version of VC dimension called persistent pseudodimension. The former is easy to compute on finite batches of points, while the latter should give local bounds on the generalization gap. I'll speculate about how this circle of ideas might help guide our understanding of the double descent phenomenon. All of the work described in this talk is joint with Kathryn Lindsey; some portions are also joint with Rob Meyerhoff, David Rolnick, and Chenxi Wu.
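To illustrate the "easy to compute on finite batches of points" claim, here is a minimal sketch of one standard way to estimate a local functional dimension: take the rank of the Jacobian of a ReLU network's batch outputs with respect to its parameters. The tiny architecture, sizes, and all helper names (`unpack`, `net`, `batch_jacobian`) are illustrative assumptions, not from the talk.

```python
import numpy as np

# Sketch: estimate the local functional dimension of a small feedforward
# ReLU network as the numerical rank of the Jacobian of the batch outputs
# with respect to the parameters. (Illustrative assumption, not the
# speaker's code.)

rng = np.random.default_rng(0)

d_in, d_hidden, d_out = 2, 3, 1          # tiny 2 -> 3 -> 1 ReLU network
n_params = d_hidden * (d_in + 1) + d_out * (d_hidden + 1)  # = 13

def unpack(theta):
    """Split a flat parameter vector into (W1, b1, W2, b2)."""
    i = 0
    W1 = theta[i:i + d_hidden * d_in].reshape(d_hidden, d_in); i += d_hidden * d_in
    b1 = theta[i:i + d_hidden]; i += d_hidden
    W2 = theta[i:i + d_out * d_hidden].reshape(d_out, d_hidden); i += d_out * d_hidden
    b2 = theta[i:i + d_out]
    return W1, b1, W2, b2

def net(theta, X):
    """Forward pass: X has shape (n, d_in); returns shape (n, d_out)."""
    W1, b1, W2, b2 = unpack(theta)
    H = np.maximum(X @ W1.T + b1, 0.0)   # ReLU hidden layer
    return H @ W2.T + b2

def batch_jacobian(theta, X, eps=1e-5):
    """Central-difference Jacobian of the flattened batch output w.r.t. theta."""
    cols = []
    for j in range(theta.size):
        e = np.zeros_like(theta); e[j] = eps
        cols.append((net(theta + e, X) - net(theta - e, X)).ravel() / (2 * eps))
    return np.stack(cols, axis=1)        # shape (n * d_out, n_params)

theta = rng.standard_normal(n_params)
X = rng.standard_normal((20, d_in))      # batch of 20 generic input points

J = batch_jacobian(theta, X)
fun_dim = np.linalg.matrix_rank(J, tol=1e-4)

# The positive-scaling symmetry of each ReLU hidden neuron contributes an
# exact kernel direction, so the rank is at most n_params - d_hidden = 10 here:
# redundancy in parameter space shows up as rank deficiency.
print("parameter count:", n_params, "| estimated local functional dimension:", fun_dim)
```

At generic parameter settings the rank falls short of the raw parameter count because of symmetries such as per-neuron positive rescaling, and it can drop further at more redundant settings, which is one way the inhomogeneity mentioned in the abstract becomes visible on a finite batch.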

Zoom ID 965 2902 1352
Passcode 322891
https://harvard.zoom.us/j/96529021352?pwd=ehXEylANVrstFfISgNJhjaPwcIuCby.1