Erlangen AI Hub Seminar: Geometry, Complexity, and Generalization in Learning Systems
April 30 @ 1:00 pm - 2:00 pm
Compression-based complexity measures have been used to construct non-vacuous generalization bounds for deep neural networks. In this talk, Branton DeMoss (University of Oxford) will discuss the relationship between compression, complexity, and the geometry of the loss landscape. He will use a geometric complexity measure to track memorization and generalization in pathological deep learning phenomena such as grokking and double descent, and will refute a common criticism of sharpness-based generalization measures: their lack of parameterization invariance.
Register here: Erlangen AI Hub Seminar: Geometry, Complexity, and Generalization in Learning Systems
