• Erlangen Seminar Series: Studying the Geometry of Loops on the Möbius Band

    Date: Friday 23 January, 3–4pm
    Location: Online
    Title: Studying the Geometry of Loops on the Möbius Band

    From contours of 2D shapes to knotted proteins, many areas of statistics and the physical sciences contend with data in the form of geometric loops, i.e. embeddings of S1 into a Euclidean space. Similar to many problems …

  • Erlangen Hub Seminar: Computing Diffusion Geometry, Iolo Jones

    Calculus and geometry are ubiquitous in the theoretical modelling of scientific phenomena, but have historically been very challenging to apply directly to real data as statistics. Diffusion geometry is a new theory that reformulates classical calculus and geometry in terms of a diffusion process, allowing these theories to generalise beyond manifolds and be computed from …
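    The speaker's own construction is not given in the abstract, but the general idea of encoding geometry in a diffusion process computed from data can be sketched with a diffusion-maps-style operator. This is a minimal illustration, not the method presented in the talk; the Gaussian bandwidth `eps` and the toy circle dataset are illustrative assumptions:

    ```python
    import numpy as np

    def diffusion_operator(X, eps=0.5):
        """Build a Markov diffusion operator from a point cloud X (n x d).

        Diffusion-maps-style construction: a Gaussian affinity kernel is
        row-normalised into a transition matrix whose powers describe a
        diffusion process on the data points.
        """
        # Pairwise squared distances between all points
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        K = np.exp(-sq / eps)                 # Gaussian affinity kernel
        P = K / K.sum(axis=1, keepdims=True)  # row-stochastic Markov matrix
        return P

    # Toy example: points sampled from a noisy circle in the plane
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 2 * np.pi, 200)
    X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.normal(size=(200, 2))
    P = diffusion_operator(X)
    # Each row of P sums to 1, so P @ P @ ... gives multi-step diffusion
    # probabilities; eigenvectors of P recover the circle's geometry.
    ```

    The eigenvectors and eigenvalues of such an operator are what let classical geometric quantities be computed from sampled data rather than from a known manifold.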

  • Erlangen Hub Seminar: Riemannian Neural Optimal Transport, Alessandro Micheli

    Computational optimal transport (OT) provides a principled framework for generative modelling. Neural OT methods learn transport maps from data using neural networks and can be evaluated out of sample after training; however, existing approaches are largely restricted to Euclidean settings. Extending neural OT to high-dimensional Riemannian manifolds presents significant theoretical and computational challenges. In this …

  • Erlangen AI Hub Seminar: Estimating Intrinsic Dimensionality with L2N2

    Estimating the intrinsic dimensionality (ID) of data is a fundamental problem in machine learning and computer vision, as it reveals the true degrees of freedom underlying high-dimensional observations. In this talk, Eng-Jon Ong (QMUL) introduces L2N2, a simple yet powerful ID estimator based on nearest-neighbour distance ratios that achieves state-of-the-art performance with minimal computational overhead. …
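    The abstract does not give L2N2's formula, so the sketch below is not that estimator; it illustrates the same nearest-neighbour distance-ratio idea using the well-known TwoNN-style maximum-likelihood estimate, where the ratio of second- to first-nearest-neighbour distances follows a Pareto law whose shape parameter is the intrinsic dimension:

    ```python
    import numpy as np

    def twonn_id(X):
        """Estimate intrinsic dimension from 2nd/1st nearest-neighbour
        distance ratios (TwoNN-style maximum-likelihood estimate)."""
        # All pairwise Euclidean distances
        d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
        np.fill_diagonal(d, np.inf)   # exclude self-distances
        d.sort(axis=1)
        mu = d[:, 1] / d[:, 0]        # ratio r2 / r1 for each point
        # Under the model, mu is Pareto-distributed with shape = ID,
        # giving the closed-form MLE below.
        return len(mu) / np.log(mu).sum()

    # Sanity check: a 2D Gaussian cloud embedded in 5 ambient dimensions
    rng = np.random.default_rng(1)
    X = np.zeros((500, 5))
    X[:, :2] = rng.normal(size=(500, 2))
    print(twonn_id(X))  # close to 2: the data's true degrees of freedom
    ```

    Estimators in this family are attractive precisely because they need only sorted neighbour distances, which keeps the computational overhead minimal even for high-dimensional observations.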

  • Erlangen AI Hub Seminar: Geometry, Complexity, and Generalization in Learning Systems

    Compression-based complexity measures have been used to construct non-vacuous generalization bounds for deep neural networks. In this talk, Branton DeMoss (University of Oxford) will discuss the relationship between compression, complexity, and the geometry of the loss landscape. Using a geometric complexity measure to track memorization and generalization in some pathological deep learning phenomena like grokking …