Celebrating Three PhD Successes

We are delighted to celebrate the recent PhD successes of Dr Yueqi Cao, Dr Roan Talbut, and Dr Qiquan (Qi) Wang, three early-career researchers whose work spans tropical geometry, statistical topology, and the mathematics of complex data. Their achievements reflect not only their own creativity and depth of insight but also the vibrant research environment shaped by their supervisor, Hub Co-Director Professor Anthea Monod, whose work in algebraic topology and algebraic geometry contributes to the understanding of modern statistical and machine-learning problems.


Dr Yueqi Cao — Tropical Geometry and Metric Graphs

Dr Yueqi Cao successfully defended his PhD, From Graphs to Point Clouds: the Tropical Abel–Jacobi Transform and Persistent Homology for Metric Graphs. His thesis develops rigorous links between tropical geometry, persistent homology, and statistical approaches for metric graphs, offering new tools for understanding geometric structure and opening pathways for applications in cryptography, information geometry, and machine-learning tasks on graph-structured data.

Yueqi’s doctoral work has led to four published journal papers across computational mathematics, data science, and statistics, with several more under review. He now continues his research as a Digital Futures Fellow at KTH Stockholm.


Dr Roan Talbut — Tropical Geometry for Phylogenetic Statistics

Dr Roan Talbut, now a Postdoctoral Research Associate at the Erlangen Hub, defended their PhD titled Tropical Geometry for Phylogenetic Statistics. Their research provides deep new insights into the intersection of tropical geometry, probability, statistics, and optimisation, developing tools that bring greater interpretability and computational tractability to the analysis of evolutionary and biological data.

Roan’s PhD resulted in several peer-reviewed publications across data science, optimisation theory, and pure mathematics, with further work in progress. They continue their academic journey at Durham University.


Dr Qiquan (“Qi”) Wang — Statistical Topology Across Biology and AI

Dr Qiquan Wang successfully defended her PhD, The Shape of Data: Statistical Topology Across Biology and AI. Her thesis establishes new statistical frameworks for analysing data using topological invariants in both single- and multi-parameter settings, and has applications to biological systems and deep-learning architectures.

Qi’s research has led to five papers, some already published, with additional manuscripts under review. She now moves on to a postdoctoral fellowship at Queen Mary University of London.



Recognising the Mathematical Foundations of Their Research

These PhD successes highlight the vibrancy of mathematical foundations research and the impact of early-career researchers contributing new ideas at the interface of mathematics and AI. We are proud to celebrate their achievements and look forward to seeing the exciting directions their work will take in the years ahead.

Conference Round-Up: CDC 2025 and NeurIPS 2025

Researchers across the Erlangen AI Hub continue to showcase their work on the international stage. This season, Hub members presented at the IEEE Conference on Decision and Control (CDC 2025) and NeurIPS 2025, one of the world’s leading AI gatherings. Their contributions span advances in autonomous systems, the mathematical foundations of control, and the growing use of generative AI in finance. The highlights are captured below.

Thom Badings delivering his CDC conference talk

Advances in Abstraction-Based Control at CDC 2025

Designing safe, reliable controllers for autonomous systems, from drones to self-driving vehicles, remains a fundamental challenge in AI. At CDC 2025, Erlangen Hub PDRA Thom Badings and Co-Investigator Alessandro Abate presented new research advancing abstraction-based control, a principled approach for computing correct-by-construction control policies under uncertainty.

Their two papers deliver key contributions:

  • Strengthening the mathematical foundations
    A refined abstraction framework capable of computing provably safe control policies even when system dynamics are uncertain. This work enhances both precision and scalability for complex autonomous platforms.
  • Introducing data-driven abstraction methods
    New techniques for constructing abstractions directly from empirical data, reducing reliance on fully specified analytical models and enabling robust control in partially known environments.
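To give a flavour of the data-driven idea, a minimal sketch is shown below: partition a system’s state space into abstract states and estimate abstract transition probabilities from sampled trajectories. The toy dynamics, partition, and sample sizes here are all hypothetical illustrations, not the method of the papers, which additionally provide formal correctness guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D stochastic system on the state space [-1, 1]:
# x' = 0.9 * x + Gaussian noise.
def step(x):
    return 0.9 * x + rng.normal(0.0, 0.05)

# Partition the state space into intervals (the abstract states).
edges = np.linspace(-1.0, 1.0, 11)   # 10 abstract states
n_states = len(edges) - 1

def abstract_state(x):
    # Map a concrete state to the index of the interval containing it.
    idx = np.searchsorted(edges, x, side="right") - 1
    return int(np.clip(idx, 0, n_states - 1))

# Estimate transition frequencies between abstract states from samples.
counts = np.zeros((n_states, n_states))
for _ in range(5000):
    x = rng.uniform(-1.0, 1.0)
    counts[abstract_state(x), abstract_state(step(x))] += 1

# Empirical abstract transition matrix (each row sums to 1).
P_hat = counts / counts.sum(axis=1, keepdims=True)
```

The resulting finite matrix `P_hat` can then be analysed with standard probabilistic model-checking tools, whereas the original continuous-state system cannot be handled directly.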

These developments push forward the frontier of reliable autonomous decision-making and contribute to the Hub’s broader mission to develop rigorous foundations for trustworthy AI.

Further reading:
• Probabilistic Alternating Simulations for Policy Synthesis in Uncertain Stochastic Dynamical Systems: https://arxiv.org/abs/2508.05062
• Data-Driven Abstraction and Synthesis for Stochastic Systems with Unknown Dynamics: https://arxiv.org/abs/2508.15543

Generative AI for Finance: Rama Cont at NeurIPS 2025

At the NeurIPS 2025 Workshop on Generative AI in Finance, Erlangen Hub Co-Investigator Rama Cont delivered an invited talk on how generative models are transforming quantitative finance.

Financial markets are noisy, nonlinear, and highly interdependent, making simulation and risk assessment especially challenging. Cont presented recent work demonstrating how GAN-based models can emulate complex market behaviour, generate realistic scenarios, and support robust risk management.

His talk covered several key generative approaches developed by Cont and collaborators, including:

  • VolGAN for stochastic volatility surfaces
  • Tail-GAN for modelling rare but high-impact tail events
  • YieldGAN for yield curve dynamics
  • Data-driven hedging with generative models, a method using conditional generative models to compute hedge ratios across simulated market scenarios

The last of these was the focus of his presentation and recent paper, which proposes a non-parametric approach to hedging that outperforms classical delta and delta-vega strategies, even years after the training period.
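The underlying idea can be illustrated in miniature: given joint scenarios of moves in an option’s value and its underlying, a hedge ratio can be chosen to minimise the variance of the hedged position across those scenarios. The sketch below uses a toy linear model in place of a trained generative model, and is an illustration of the general principle rather than the method of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated scenarios; in the generative-model setting these
# would instead be sampled from a model trained on market data.
n = 10_000
dS = rng.normal(0.0, 1.0, n)               # moves in the underlying
dV = 0.6 * dS + 0.1 * rng.normal(size=n)   # moves in the option value (toy)

# Variance-minimising hedge ratio over the scenarios:
#   h* = argmin_h Var(dV - h * dS) = Cov(dV, dS) / Var(dS)
h = np.cov(dV, dS)[0, 1] / np.var(dS, ddof=1)

# Hedging with h sharply reduces the variance of the residual P&L.
residual = dV - h * dS
```

With the toy dynamics above, `h` recovers something close to the true sensitivity of 0.6; the appeal of the scenario-based formulation is that it needs no closed-form model of the option’s dynamics.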

The workshop itself featured leading voices from academia and industry, reflecting the rapid growth of AI-driven approaches in financial modelling.

Paper:
Cont, R., Vuletić, M. Data-driven hedging with generative models. Ann Oper Res (2025)

Hub Leadership at NeurIPS 2025

NeurIPS 2025 was among the most competitive editions of the conference to date, with just 24.5% of submissions accepted. Against this backdrop, Erlangen AI Hub Director Michael Bronstein appeared as a co-author on nine accepted papers, presented across poster and spotlight sessions.

These contributions span generative and diffusion models, flow-based methods, equivariant and graph neural architectures, optimisation, and inference. All are core areas in the mathematical foundations of modern AI, and together, they reflect sustained engagement with both the theory and practice of scalable learning systems.

In a conference landscape increasingly shaped by large North American corporations and Chinese research institutions, this level of representation places Bronstein among a small group of Europe-based researchers maintaining strong technical visibility at NeurIPS, and highlights the continued contribution of UK and European research to the foundational questions shaping the field.

Further reading

Arroyo, Álvaro; Gravina, Alessio; Gutteridge, Benjamin; Barbero, Federico; Gallicchio, Claudio; Dong, Xiaowen; Bronstein, Michael; Vandergheynst, Pierre. On Vanishing Gradients, Over-Smoothing, and Over-Squashing in GNNs: Bridging Recurrent and Graph Learning. NeurIPS 2025

Finkelshtein, Ben; Ceylan, İsmail İlkan; Bronstein, Michael; Levie, Ron. Equivariance Everywhere All At Once: A Recipe for Graph Foundation Models. NeurIPS 2025

Gelberg, Yoav; Eitan, Yam; Navon, Aviv; Shamsian, Aviv; Putterman, Theo; Bronstein, Michael; Maron, Haggai. GradMetaNet: An Equivariant Architecture for Learning on Gradients. NeurIPS 2025

Marisca, Ivan; Bamberger, Jacob; Alippi, Cesare; Bronstein, Michael M. Over-squashing in Spatiotemporal Graph Neural Networks. NeurIPS 2025

Petrović, Katarina; Atanackovic, Lazar; Moro, Viggo; Kapuśniak, Kacper; Ceylan, İsmail İlkan; Bronstein, Michael; Bose, Avishek Joey; Tong, Alexander. Curly Flow Matching for Learning Non-gradient Field Dynamics. NeurIPS 2025

Reu, Teodora; Dromigny, Sixtine; Bronstein, Michael; Vargas, Francisco. Gradient Variance Reveals Failure Modes in Flow-Based Generative Models. NeurIPS 2025

Akhound-Sadegh, Tara; Lee, Jungyoon; Bose, Avishek Joey; De Bortoli, Valentin; Doucet, Arnaud; Bronstein, Michael M.; Beaini, Dominique; Ravanbakhsh, Siamak; Neklyudov, Kirill; Tong, Alexander. Progressive Inference-Time Annealing of Diffusion Models for Sampling from Boltzmann Densities. NeurIPS 2025

Tan, Charlie B.; Hassan, Majdi; Klein, Leon; Syed, Saifuddin; Beaini, Dominique; Bronstein, Michael M.; Tong, Alexander; Neklyudov, Kirill. Amortized Sampling with Transferable Normalizing Flows. NeurIPS 2025

Tang, Zhiyuan; Zhou, Yuhao; Zhao, Xuanlei; Shi, Mingjia; Wang, Wangbo; Huang, Kaixuan; Schürholt, Konstantin; Bronstein, Michael M.; You, Yang; Wang, Zhangyang; Wang, Kai. Drag-and-Drop LLMs: Zero-Shot Prompt-to-Weights. NeurIPS 2025

Meet the team Q&A

Continuing our Q&As, where hub members kindly answer a set of questions to share more about themselves and their work, we introduce Hub Co-Director and Imperial Lead Anthea Monod.

Can you share a bit about your background and your current research focus?
My undergraduate degree is in pure mathematics, while my PhD is in statistics. I now work at the intersection of pure mathematics and statistics, data analysis, and machine learning. This uses aspects of my formal training and also provides plenty of opportunities to learn more and explore more ways to use pure mathematics in computation and data science, as well as use computational and data-centric approaches to questions in pure mathematics.

What inspired you to pursue this area?
During my PhD, I missed the pure mathematics that I studied in undergrad but, in undergrad, I yearned for my work to have real-world impact and significance. When I was searching for a postdoc after my PhD, I stumbled on this area by accident. I wanted to work more on the geometry of random fields, and contacted Robert Adler (Technion, Israel) for a postdoc in this area, as he is the world expert in it and literally wrote the book on it. But he informed me that his latest interests are in topological data analysis (TDA), which adapts algebraic topology to the computational setting for data analysis and statistics. Although I had hoped to spend some of our discussions on the geometry of random fields during my postdoc, it turns out that I ended up finding TDA more inspiring and motivating and have been working in that area ever since, also expanding to other intersections as well, including algebraic statistics. Algebraic statistics uses techniques from algebraic geometry to study statistical models, which I am exploring in the extension to machine learning methods and neural networks, mostly using tropical geometry, which is a piecewise linear, combinatorial, and polyhedral variant of algebraic geometry.

Which themes are you connected to within the Erlangen AI Hub?
Initially, while constructing the proposal, I felt most connected to Themes A and B (Understanding Data, Understanding Machine Learning Models). However, recent work born out of the hub framework has led to collaborations with other hub members as well as new connections to industry. So, I have also dabbled in Themes C and D (Understanding Learning, Understanding Decision-Making), especially where we have used algebraic topology to try and understand latent spaces of large language models under adversarial influence.

What attracted you to the Erlangen AI Hub and what do you hope to see it achieve?
I feel very fortunate to be one of the team who founded the idea from the moment that the EPSRC call for proposals went live, together with Jeff, Heather, Jacek, Omer, Primoz, and Michael. Jeff, Heather, Omer, Primoz, and I have primarily been working on applications of topology to data science, mostly persistent homology, and were interested in developing other areas of algebraic topology for data science, such as K-theory. Heather and I both work in algebraic statistics and wanted to push further to include algebraic geometry as well, which then grew as Omer proposed a probabilistic grounding to the research drawing on his background, Jeff suggested category theory, and Michael brought his extensive work adapting smooth and continuous geometry to machine learning. Over the lifetime of the hub, I would really like to see more ways that more “exotic” mathematics can be adapted to understand deep learning and AI better and, in that sense, establish and ground the position of mathematics in modern data processing techniques and learning theory. I would also like to see the reach of our work extend beyond academia, and I would like for concepts in pure mathematics that might have been inaccessible or intimidating so far to become more widely used, or even household concepts!

What’s been the most surprising or exciting finding in your work so far?
I think what I’ve currently been most excited about is the reach of tropical geometry into data settings that I’ve been interested in. I started off with an interest in understanding, analysing, and comparing evolutionary relationships in biology captured by phylogenetic trees. It turns out that tropical geometry is a very powerful framework for doing this, and over the past 7 years that I’ve gone in this direction, together with some wonderful collaborators that I have had the great fortune to meet, I’ve done quite a lot of work in providing guarantees for data analytic questions in tropical geometry. The more I learn about tropical geometry, the more I realise that it also has connections to other discrete mathematical objects that I am interested in, such as metric graphs, which are also very important mathematical structures for modelling urban road networks, for example. Perhaps most relevant to the hub, it turns out that tropical geometry also has connections to neural networks, which can be thought of as the engine of modern AI systems. I’ve had the great fortune of supervising wonderful PhD students in this area, one of whom is now a hub PDRA in Durham! Another is carrying forward the mission of the hub as a Digital Futures Fellow at KTH Sweden, collaborating with some other well-known researchers in both algebraic geometry and algebraic topology in machine learning, such as Kathlén Kohn (a plenary speaker at the hub’s first public conference) and Martina Scolamiero.

What challenges have you faced in your research, and how did you overcome them?
In such an interdisciplinary field, it’s always a challenge to have the expertise in all of the required areas to make a real contribution. Fortunately, I’ve built a large network of fantastic collaborators from all backgrounds with different expertise. This collaborative spirit of exchange and openness is something that we’ve built into the hub by construction. Nobody knows everything, nobody is an expert in everything, so it’s important to talk to each other and work together, which I’m happy to report has worked well for me so far in the hub, and I hope for this to grow and expand over the lifetime of the Hub.

What advice would you give to someone just starting out in your field?
The area is quite vast and fast-moving, so it can be difficult to know where to start. However, fortunately, there is also a lot of content out there: blog posts, tutorials, vlogs, as well as opportunities for in-person interactions, such as talks, workshops, and conferences. Don’t be afraid to sit in and listen to these talks, to talk to the speakers, or to reach out to researchers. Don’t be afraid of not understanding everything; hardly anyone does! Interest is what motivates you to learn, and we need to learn continuously to do the work that we do, so keep up the interest and jump on opportunities to learn more, but also make the time to get serious and do some work as well.

What’s something people might be surprised to learn about you outside of research?
I almost didn’t go to university at all. I almost went to study at a conservatoire instead, but quickly got intimidated by how difficult a career in music is, so settled for the “easier” option of mathematics!

Hub seminar series

In the latest of the hub’s seminar series, Raphaël Tinarrage of the Institute of Science and Technology Austria visited Imperial College London on 11 November to give a talk on Linear orbits of compact Lie groups and machine learning.

When a problem involves continuous symmetries, such as rotations, one naturally expects a Lie group action. In some cases, this action is linear, that is, made of rigid Euclidean motions. As a matter of fact, linear actions arise in several corners of data analysis: in image processing, where standard embeddings commute with Euclidean isometries; in equivariant neural networks, where one structurally forces linear actions or favours them via optimisation; or in physical systems, where representations are found sometimes through Noether’s theorem, and sometimes more unexpectedly.

However, most of the time, the representation is not observed directly, but only through its orbits. Recovering the underlying representation from a single orbit would not only allow one to verify the Lie linear orbit hypothesis, but also to improve existing data analysis techniques.

In his talk, Raphaël presented such an orbit-regression algorithm, developed with Henrique Ennes, PhD student at the Inria Centre at the Université Côte d’Azur. Building on previous work by Cahill, Mixon and Parshall, they tackle the problem at the level of Lie algebras, where it can be reformulated as a discrete-continuous optimization over the orthogonal group. In addition to presenting the algorithm and its theoretical guarantees, Raphaël’s talk also delved into the applications mentioned above.

City St George’s hosts special edition of hub-supported international TDA seminar

City St George’s, University of London, played host to a highly successful Topological Data Analysis seminar supported by the Erlangen AI Hub, in association with the London Mathematical Society, on 6-7 November.

The London – Oxford – Paris TDA Seminar, whose organising team included Hub Co-Director Anthea Monod and Hub Co-I Omer Bobrowski of Imperial College London (pictured left and centre above), brought together researchers from across the UK and France working in and around the field of algebraic topology, geometry and topological data analysis.

This special edition of the seminar included a number of high-profile speakers, including academics from École Polytechnique, University of Oxford, Imperial College London, King’s College London, University of Southampton, the Jussieu Institute of Mathematics, and Northeastern University London.

View more information about the event and speakers.

Hub seminar series

Xinyu Li, a new Postdoctoral Research Associate based at Oxford’s Mathematical Institute, delivered the latest in the hub’s seminar series on 23 October. The seminars provide a great opportunity for the newest members of the hub’s teams to present their research to the community.

Xinyu’s talk, entitled Markov α-Potential Games: A Framework to Study Multi-Agent Reinforcement Learning, proposed a new framework of Markov α-potential games to study Markov games. It showed that any Markov game with finite state and action spaces is a Markov α-potential game, and established the existence of an associated α-potential function. Any optimiser of an α-potential function is shown to be an α-stationary Nash equilibrium.
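Roughly, and paraphrasing the standard definition rather than the talk’s exact statement, the framework asks for a single function Φ that tracks every player’s unilateral deviations up to an error α:

```latex
% A Markov game is a Markov \alpha-potential game if there exists a
% potential function \Phi such that, for every player i, every pair of
% policies \pi_i, \pi_i' for player i, and every policy profile
% \pi_{-i} of the remaining players,
\left|\,
  \Phi(\pi_i', \pi_{-i}) - \Phi(\pi_i, \pi_{-i})
  - \bigl( V_i(\pi_i', \pi_{-i}) - V_i(\pi_i, \pi_{-i}) \bigr)
\,\right| \le \alpha,
% where V_i denotes player i's value function. Taking \alpha = 0
% recovers the classical notion of a Markov potential game.
```

The appeal of the relaxation is that many games which are not exact potential games still admit a small α, so equilibrium analysis via the single function Φ remains approximately valid.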

Xinyu studied two important classes of practically significant Markov games, Markov congestion games and perturbed Markov team games, via the framework of Markov α-potential games, with an explicit characterisation of an upper bound for α and its relation to game parameters. She also provided a semi-infinite linear programming-based formulation to obtain an upper bound for α for any Markov game. Furthermore, Xinyu studied two equilibrium approximation algorithms, namely the projected gradient-ascent algorithm and the sequential maximum improvement algorithm, along with their Nash regret analysis.

Meet the team Q&A

Our hub members have been kindly answering a set of questions so that we can share more about them and their work. We start with Hub Co-Director Jeffrey Giansiracusa of Durham University.

Can you share a bit about your background and your current research focus?
I started off as a very pure mathematician, working in topology and homotopy theory. From there I drifted towards algebraic aspects of tropical geometry, but over the past 5 years I’ve become increasingly interested in applications of topological data analysis to quantum field theory data, as well as machine learning in non-Archimedean and tropical geometry.

What inspired you to pursue this area?
By now I’ve worked in several very different areas of mathematics. In each case it was the influence of mentors and a supportive community that brought me into learning and doing new things.

Which themes are you connected to within the Erlangen AI Hub?
Theme A: Understanding Data
Theme B: Understanding Machine Learning Models

What attracted you to the Erlangen AI Hub and what do you hope to see it achieve?
As one of the architects of the hub, I was very excited about the opportunity to help develop the already impressive community of people in the UK doing topological data analysis, encouraging them to connect to ML and AI and some of the really big questions around right now.

What’s been the most surprising or exciting finding in your work so far?
Gradient descent optimisation shouldn’t work in a non-Archimedean setting, where small steps can’t add up to a big step. But we found a non-Archimedean optimisation procedure that looks a lot like gradient descent, and it does work!

What challenges have you faced in your research, and how did you overcome them?
My biggest challenge is always balancing my various projects and responsibilities, and balancing work with family commitments. I often have to leave meetings early to collect my kids from school and take them to their various activities.

What advice would you give to someone just starting out in your field?
Find the people you enjoy working with, and then work with them! Don’t waste your time working with people that you don’t like.

What’s something people might be surprised to learn about you outside of research?
My favourite person to do mathematics with is my brother.

Imperial PhD graduates secure coveted postdoctoral positions

Two Imperial PhD graduates under the tutelage of Hub Co-Director Anthea Monod have secured key postdoctoral positions at leading research institutions in Europe.

Inés Garcia-Redondo (pictured left) successfully completed her PhD and begins postdoctoral life as Senior Researcher at the AIDOS (AI for Data-Oriented Science) Lab at the University of Fribourg, led by Professor Bastian Rieck, whose work is closely aligned with the mission of the hub. Inés’ research focuses on topological data analysis, particularly in its use within machine learning systems, to investigate the mathematical foundations of AI. She said:

“I intend to continue my research at the interface of topology and geometry, and deep learning systems, which I initiated with Anthea as a student aligned to the hub. I’m very grateful and excited for the new opportunities to come, and to stay connected to the hub as well!”

Meanwhile, Yueqi Cao (pictured right) has been awarded a Digital Futures Postdoc Fellowship at the Department of Mathematics at KTH Royal Institute of Technology in Stockholm, supervised by Profs Johan Karlsson and Sandra Di Rocco. Yueqi’s research sits at the crossroads of mathematics, statistics, and machine learning. During his PhD, he developed new tools and methods to analyse metric graphs using ideas from tropical geometry and topological data analysis. Yueqi’s postdoctoral research will now see him extend his research in metric graphs, exploring new geometric and topological methods and applications in machine learning and data analysis. He said:

“I am excited to embark on a new postdoctoral position at KTH, where I look forward to further developing my research and building new collaborations, and making new connections in Europe to advance the research areas of the hub, strengthening connections between pure mathematics, computation, and machine learning.”

Congratulations to Inés and Yueqi. We wish them the best of luck in their new roles!

Conference round-up

It’s conference season and hub members have been busy presenting work across the world. Take a look at a snapshot of activity below:


Hub members took part in a fantastic two days at the UK AI Research Symposium (UKAIRS) at Northumbria University.

Congratulations to Oliver Clarke, Edward Pearce-Crump and Qiquan Wang, who presented their research during the poster sessions, and Edward who also gave a lightning talk on his research.

UKAIRS was a hugely inspiring event bringing together and consolidating the UK’s AI research community, with highly engaging talks, demos, panels, posters and keynotes across diverse disciplines, with reflections on the future of AI and emerging challenges. It was also a brilliant platform for our postdocs to showcase their research and meet peers from across the UK, facilitating connections and idea-sharing with the wider AI research community, including the other EPSRC AI hubs. Many thanks to organisers Responsible AI UK and the steering committee for their hard work putting the event together.


Several hub members attended a week-long conference in celebration of the 10-year anniversary of AATRN, the Applied Algebraic Topology Research Network, at the Institute for Mathematical and Statistical Innovation in Chicago.

Speakers included hub members Anthea Monod, Omer Bobrowski and Heather Harrington. They were accompanied by hub PhD students Arne Wolf, Inés Garcia-Redondo and David Lanners.

The event was AATRN’s first in-person meeting, bringing together researchers from mathematics, statistics, computer science, physics, biology, and beyond.


Anthea Monod was a speaker at the Graph Learning Meets Theoretical Computer Science workshop (co-chaired by Michael Bronstein) at the Simons Institute for the Theory of Computing at the University of California, Berkeley. She offered a Bootcamp on geometry and graph learning.

The workshop brought together researchers to provide a more unified perspective on graph learning within theoretical computer science.


Giuseppe De Giacomo presented three papers at the International Joint Conference on Artificial Intelligence (IJCAI) 2025 in Montreal.

Read: LTLf+ and PPLTL+: Extending LTLf and PPLTL to Infinite Traces

Read: Solving MDPs with LTLf+ and PPLTL+ Temporal Objectives

Read: Computational Grounding of Responsibility Attribution and Anticipation in LTLf


During the summer, Oliver Clarke presented his work at the SIAM (Society for Industrial and Applied Mathematics) 2025 Conference on Applied Algebraic Geometry in Madison, Wisconsin.

The SIAM Activity Group on Algebraic Geometry has a broad scope and brings together researchers using tools from commutative algebra, geometry, topology, combinatorics, and computational algebra to solve ‘applied problems’ in areas such as biology, computer vision, machine learning, robotics, and statistics. The SIAM AG conference, which takes place every two years, is a chance to see what fellow researchers are working on through a series of parallel mini-symposia and plenary talks.

Oliver presented his work-in-progress alongside Yue Ren, Jeffrey Giansiracusa, and Julio Quijas-Acaves, with a talk entitled Towards non-Archimedean Machine Learning. The project is concerned with developing machine learning tools, for instance gradient descent, over non-Archimedean fields such as the p-adics. Oliver said:

“I was delighted with the attendance for my talk, presenting to a packed seminar room, which led to fruitful conversations with experts in p-adic analysis and tropical geometry.”

The conference lasted 5 days, during which time Oliver attended around 50 talks, learning about many of the problems and techniques in applying algebraic geometry to machine learning. He added:

“It was an excellent opportunity and I’m looking forward to presenting some concrete results in the future.”


A team of researchers including Michael Bronstein won the best paper award at the ICML Generative AI and Biology (GenBio) workshop for FORT: Forward-Only Regression Training of Normalizing Flows.

Uzu Lim presented Cover Learning for Large-Scale Topology Representation at ICML. Authors of the joint paper also included Luis Scoccola and Heather Harrington, the hub’s Oxford Maths lead.

Edward Pearce-Crump (pictured above) presented his work Permutation Equivariant Neural Networks for Symmetric Tensors at ICML. Edward said:

“I’m delighted to have had the opportunity to present my work at ICML 2025 in Vancouver! The feedback I received was incredibly valuable and will guide me in my future research. It was also a pleasure to see old colleagues again and engage in thoughtful discussions about the latest advances in AI.”

Doctoral student Thiziri Nait Saada presented work supported by the hub at ICML. Mind the Gap: a Spectral Analysis of Rank Collapse and Signal Propagation in Attention Layers was authored by Thiziri alongside Alireza Naderi and Jared Tanner.


Thom Badings (pictured above) presented his work at CAV in July.

In the joint paper Policy Verification in Stochastic Dynamical Systems Using Logarithmic Neural Certificates his team developed novel techniques for the verification of neural network policies in stochastic dynamical systems. 

LOGML 2025: ‘First-class’ summer school sponsored by hub shines bright 

The Erlangen AI Hub was a ‘diamond’ sponsor of the 2025 London Geometry and Machine Learning (LOGML) Summer School at Imperial College London this year. 

Every July the summer school brings together mathematicians and computer scientists to collaborate on a range of problems at the intersection of geometry and machine learning. The week-long event features a number of group projects, each overseen by an experienced mentor, talks by leading figures in the field, a poster session, networking with industry, and social events.

As a primary sponsor, the Erlangen AI Hub enjoyed a key presence at this year’s school, with many members, hub-aligned postdocs and PhD students involved as organisers, advisors, project leaders, and participants. The organising team included incoming hub-aligned Postdoctoral Research Associate Daniel Platt and hub-aligned PhD student Arne Wolf. The scientific advisory board included Dr Anthea Monod, Prof Heather Harrington, and Prof Michael Bronstein, one of the original founders of the school during his time at Imperial.  

This year’s vibrant and fruitful event welcomed more than 100 participants from across the world, who collaborated in teams on 19 mentored projects and enjoyed a range of talks and tutorials from high-profile speakers including the hub’s Prof Coralia Cartis. It wasn’t all work, though, as attendees enjoyed a range of social activities including a welcome breakfast at the V&A Museum, a company night, bouldering, live music, and lunch at Chiswick House and Gardens.

Co-Director of the Erlangen AI Hub, Dr Anthea Monod, was a key advisor to the summer school and co-led a project with fellow hub board member Prof Omer Bobrowski. Anthea said:

“It was a fantastic, first-class summer school, and I am proud of the hard work of the organisers. I had the pleasure of leading a project using topology to study the evolution of high dimensional neural activation patterns. It was so much fun and great to catch up with people on the circuit. Huge thanks to the Erlangen AI Hub for being a diamond sponsor.”

Find out more about the LOGML Summer School at https://www.logml.ai/