On 11 February, we will have Joey Bose from Imperial College London.
If you cannot attend in person, please register via the TicketSource link provided, and you will receive the link to join the Teams session.
Title: Flow Maps and Normalizing Flows for Accelerated Generative Modelling of Molecules
Abstract:
This talk will be broken into two distinct parts. The first part will focus on an emerging class of efficient generative models known as flow maps, which achieve state-of-the-art performance on few-step generation at a fraction of the inference cost of conventional diffusion and flow-matching models. We will recap the theory of flow maps from first principles and show how they can be adapted to training all-atom protein generative models with a novel denoiser-based formulation of the Lagrangian flow map. This formulation will illuminate how existing best practices from diffusion models, such as using the Kabsch algorithm for alignment, can be seamlessly adopted for flow maps as well.
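As a point of reference for the alignment step mentioned above, here is a minimal NumPy sketch of the Kabsch algorithm, which finds the proper rotation that optimally superimposes one point cloud onto another in the least-squares sense. The function name and interface are illustrative and not taken from the speaker's code:

    import numpy as np

    def kabsch_align(P, Q):
        # Align point cloud P (N x 3) onto Q (N x 3), minimising RMSD.
        p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
        Pc, Qc = P - p_mean, Q - q_mean
        # Cross-covariance matrix and its singular value decomposition.
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        # Flip the last axis if needed so R is a rotation, not a reflection.
        d = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        # Rotate the centred P and translate it onto Q's centroid.
        return Pc @ R.T + q_mean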
The second part of the talk will focus on the pain points of existing generative models when applied to modelling molecular systems. In particular, we will discuss their application to Boltzmann sampling under the framework of Boltzmann Generators (BGs), which pair an exact-likelihood generative model trained on biased data with a subsequent importance sampling step to draw statistically independent and consistent samples from the target Boltzmann distribution. To improve the scalability of BGs, we will revisit classical normalizing flows, which offer efficient sampling and likelihoods but whose training via maximum likelihood is often unstable and computationally challenging. We will introduce Regression Training of Normalizing Flows (RegFlow), a novel and scalable regression-based training objective that bypasses the numerical instability and computational burden of conventional maximum likelihood training in favour of a simple ℓ2-regression objective. Specifically, RegFlow maps prior samples under the flow to targets computed using optimal transport couplings or a pre-trained continuous normalizing flow (CNF). To enhance numerical stability, RegFlow employs effective regularization strategies, such as a new forward-backward self-consistency loss that is painless to implement. Empirically, we demonstrate that RegFlow unlocks a broader class of architectures that were previously intractable to train as BGs with maximum likelihood. We also show that RegFlow surpasses maximum likelihood training in performance, stability, and computational cost for equilibrium sampling in Cartesian coordinates of alanine dipeptide, tripeptide, and tetrapeptide, showcasing its potential for molecular systems.
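To make the regression idea concrete, below is a schematic PyTorch sketch of what an ℓ2-regression training objective in the spirit of RegFlow could look like. The flow interface (flow and flow.inverse), the pairing of z with x_target, and the exact form of the self-consistency term are assumptions for illustration, not the speaker's implementation:

    import torch

    def regression_loss(flow, z, x_target, lam=0.1):
        # Schematic l2-regression objective: push prior samples z through
        # the flow and regress the output onto paired targets x_target
        # (e.g. obtained from an OT coupling or a pre-trained CNF).
        x_pred = flow(z)
        loss = ((x_pred - x_target) ** 2).mean()
        # Illustrative forward-backward self-consistency regulariser:
        # inverting the prediction should recover the latent sample.
        z_rec = flow.inverse(x_pred)
        return loss + lam * ((z_rec - z) ** 2).mean()

In a Boltzmann-generator pipeline, samples drawn from the trained flow would then be reweighted by importance weights proportional to exp(-U(x)/kT) / q(x), where q is the model's exact likelihood, to obtain consistent estimates under the target Boltzmann distribution.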
Bio:
Joey Bose is an Assistant Professor of Computing at Imperial College London, an ELLIS member, and an Affiliate member of Mila. Previously, he was a Postdoctoral Fellow at the University of Oxford working with Michael Bronstein. He completed his PhD at McGill/Mila under the supervision of Will Hamilton, Gauthier Gidel, and Prakash Panangaden. His research interests span Generative Modelling and Differential Geometry for Machine Learning, with a current emphasis on geometric generative models for scientific applications. He completed his Bachelor's and Master's degrees at the University of Toronto, working on adversarial attacks against face detection, and is the President and CEO of FaceShield Inc., an educational platform for digital privacy for facial data. His work has been featured in Forbes, The New York Times, CBC, VentureBeat, and other media outlets, and has been generously supported by an IVADO PhD Fellowship and an NSERC Postdoctoral Fellowship.