Machine-Learning-Based Sampling in Lattice Field Theory and Quantum Chemistry
From Monday, 21 October 2024 (08:00) to Friday, 25 October 2024 (17:00)
Monday, 21 October 2024
08:45
Registration and Coffee
08:45 - 09:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
09:40
Welcome Remarks
09:40 - 10:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
10:00
Normalizing flows for lattice gauge theory in 3+1D - Gurtej Kanwar
10:00 - 10:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Normalizing flows have emerged over the past several years as a promising avenue for accelerating Monte Carlo sampling in a variety of lattice field theories. This talk will discuss recent progress and ongoing efforts in applying normalizing flows based on gauge-equivariant coupling layers to lattice gauge theories in 3+1D.
10:40
Coffee Break
10:40 - 11:10
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
11:10
Practical applications of machine-learned flows on gauge fields - Daniel Hackett
11:10 - 11:50
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Normalizing flows are machine-learned maps between different lattice theories that can be used as components in exact sampling and inference schemes. Ongoing work yields increasingly expressive flows on gauge fields, but it remains an open question how flows can improve lattice QCD at state-of-the-art scales. I discuss progress on two strategies to employ flows for computational advantage. The first applies flows in replica exchange (parallel tempering) sampling, aimed at improving topological mixing, and is viable with iterative improvements upon presently available flows. The second uses flows to improve the signal-to-noise ratio in Feynman-Hellmann calculations and related approaches.
11:50
Stochastic Normalizing Flows for lattice gauge theory - Alessandro Nada
11:50 - 12:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Non-equilibrium Markov Chain Monte Carlo simulations based on Jarzynski's equality are a powerful method to compute free-energy differences and to sample from a target probability distribution without the need to thermalize the system under study. When the target distribution suffers from long autocorrelation times, they are a promising candidate for mitigating critical slowing down. These out-of-equilibrium simulations can be naturally combined with Normalizing Flows into a recently developed architecture called Stochastic Normalizing Flows (SNFs). In this talk we first outline our implementation of SNFs in the four-dimensional $SU(3)$ lattice gauge theory and then focus on their promising scaling with the volume, both in terms of training and sampling. We discuss future systematic improvements and how a mitigation of topological freezing in large-volume simulations of lattice gauge theories can realistically be achieved in the short term.
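As a toy illustration of the Jarzynski-style reweighting that such non-equilibrium simulations rely on (a 1-D Gaussian example, not the $SU(3)$ implementation discussed in the talk; the annealing schedule, step counts, and proposal width are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Anneal from a broad prior N(0, 4) to a target N(2, 1).
def log_p(x, t):
    # Linear interpolation of the (unnormalized) log-density in t in [0, 1].
    lp0 = -0.5 * x**2 / 4.0          # prior
    lp1 = -0.5 * (x - 2.0)**2        # target
    return (1 - t) * lp0 + t * lp1

n_steps, n_samples = 50, 5000
x = rng.normal(0.0, 2.0, n_samples)  # exact draws from the prior
log_w = np.zeros(n_samples)          # Jarzynski "work" accumulators

for k in range(1, n_steps + 1):
    t_prev, t = (k - 1) / n_steps, k / n_steps
    # Accumulate the work done by switching the protocol parameter.
    log_w += log_p(x, t) - log_p(x, t_prev)
    # Partial equilibration at the new t: one vectorized Metropolis step.
    prop = x + rng.normal(0.0, 0.5, n_samples)
    accept = np.log(rng.random(n_samples)) < log_p(prop, t) - log_p(x, t)
    x = np.where(accept, prop, x)

# Reweighted estimate of the target mean, E[x] = 2.
w = np.exp(log_w - log_w.max())
mean_est = np.sum(w * x) / np.sum(w)
```

The samples never need to reach equilibrium at any intermediate step; the accumulated weights correct for the remaining bias exactly, which is the property the talk's method exploits at scale.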
12:30
Lunch (On your own)
12:30 - 14:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:00
Coffee Break
14:00 - 14:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:30
Poster Flashtalks (1 min each)
14:30 - 15:45
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
There will also be a few words from our sponsors: Prof. Dr. Carsten Urbach (NuMeriQS), Prof. Dr. Connie Lu and Prof. Dr. Sebastian Neubert (TRA).
15:45
Coffee Break
15:45 - 16:15
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
16:15
Poster Session
16:15 - 19:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
17:00
Reception
17:00 - 19:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Tuesday, 22 October 2024
08:45
Registration and Coffee
08:45 - 09:20
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
09:20
Keynote: Machine Learning and Physics - Pan Kessel
09:20 - 10:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
In this talk I will give an introduction to generative models for lattice field theory. Following the spirit of the workshop, I will try to give an accessible review of the basics of lattice field theory and discuss its close analogies to large molecule generation. I will also summarize the challenges of the field and the ongoing efforts to overcome them.
10:40
Coffee Break
10:40 - 11:10
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
11:10
A dynamical systems perspective on measure transport and generative modeling - Lorenz Richter
11:10 - 11:50
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Generative modeling via measure transport can be effectively understood through the lens of dynamical systems that describe the evolution from a prior to a prescribed target measure. Specifically, this involves deterministic or stochastic evolutions described by ODEs or SDEs, respectively, which are learned such that the respective process is distributed according to the target measure at terminal time. In this talk, we show that this principled framework naturally leads to underlying PDEs connected to the density evolution of the processes. On the computational side, those PDEs can then be tackled with variational methods such as BSDEs or PINNs. Using the former, we can draw connections to optimal control theory and recover trajectory-based sampling methods, such as diffusion models or Schrödinger bridges, without relying on the concept of time reversal. PINNs, on the other hand, offer the appealing numerical property that no trajectories need to be simulated and no time discretization has to be considered, leading to efficient training and better mode coverage in the sampling task. We investigate different learning strategies (admitting either unique or infinitely many solutions) on multiple high-dimensional multimodal examples.
11:50
NETS: A non-equilibrium transport sampler - Michael Albergo
11:50 - 12:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
We propose an algorithm, termed the Non-Equilibrium Transport Sampler (NETS), to sample from unnormalized probability distributions. NETS can be viewed as a variant of annealed importance sampling (AIS) based on Jarzynski's equality, in which the stochastic differential equation used to perform the non-equilibrium sampling is augmented with an additional learned drift term that lowers the impact of the unbiasing weights used in AIS. We show that this drift is the minimizer of a variety of objective functions, which can all be estimated in an unbiased fashion without backpropagating through solutions of the stochastic differential equations governing the sampling. We also prove that some of these objectives control the Kullback-Leibler divergence of the estimated distribution from its target. NETS is shown to be unbiased and, in addition, has a tunable diffusion coefficient which can be adjusted post-training to maximize the effective sample size. We demonstrate the efficacy of the method on standard benchmarks, high-dimensional Gaussian mixture distributions, and a model from statistical lattice field theory, for which it surpasses the performance of related work and existing baselines.
12:30
Lunch (On your own)
12:30 - 14:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:00
Coffee Break
14:00 - 14:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:30
Panel Discussion: Research in Industry and Academia
14:30 - 15:45
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Participants: Jan Gerken (Chalmers University), Pan Kessel (Prescient Design), Lorenz Richter (Zuse Institute, dida)
15:45
Coffee Break
15:45 - 16:15
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
16:15
TRA Colloquium: Symmetries in AI4Science - Jan Gerken
16:15 - 17:00
Room: Kreuzbergweg 24/0.052 (FTD) - Präsentationsraum
Symmetries are of fundamental importance in all of science and therefore critical for the success of deep learning systems used in this domain. In this talk, I will give an overview of the different forms in which symmetries appear in physics and chemistry and explain the theoretical background behind equivariant neural networks. Then, I will discuss common ways of constructing equivariant networks in different settings and contrast manifestly equivariant networks with other techniques for reaching equivariant models. Finally, I will report on recent results about the symmetry properties of deep ensembles trained with data augmentation.
19:00
Dinner - 60 seconds to Napoli
19:00 - 00:00
Wednesday, 23 October 2024
08:45
Registration and Coffee
08:45 - 09:20
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
09:20
Neural Thermodynamic Integration: Free Energies from Energy-based Diffusion Models - Tristan Bereau
09:20 - 10:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Thermodynamic integration (TI) offers a rigorous method for estimating free-energy differences by integrating over a sequence of interpolating conformational ensembles. However, TI calculations are computationally expensive and typically limited to coupling a small number of degrees of freedom due to the need to sample numerous intermediate ensembles with sufficient conformational-space overlap. In this work, we propose to perform TI along an alchemical pathway represented by a trainable neural network, which we term Neural TI. Critically, we parametrize a time-dependent Hamiltonian interpolating between the interacting and non-interacting systems, and optimize its gradient using a denoising-diffusion objective. The ability of the resulting energy-based diffusion model to sample all intermediate ensembles allows us to perform TI from a single reference calculation. We apply our method to Lennard-Jones fluids, where we report accurate calculations of the excess chemical potential, demonstrating that Neural TI is capable of coupling hundreds of degrees of freedom at once.
10:00
Thermodynamic Integration along interpolant ML-potentials - Balint Mate
10:00 - 10:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
We propose to machine learn an interpolating family of potential functions between two Hamiltonians and to perform thermodynamic integration along it. The approach requires the functional forms of the Hamiltonians and samples from the corresponding Boltzmann distributions. We then define an interpolation between the distributions at the level of samples and optimize a neural network potential to match the corresponding equilibrium potential, i.e. the negative log-likelihood, at every intermediate time-step. Once the alignment between the interpolating samples and potentials is sufficiently accurate, the free energy difference between the two Hamiltonians can be estimated using (neural) thermodynamic integration. We experimentally validate the proposal on the estimation of solvation free energies.
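The thermodynamic-integration identity underlying both of this morning's talks, $\Delta F = \int_0^1 \langle \partial U/\partial\lambda \rangle_\lambda\, d\lambda$, can be sketched on a pair of harmonic potentials where the answer is known in closed form (a hypothetical toy, with exact Gaussian sampling standing in for the learned interpolant and its equilibrium samples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy TI between harmonic potentials U_k(x) = k x^2 / 2 at beta = 1,
# with the stiffness interpolated linearly: k(lam) = (1 - lam)*k0 + lam*k1.
# Exact answer: Delta F = 0.5 * ln(k1 / k0).
k0, k1 = 1.0, 4.0

lams = np.linspace(0.0, 1.0, 21)
means = np.empty_like(lams)
for i, lam in enumerate(lams):
    k = (1 - lam) * k0 + lam * k1
    # Equilibrium samples at this lambda (exact draws here; MCMC in general).
    x = rng.normal(0.0, np.sqrt(1.0 / k), 100_000)
    means[i] = (0.5 * (k1 - k0) * x**2).mean()   # <dU/dlam> at this lambda

# Trapezoidal quadrature over lambda gives the free-energy difference.
delta_F = np.sum(0.5 * (means[1:] + means[:-1]) * np.diff(lams))
```

The quadrature only needs the average of $\partial U/\partial\lambda$ at each intermediate ensemble, which is why the quality of the interpolating potential, rather than overlap between the endpoints, controls the accuracy.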
10:40
Coffee Break
10:40 - 11:10
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
11:10
Importance weights distribution in Neural Samplers - Piotr Bialas
11:10 - 11:50
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Neural samplers in general generate configurations from a distribution that only approximates the desired target distribution. The importance weights, defined as the ratio of the target probability of a sample to its actual probability under the sampler, account for this discrepancy and permit correcting the distribution, either by reweighting the samples or by accepting/rejecting them with the Metropolis algorithm. Ideally, we would like these weights to be distributed around one with a small variance. It turns out, however, that for a poorly trained sampler this distribution can be long-tailed, even to the point of having infinite variance. In my talk I will discuss possible causes of this behavior and its connection with mode collapse.
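The weight mechanics described in the abstract can be made concrete with a deliberately mismatched Gaussian pair standing in for the trained sampler and the target (purely illustrative; the distributions are chosen so the weights have finite variance):

```python
import numpy as np

rng = np.random.default_rng(0)

# Model q: N(0, 1.5^2) standing in for a trained sampler; target p: N(1, 1).
x = rng.normal(0.0, 1.5, 10_000)
log_q = -0.5 * (x / 1.5)**2 - np.log(1.5) - 0.5 * np.log(2 * np.pi)
log_p = -0.5 * (x - 1.0)**2 - 0.5 * np.log(2 * np.pi)

log_w = log_p - log_q                  # importance weights w = p / q
w = np.exp(log_w - log_w.max())

# (1) Reweighting: corrected expectation value, recovering the target mean 1.
mean_x = np.sum(w * x) / np.sum(w)

# (2) Effective sample size: collapses when the weights are long-tailed.
ess = np.sum(w)**2 / np.sum(w**2)

# (3) Independent Metropolis: accept proposal i over the current state with
# probability min(1, w[i] / w[cur]).
chain, cur = [], 0
for i in range(1, len(x)):
    if np.log(rng.random()) < log_w[i] - log_w[cur]:
        cur = i
    chain.append(x[cur])
chain = np.array(chain)
```

With a badly mismatched `q` (e.g. one that misses a mode entirely), the same `ess` diagnostic drops toward 1, which is the long-tail pathology the talk examines.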
11:50
Continuous Normalizing Flows in Lattice Gauge Theories - Simone Bacchio
11:50 - 12:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
We have developed continuous normalizing flows for lattice gauge theories, building on Lüscher's perturbative framework of gradient flow and enhancing it through machine learning techniques. This approach offers a highly efficient means of trivializing the theory, as demonstrated in the 2-dimensional SU(3) gauge theory, where our method successfully trivializes the system using only a few hundred parameters. However, trivializing the 4-dimensional theory presents greater challenges due to critical slowing down at large values of $\beta$ and the inherently non-perturbative nature of strong interactions. While full trivialization of the 4-dimensional theory remains out of reach, I will present a promising application of our machine-learned gradient flow: the computation of gradients of physical observables with respect to action parameters. This method circumvents traditional difficulties and demonstrates the potential of machine learning to advance the study of lattice gauge theories, even in complex, higher-dimensional settings.
12:30
Lunch (On your own)
12:30 - 14:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:00
Coffee Break
14:00 - 14:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:30
Social Activity (TBD)
14:30 - 20:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Thursday, 24 October 2024
08:45
Registration and Coffee
08:45 - 09:20
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
09:20
Keynote: Machine Learning and Generative Models - Michael Albergo
09:20 - 10:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
TBD
10:40
Coffee Break
10:40 - 11:10
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
11:10
Continuous normalizing flows for lattice gauge theories - Pim de Haan
11:10 - 11:50
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Continuous normalizing flows are known to be highly expressive and flexible, which allows for the easier incorporation of large symmetry groups and makes them a powerful tool for sampling in lattice field theories. Building on previous work, I will present a general continuous normalizing flow architecture for matrix Lie groups that is equivariant under group transformations. Applying it to lattice gauge theories in two dimensions as a proof-of-principle experiment, I will show that it achieves competitive performance, making it a promising component in the toolbox for future lattice sampling tasks.
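The density bookkeeping that continuous normalizing flows rely on, $d\log p/dt = -\nabla\cdot v$ along trajectories, can be verified in one dimension with a linear vector field whose pushforward is known exactly (an illustrative sketch, unrelated to the gauge-equivariant construction of the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D continuous flow with linear vector field v(x) = a * x, run to time T.
# Instantaneous change of variables: d(log p)/dt = -div(v) = -a.
a, T, n = 0.5, 1.0, 5000

z = rng.normal(0.0, 1.0, n)
log_q = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)   # base density N(0, 1)

steps = 100
dt = T / steps
x = z.copy()
for _ in range(steps):
    x = x + dt * a * x          # explicit Euler step of the ODE
    log_q = log_q - dt * a      # accumulate -div(v) * dt

# The exact pushforward is N(0, e^{2aT}); compare model log-density to truth.
sigma = np.exp(a * T)
log_p_exact = -0.5 * (x / sigma)**2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
max_err = np.max(np.abs(log_q - log_p_exact))   # small Euler discretization error
```

For matrix Lie groups the same divergence accumulation runs over the group manifold instead of $\mathbb{R}^n$, which is where the architecture discussed in the talk does its work.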
11:50
Hands on NeuLat: a toolbox for neural samplers in lattice field theory - Christopher Anders
11:50 - 12:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Normalizing flows have increasingly gained attention as a promising choice for sampling in lattice field theory. However, there has been a lack of software packages for applying powerful generative machine-learning models such as normalizing flows specifically to lattice field theory. To fill this gap, we present NeuLat: a fully customizable software package that allows researchers in lattice field theory to harness recent advances in deep generative learning. In this hands-on session, we explore how NeuLat can considerably simplify the application and benchmarking of deep generative models for lattice quantum field theory.
12:30
Lunch (On your own)
12:30 - 14:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:00
Coffee Break
14:00 - 14:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:30
Panel Discussion: Machine Learning and Physics
14:30 - 15:45
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Participants: Piotr Bialas (Jagiellonian University), Gurtej Kanwar (University of Edinburgh), Alessandro Nada (University of Turin)
15:45
Coffee Break
15:45 - 16:15
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
16:15
Bethe Colloquium: Generative flow models for lattice field theory - Gurtej Kanwar
16:15 - 17:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Lattice field theory simulations provide a non-perturbative understanding of strongly interacting field theories, such as quantum chromodynamics (QCD), and for example play an important role in providing inputs for a variety of high-energy experiments. State-of-the-art lattice calculations are, however, limited by the large computational cost of Monte Carlo simulation. Recently, significant progress has been made in applying a class of generative machine learning "flow models" to combat this issue. These generative samplers enable promising practical improvements in Monte Carlo sampling, such as fully parallelized configuration generation. In this talk, I will discuss the progress towards this goal and future prospects of the method.
Friday, 25 October 2024
08:45
Registration and Coffee
08:45 - 09:20
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
09:20
Keynote: Machine Learning and Chemistry - Jonas Köhler
09:20 - 10:40
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
TBD
10:40
Coffee Break
10:40 - 11:10
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
11:10
Timewarp: Transferable Acceleration of Molecular Dynamics by Learning Time-Coarsened Dynamics - Leon Klein
11:10 - 11:50
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Timewarp is an enhanced-sampling method that uses a normalizing flow as the proposal distribution in a Markov chain Monte Carlo scheme targeting the Boltzmann distribution. The flow is trained offline on molecular dynamics trajectories and learns to make large steps in time (up to one nanosecond). Crucially, Timewarp is transferable between molecular systems: once trained, it generalizes to unseen small peptides (2-4 amino acids) at all-atom resolution, exploring their metastable states and providing wall-clock acceleration of sampling compared to standard molecular dynamics.
11:50
Modeling Biomolecular Structures through Generative Models - Gabriele Corso
11:50 - 12:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Predicting the binding structure of a small-molecule ligand to a protein, a task known as molecular docking, is critical to biological research and drug design. Framing molecular docking as a generative modeling problem, we developed DiffDock, a diffusion model over the non-Euclidean manifold of ligand poses. Empirically, DiffDock was the first ML model to outperform traditional docking methods on blind docking, and it has since been integrated into the drug discovery pipelines of many pharma and biotech companies. In this talk I will give an overview of DiffDock as well as recent improvements to its generalization across the proteome and its ability to model protein flexibility.
12:30
Lunch (On your own)
12:30 - 14:00
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:00
Coffee Break
14:00 - 14:30
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
14:30
Panel Discussion: Machine Learning and Chemistry
14:30 - 15:45
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
Participants: Gabriele Corso (MIT), Jonas Köhler (cusp.ai), Lei Wang (Chinese Academy of Sciences)
15:45
Coffee Break
15:45 - 16:15
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn
16:15
Closing Remarks
16:15 - 16:25
Room: Wegelerstr. 10 - Seminar Room 2.019 - 53115 Bonn