28 July 2024 to 3 August 2024
Europe/London timezone

Equivariant Normalizing Flows for the Hubbard Model

30 Jul 2024, 17:15
1h
Poster / Algorithms and Artificial Intelligence / Poster session and reception

Speaker

Janik Kreit

Description

Generative models, in particular normalizing flows, have demonstrated exceptional performance in learning probability distributions across various domains of physics, including statistical mechanics, collider physics, and lattice field theory. In lattice field theory, normalizing flows that accurately learn the Boltzmann distribution have been applied to a wide range of tasks, such as the direct estimation of thermodynamic observables and the sampling of independent and identically distributed (i.i.d.) configurations.
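
As a rough illustration of this idea, the following sketch trains a normalizing flow to match a Boltzmann distribution p(phi) proportional to exp(-S(phi)) by minimizing the reverse Kullback-Leibler divergence, which only requires evaluating the action on samples drawn from the flow. The quartic toy_action, the single affine coupling layer, and all hyperparameters are illustrative assumptions, not the setup used for the Hubbard model.

# Minimal sketch (assumed toy setup): reverse-KL training of a normalizing
# flow against a Boltzmann distribution p(phi) ~ exp(-S(phi)).
import torch
import torch.nn as nn

def toy_action(phi):                       # phi: (batch, n_sites); illustrative quartic action
    return (0.5 * phi**2 + 0.1 * phi**4).sum(dim=1)

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: transform half the sites conditioned on the rest."""
    def __init__(self, n_sites, hidden=64):
        super().__init__()
        self.half = n_sites // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (n_sites - self.half)),
        )

    def forward(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(z1).chunk(2, dim=1)
        phi2 = z2 * torch.exp(s) + t
        log_det = s.sum(dim=1)             # log |det d(phi)/d(z)| of the coupling
        return torch.cat([z1, phi2], dim=1), log_det

n_sites = 8
flow = AffineCoupling(n_sites)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)

for step in range(200):
    z = torch.randn(256, n_sites)          # samples from the Gaussian base distribution
    log_r = -0.5 * (z**2).sum(dim=1)       # base log-density up to a constant
    phi, log_det = flow(z)
    log_q = log_r - log_det                # model log-density of the generated phi
    loss = (log_q + toy_action(phi)).mean()  # reverse KL up to the unknown log Z
    opt.zero_grad(); loss.backward(); opt.step()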

The talk by Dominic Schuh showed how normalizing flows can be used to learn the Boltzmann distribution of the Hubbard model. These machine learning techniques may overcome potential ergodicity problems of state-of-the-art Hamiltonian Monte Carlo (HMC) methods.
In this poster, we explain the properties and symmetries of the probability distributions that we aim to learn with generative models. Specifically, we discuss the implementation of a symmetry-equivariant normalizing flow for the Hubbard model. Incorporating prior physics knowledge as an inductive bias substantially enhances the learning process.
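
As a generic illustration of how a symmetry can be built into a coupling layer, the sketch below enforces equivariance under a global Z2 sign flip phi -> -phi by making the scale an even function and the shift an odd function of the conditioning sites. The Z2 symmetry, the Z2EquivariantCoupling class, and its parameters are assumptions for illustration; the concrete symmetries of the Hubbard-model distribution are the ones presented on the poster.

# Minimal sketch (assumed Z2 symmetry): a coupling layer satisfying f(-z) = -f(z),
# so the pushforward of a symmetric base density inherits the symmetry.
import torch
import torch.nn as nn

class Z2EquivariantCoupling(nn.Module):
    def __init__(self, n_sites, hidden=64):
        super().__init__()
        self.half = n_sites // 2
        out = n_sites - self.half
        self.s_net = nn.Sequential(nn.Linear(self.half, hidden), nn.Tanh(),
                                   nn.Linear(hidden, out))
        self.t_net = nn.Sequential(nn.Linear(self.half, hidden), nn.Tanh(),
                                   nn.Linear(hidden, out))

    def forward(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s = self.s_net(z1) + self.s_net(-z1)   # even in z1: s(-z1) = s(z1)
        t = self.t_net(z1) - self.t_net(-z1)   # odd in z1:  t(-z1) = -t(z1)
        phi2 = z2 * torch.exp(s) + t
        log_det = s.sum(dim=1)                 # Jacobian log-det, invariant under the flip
        return torch.cat([z1, phi2], dim=1), log_det

# Quick check of the equivariance property: flipping the input flips the output.
layer = Z2EquivariantCoupling(8)
z = torch.randn(4, 8)
phi_pos, _ = layer(z)
phi_neg, _ = layer(-z)
assert torch.allclose(phi_neg, -phi_pos, atol=1e-6)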

Primary authors

Janik Kreit
Dominic Schuh

Co-authors

Evan Berkowitz (Forschungszentrum Jülich)
Lena Funcke (University of Bonn)
Thomas Luu (Forschungszentrum Jülich / University of Bonn)
Dr Kim A. Nicoli (University of Bonn)
Marcel Rodekamp (Forschungszentrum Jülich)

Presentation materials