Description
Generative models, in particular normalizing flows, have demonstrated exceptional performance in learning probability distributions across various domains of physics, including statistical mechanics, collider physics, and lattice field theory. In lattice field theory, normalizing flows that accurately learn the Boltzmann distribution have been applied to a wide range of tasks, such as the direct estimation of thermodynamic observables and the sampling of independent and identically distributed (i.i.d.) configurations.
In the talk by Dominic Schuh, it was shown how normalizing flows can be used to learn the Boltzmann distribution for the Hubbard model. These machine learning techniques may overcome potential ergodicity problems of state-of-the-art Hamiltonian Monte Carlo (HMC) methods.
In this poster, we explain the properties and symmetries of the probability distributions that we aim to learn using generative models. Specifically, we discuss the implementation of a symmetry-equivariant normalizing flow for the Hubbard model. Incorporating prior physics knowledge as an inductive bias substantially enhances the learning process.
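To illustrate what symmetry equivariance means for a normalizing flow, the following is a minimal sketch, not the implementation discussed in the poster. It assumes a simple global Z2 sign-flip symmetry of the field (the Hubbard model's actual symmetries, e.g. particle-hole symmetry, are richer) and shows a single affine coupling layer made equivariant by choosing an even scale network and an odd shift network, so that flow(-x) = -flow(x):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy field: N lattice sites, split into two halves for the coupling layer.
# W_s, W_t are hypothetical random weights standing in for trained networks.
N = 8
W_s = rng.normal(scale=0.1, size=(N // 2, N // 2))
W_t = rng.normal(scale=0.1, size=(N // 2, N // 2))

def scale_net(xa):
    # Even in xa (depends only on xa**2), so s(-xa) == s(xa).
    return np.tanh(xa**2 @ W_s)

def shift_net(xa):
    # Odd in xa (linear), so t(-xa) == -t(xa).
    return xa @ W_t

def coupling_layer(x):
    # Affine coupling: transform the second half conditioned on the first.
    xa, xb = x[: N // 2], x[N // 2 :]
    yb = xb * np.exp(scale_net(xa)) + shift_net(xa)
    return np.concatenate([xa, yb])

x = rng.normal(size=N)
# Equivariance check under the global sign flip x -> -x.
assert np.allclose(coupling_layer(-x), -coupling_layer(x))
```

With an even scale and an odd shift, the pushed-forward density automatically inherits the Z2 symmetry of the target Boltzmann distribution, which is the inductive-bias mechanism the abstract refers to.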