Speaker
Prof. Akio Tomiya (Tokyo Woman's Christian University)
Description
In this talk, we will discuss transformers that preserve gauge symmetry and their applications. Gauge symmetry is crucial in lattice QCD. There has been significant progress in accelerating lattice calculations using machine learning, particularly neural networks. Meanwhile, in the field of machine learning, transformers such as GPT have advanced rapidly and transformed society. Transformers excel at capturing long-range correlations and at handling data with local causal structure, such as language. In this study, we have formulated a transformer that preserves gauge symmetry. This presentation will cover the fundamental aspects of this research and the results of practical calculations.
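For context, gauge symmetry on the lattice acts on the link variables by a local rotation at each site, and a network layer "preserves gauge symmetry" when its output transforms covariantly under that rotation. A minimal statement of this condition, given here as standard lattice gauge theory background rather than as the specific construction presented in the talk:

U_\mu(n) \;\to\; \Omega(n)\, U_\mu(n)\, \Omega^\dagger(n+\hat{\mu}), \qquad \Omega(n) \in SU(3),
f\!\left(U^{\Omega}\right) = \left[f(U)\right]^{\Omega},

i.e., transforming the gauge configuration and then applying the layer f gives the same result as applying f first and then transforming its output.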
Primary authors
Prof. Akio Tomiya (Tokyo Woman's Christian University)
Prof. Hiroshi Ohno (University of Tsukuba)
Prof. Yuki Nagai (University of Tokyo)