28 July 2024 to 3 August 2024
Europe/London timezone

Random Matrix Theory for Stochastic Gradient Descent

31 Jul 2024, 11:15
20m
Talk: Algorithms and Artificial Intelligence

Speaker

Chanju Park (Swansea University)

Description

Investigating the dynamics of learning in machine learning algorithms is of paramount importance for understanding how and why an approach may be successful. The tools of physics and statistics provide a robust setting for such investigations. Here we apply concepts from random matrix theory to describe stochastic weight matrix dynamics, using the framework of Dyson Brownian motion. We derive the linear scaling rule between the step size of the optimisation and the batch size, and identify universal and non-universal aspects of the weight matrix dynamics. We test these predictions in the (near-)solvable case of the Gaussian Restricted Boltzmann Machine, explicitly observing the Wigner surmise, the Wigner semi-circle, and the linear scaling rule.
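The linear scaling rule mentioned above can be motivated by a standard diffusion heuristic, consistent with the abstract's framing; the notation (step size $\varepsilon$, batch size $B$, gradient-noise scale $\sigma^2$) is ours and purely illustrative:

\[
  w_{t+1} = w_t - \varepsilon\,\hat g_t,
  \qquad
  \hat g_t = g_t + \eta_t,
  \qquad
  \operatorname{Var}(\eta_t) \propto \frac{\sigma^2}{B}.
\]

The drift per update is $O(\varepsilon)$, while the noise variance per update is $O(\varepsilon^2/B)$. Reading the update as a discretisation of a stochastic process with time step $\Delta t \propto \varepsilon$ gives a diffusion coefficient $\propto \varepsilon\sigma^2/B$, so keeping the stochastic dynamics fixed requires $\varepsilon \propto B$: the linear scaling rule.

As a minimal numerical sketch of the random matrix statistics the abstract refers to (our illustration, not the authors' code, assuming only numpy; the matrix size and number of samples are arbitrary choices), one can draw Gaussian Orthogonal Ensemble matrices and compare their eigenvalue density to the Wigner semi-circle and their level spacings to the Wigner surmise:

import numpy as np

# Hypothetical sketch: pure GOE matrices stand in for the stochastic part of
# the weight-matrix dynamics in the Dyson Brownian motion picture.
rng = np.random.default_rng(0)
N = 200           # matrix size (illustrative choice)
n_samples = 100   # number of GOE draws (illustrative choice)

eigs_all, spacings = [], []
for _ in range(n_samples):
    A = rng.normal(size=(N, N))
    H = (A + A.T) / np.sqrt(2.0)              # GOE: off-diagonal variance 1
    ev = np.linalg.eigvalsh(H) / np.sqrt(N)   # rescaled so the support is [-2, 2]
    eigs_all.append(ev)
    bulk = ev[N // 4 : 3 * N // 4]            # central bulk of the spectrum
    s = np.diff(bulk)
    spacings.append(s / s.mean())             # crude unfolding: unit mean spacing

ev_all = np.concatenate(eigs_all)
spacings = np.concatenate(spacings)

# Reference curves: Wigner semi-circle rho(x) and GOE Wigner surmise P(s)
def rho(x):
    return np.sqrt(np.clip(4.0 - x**2, 0.0, None)) / (2.0 * np.pi)

def P(s):
    return (np.pi / 2.0) * s * np.exp(-np.pi * s**2 / 4.0)

# Text comparison of the empirical histograms against the two predictions
hist, edges = np.histogram(ev_all, bins=15, range=(-2.0, 2.0), density=True)
for x, h in zip(0.5 * (edges[:-1] + edges[1:]), hist):
    print(f"x = {x:+5.2f}  density = {h:5.3f}  semi-circle = {rho(x):5.3f}")

hist, edges = np.histogram(spacings, bins=15, range=(0.0, 3.0), density=True)
for m, h in zip(0.5 * (edges[:-1] + edges[1:]), hist):
    print(f"s = {m:4.2f}  density = {h:5.3f}  surmise = {P(m):5.3f}")

The agreement of both histograms with the analytic curves illustrates the universal behaviour the abstract refers to; in the talk's setting the corresponding statistics come from the weight matrices of the Gaussian Restricted Boltzmann Machine rather than pure GOE draws.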

Primary authors

Biagio Lucini (Swansea University)
Chanju Park (Swansea University)
Gert Aarts (Swansea University)
