Max Welling: Exploiting Symmetries in Inference and Learning
Max Welling | Professor, University of Amsterdam

Abstract: Symmetries play a crucial role in much of mathematics and physics. In this talk I will explore the role that symmetries play in machine learning. In particular, I will discuss the concept of equivariance and apply it to neural networks. After the introduction I will discuss two newer works. First, E(n) equivariant graph neural networks, which are equivariant both to permutations of the nodes and to global E(n) transformations of the node features. These models are ideal for predicting molecular properties in biology and chemistry and for modeling objects as point clouds in computer vision. Second, I will show how invariance of probability densities results in very efficient deterministic MCMC samplers with better convergence behavior. We show that by modeling the sampler as a push-forward of the density along an ODE, and by identifying a very large set of symmetries for an arbitrary density, characterized by divergence-free vector fields, we can indeed design highly efficient samplers. Finally, we extend these ideas to discrete sampling spaces, such as the Ising model. Joint work with Kirill Neklyudov and Roberto Bondesan.

Bio: Max Welling is a computer scientist who works in artificial intelligence (expert systems, machine learning, robotics). He holds a research chair in machine learning at the University of Amsterdam; is co-founder of Scyfer BV, a university spin-off in deep learning; and has held postdoc positions at the California Institute of Technology, University College London, and the University of Toronto. Welling received his PhD in 1998 under the supervision of Nobel laureate Gerard 't Hooft. He has served on the editorial boards of JMLR and JML; was an associate editor for Neurocomputing and JCGS; and has received grants from Google, Facebook, Yahoo, NSF, NIH, NWO and ONR-MUR.
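The E(n) equivariance property mentioned in the abstract can be illustrated with a toy coordinate-update layer. This is a minimal sketch in the spirit of the E(n) Equivariant GNN paper, not the paper's exact layer: the scalar message `w` is a made-up function, but because it depends only on E(n)-invariant quantities (squared distances and feature inner products) and weights the relative vectors `x_i - x_j`, the update commutes with rotations, reflections, and translations.

```python
import numpy as np

def egnn_coord_update(x, h, step=0.1):
    # x: (n, 3) node coordinates, h: (n, d) invariant node features.
    # Each node moves along relative vectors x_i - x_j, scaled by a
    # scalar weight built only from E(n)-invariant quantities, so the
    # whole update is E(n)-equivariant by construction.
    n = x.shape[0]
    out = x.copy()
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d2 = np.sum((x[i] - x[j]) ** 2)      # invariant: squared distance
            w = np.tanh(h[i] @ h[j] - d2)        # invariant scalar "message" (toy choice)
            out[i] += step * w * (x[i] - x[j])
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))   # 5 nodes in 3D
h = rng.normal(size=(5, 4))   # invariant features

# random orthogonal transform R and translation t
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)

lhs = egnn_coord_update(x @ R.T + t, h)   # transform, then apply the layer
rhs = egnn_coord_update(x, h) @ R.T + t   # apply the layer, then transform
print(np.allclose(lhs, rhs))              # True: equivariance holds
```

The same check fails for a generic layer that mixes raw coordinates through a learned matrix, which is why EGNN-style layers restrict themselves to invariant scalars times relative vectors.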
Currently, Welling serves on the board of the NIPS Foundation and of the Data Science Research Center in Amsterdam; directs the Amsterdam Machine Learning Lab (AMLAB); and co-directs the Qualcomm-UvA deep learning lab (QUVA), the Bosch-UvA Deep Learning lab (DELTA) and the AML4Health Lab.

Papers mentioned:
- E(n) Equivariant Graph Neural Networks
- Orbital MCMC
- Probabilistic Numeric Convolutional Neural Networks
- The Hintons in your Neural Network: a Quantum Field Theory View of Deep Learning
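The deterministic-flow idea from the abstract has a familiar special case: Hamiltonian dynamics. The Hamiltonian vector field on the augmented space (position, momentum) is divergence-free, and its flow pushes the joint density exp(-H) forward onto itself. The sketch below is this classic instance, not the Orbital MCMC construction from the talk; it integrates the flow with a leapfrog step (volume-preserving) on a standard Gaussian target and checks that the energy, and hence the density, is nearly conserved along the deterministic trajectory.

```python
import numpy as np

def leapfrog(q, v, grad_logp, step=0.1, n_steps=20):
    # Hamiltonian H(q, v) = -log p(q) + |v|^2 / 2.
    # The Hamiltonian vector field is divergence-free, so its flow
    # preserves the joint density exp(-H); leapfrog approximates that
    # flow while remaining exactly volume-preserving.
    q, v = q.copy(), v.copy()
    v += 0.5 * step * grad_logp(q)          # initial half-step in momentum
    for _ in range(n_steps - 1):
        q += step * v                       # full position step
        v += step * grad_logp(q)            # full momentum step
    q += step * v
    v += 0.5 * step * grad_logp(q)          # final half-step in momentum
    return q, v

# standard Gaussian target: log p(q) = -|q|^2 / 2
grad_logp = lambda q: -q
H = lambda q, v: 0.5 * q @ q + 0.5 * v @ v

rng = np.random.default_rng(0)
q0, v0 = rng.normal(size=2), rng.normal(size=2)
q1, v1 = leapfrog(q0, v0, grad_logp)

# energy is nearly conserved along the deterministic flow
print(abs(H(q1, v1) - H(q0, v0)))
```

The talk's point is that Hamiltonian dynamics is only one member of a much larger family: any divergence-free vector field (after weighting by the density) yields a deterministic, measure-preserving flow that can be used as an MCMC proposal.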

00:00 Intro
0:20 Overview
1:09 Generative vs Discriminative AI
7:09 Symmetries & Equivariance
13:11 (Equivariant) Graph Neural Networks
27:53 Markov Chain Monte Carlo
30:11 Deterministic MCMC flows from Divergence Free Vector Fields
43:11 Quantum ML and Hinton Particles
45:44 Conclusions
48:28 Discussion