Physical Approaches to Learning and Inference

Dr. David Schwab (hosted by Michael Tikhonov), Initiative for the Theoretical Sciences (ITS) - The Graduate Center, The City University of New York (CUNY), N.Y.
October 2, 2017 at 4:00 pm
241 Compton
Event Description 

Connections between machine learning and statistical physics have long been appreciated, but there has recently been a resurgence of work at this interface. I will discuss new ways of using the lens of physics first to understand existing techniques in deep learning and then to create novel learning methods. In the first part of the talk, I will give a physics perspective on deep learning, a popular set of machine-learning techniques whose performance on tasks such as visual object recognition rivals that of humans. I will present work relating greedy training of so-called deep belief networks, which employ layers of hidden variables, to variational real-space renormalization. This connection may help explain how deep networks automatically learn relevant features from data and extract independent factors of variation, as well as which types of data deep networks handle best. In the second part of the talk, I will apply quantum-inspired tensor networks to supervised learning. Tensor networks are efficient representations of high-dimensional tensors that have been very successful in modeling many-body physical systems. Here, we use matrix product states, a particularly well-studied class of tensor network, for classification tasks. Methods for optimizing such tensor networks can be adapted to learning problems, and we find good performance on classic datasets. I will speculate on why this method works, using a perspective from physics that suggests a natural way forward.
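To make the first ingredient concrete, here is a minimal sketch of greedy layer-wise pretraining of a deep belief network, written as a stack of Bernoulli restricted Boltzmann machines trained with one-step contrastive divergence (CD-1). The layer sizes, learning rate, and toy data below are arbitrary illustrative choices, not taken from the talk.

```python
# Minimal sketch: greedy layer-wise DBN pretraining with stacked RBMs (CD-1).
# Illustrative only; hyperparameters and data are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli restricted Boltzmann machine trained with CD-1."""
    def __init__(self, n_visible, n_hidden, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step (reconstruction).
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Contrastive-divergence parameter updates.
        batch = v0.shape[0]
        self.W   += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / batch
        self.b_v += self.lr * (v0 - pv1).mean(axis=0)
        self.b_h += self.lr * (ph0 - ph1).mean(axis=0)

def greedy_pretrain(data, layer_sizes, epochs=10):
    """Train a stack of RBMs one layer at a time (DBN pretraining)."""
    rbms, layer_input = [], data
    for n_hidden in layer_sizes:
        rbm = RBM(layer_input.shape[1], n_hidden)
        for _ in range(epochs):
            rbm.cd1_step(layer_input)
        # The trained layer's hidden activations become the next layer's data.
        layer_input = rbm.hidden_probs(layer_input)
        rbms.append(rbm)
    return rbms

# Toy usage: random binary "images", two stacked hidden layers.
X = (rng.random((256, 64)) < 0.3).astype(float)
stack = greedy_pretrain(X, layer_sizes=[32, 16])
```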
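For the second part, the sketch below shows how a matrix product state can act as a classifier: each pixel is mapped to a small local feature vector and the MPS, with one tensor carrying an extra label index, is contracted against the resulting product state to give one score per class. It assumes the pixel-wise feature map phi(x) = [cos(pi x/2), sin(pi x/2)] used in Stoudenmire and Schwab's tensor-network construction, random untrained tensors, and a fixed bond dimension; the published approach optimizes the tensors with a DMRG-style sweeping algorithm, which is omitted here.

```python
# Minimal sketch: evaluating an MPS decision function for classification.
# Tensors are random and untrained; bond dimension and sizes are placeholders.
import numpy as np

rng = np.random.default_rng(1)

def feature_map(x):
    """Map each pixel x_j in [0, 1] to a 2-component local feature vector."""
    return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=-1)

def random_mps(n_sites, bond_dim, phys_dim=2, n_labels=10, label_site=None):
    """Build an MPS whose middle tensor carries an extra label index."""
    if label_site is None:
        label_site = n_sites // 2
    tensors = []
    for j in range(n_sites):
        left = 1 if j == 0 else bond_dim
        right = 1 if j == n_sites - 1 else bond_dim
        shape = (left, phys_dim, right)
        if j == label_site:
            shape = (left, phys_dim, n_labels, right)  # extra label leg
        tensors.append(0.1 * rng.standard_normal(shape))
    return tensors, label_site

def mps_scores(tensors, label_site, phi):
    """Contract the MPS with the product-state features, left to right."""
    env = np.ones((1,))  # left boundary vector
    for j, A in enumerate(tensors):
        if j == label_site:
            # Pick up the label index at the special site.
            env = np.einsum('l,lpcr,p->cr', env, A, phi[j])
        elif env.ndim == 1:
            # Before the label site: env is a plain bond vector.
            env = np.einsum('l,lpr,p->r', env, A, phi[j])
        else:
            # After the label site: env carries the label index along.
            env = np.einsum('cl,lpr,p->cr', env, A, phi[j])
    return env[:, 0]  # one score per label

# Toy usage: a 64-pixel "image" scored against 10 labels by an untrained MPS.
x = rng.random(64)
tensors, label_site = random_mps(n_sites=64, bond_dim=8)
scores = mps_scores(tensors, label_site, feature_map(x))
print(scores.shape)  # (10,)
```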

Coffee: 3:45 pm, 241 Compton