Dense Associative Memories and Deep Learning

Dense Associative Memories are generalizations of Hopfield networks to higher-order (higher than quadratic) interactions between the spins/neurons. I will describe a relationship between these models and the neural networks commonly used in deep learning. From the perspective of associative memory, such models deserve attention because they can store a much larger number of memories than the quadratic case. From the perspective of deep learning, they make it possible to control the kind of representation that a neural network learns from a given dataset: small powers of the interaction vertex correspond to feature-based representations, while large powers correspond to prototype-based representations. These Dense Associative Memories can also be driven by images processed with the convolutional neural networks commonly used in image analysis. I will discuss the potential of this idea for mitigating the problem of adversarial images (very small changes to an input image that lead to a gross misclassification) in computer vision.
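The higher-order energy the abstract refers to can be sketched concretely. In the Dense Associative Memory formulation, the energy is E = -Σ_μ F(ξ^μ · σ) with a polynomial interaction F(x) = x^n; n = 2 recovers the classical Hopfield network, and larger n increases storage capacity. The following is a minimal illustrative sketch of asynchronous recall under this energy, not the speaker's implementation; the function name `dense_recall` and all parameter choices are assumptions for illustration.

```python
import numpy as np

def dense_recall(sigma, xi, n=3, sweeps=5):
    """Asynchronous recall in a Dense Associative Memory (illustrative sketch).

    Energy: E = -sum_mu F(xi^mu . sigma), with F(x) = x**n.
    n = 2 is the classical (quadratic) Hopfield net; larger n
    corresponds to the higher-order models in the talk.

    sigma : (N,) array of +/-1 spins (network state)
    xi    : (K, N) array of +/-1 stored patterns
    """
    sigma = sigma.astype(float)
    overlaps = xi @ sigma  # (K,) overlap of the state with each pattern
    for _ in range(sweeps):
        for i in range(sigma.size):
            # Overlaps with spin i removed from the state
            rest = overlaps - xi[:, i] * sigma[i]
            # Set spin i to whichever sign gives the lower energy
            up = np.sum((rest + xi[:, i]) ** n)
            down = np.sum((rest - xi[:, i]) ** n)
            new = 1.0 if up >= down else -1.0
            overlaps = rest + xi[:, i] * new
            sigma[i] = new
    return sigma
```

For example, with n = 3 a network of 100 neurons storing a handful of random patterns cleans up a stored pattern corrupted by a few flipped bits; in this framework, small n yields feature-like recall and large n pushes the dynamics toward prototype-like recall, as described in the abstract.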

Speaker Details

Dmitry Krotov is a member of the research staff at the Institute for Advanced Study in Princeton, NJ. His work focuses on the computational properties of neural networks. The central theme that runs through his research is the relationship between models of associative memory (also known as Hopfield nets) and neural networks used in deep learning. He received a PhD in Physics from Princeton University in 2014.

Date:
Speakers:
Dmitry Krotov
Affiliation:
Institute for Advanced Study