Modern Hopfield Networks in AI and Neurobiology
November 04, 2022, ESB 2001
Modern Hopfield Networks, or Dense Associative Memories, are recurrent neural networks with fixed-point attractor states that are described by an energy function. In contrast to conventional Hopfield Networks, their modern versions have a very large memory storage capacity, which makes them appealing tools for many problems in machine learning, cognitive science, and neuroscience. In this talk I will introduce the intuition behind this class of models and their mathematical formulation, and will give examples of problems in AI that can be tackled using these new ideas. I will also explain how different individual models of this class (e.g. hierarchical memories, the attention mechanism in transformers, etc.) arise from their general mathematical formulation in terms of Lagrangian functions.
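To make the abstract concrete, here is a minimal sketch of the retrieval dynamics of a modern Hopfield network, following the softmax-based update that connects these models to transformer attention. The pattern count, dimension, and inverse-temperature parameter `beta` are illustrative choices, not values from the talk; with a sufficiently large `beta`, a single update of a corrupted cue snaps back to the nearest stored memory.

```python
import numpy as np

def softmax(z):
    z = z - z.max()            # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(patterns, query, beta=8.0):
    """One retrieval step of a modern (dense) Hopfield network.

    patterns: (N, d) array of stored memories
    query:    (d,) partial or noisy cue
    beta:     inverse temperature; larger values sharpen retrieval
    """
    scores = beta * patterns @ query      # similarity of the cue to each memory
    return patterns.T @ softmax(scores)   # softmax-weighted recall of memories

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(5, 64))  # 5 random binary memories
noisy = patterns[2].copy()
noisy[:16] *= -1                                  # corrupt a quarter of the bits
retrieved = hopfield_update(patterns, noisy)
overlap = np.sign(retrieved) @ patterns[2] / 64   # agreement with the true memory
print(overlap)
```

One update already recovers the stored pattern almost exactly, in contrast to the classical Hopfield rule, whose capacity is limited to roughly 0.14 patterns per neuron.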
Dmitry Krotov is a physicist working on neural networks and machine learning. He is a member of the research staff at the MIT-IBM Watson AI Lab and IBM Research in Cambridge, MA. Prior to this, he was a member of the Institute for Advanced Study in Princeton. Broadly defined, his research focuses on the computational properties of neural networks. In particular, he is interested in implementing ideas coming from neuroscience and physics in modern AI systems. He received a PhD in Physics from Princeton University in 2014.