New tricks for old problems: using maximum entropy models to study hippocampal memory
Starts 3 May 2016 14:00
Ends 3 May 2016 15:00
Central European Time
Central Area, Second floor, ex SISSA building
Understanding how memory and learning happen at the level of neural circuits is one of the fundamental questions in neuroscience, with broad implications ranging from education to neuroprosthetics and brain-machine interfaces. The contribution of the hippocampus to the formation of new memories and to memory recall has been investigated extensively, both experimentally and in theoretical models. Nonetheless, putting together the pieces of the memory puzzle remains a challenge.
During the talk I'll use two examples from my recent work to illustrate how ideas from machine learning and statistical physics, in particular maximum entropy models, can provide novel insights into the contributions of the hippocampus to learning and memory. First, I'll discuss a new approach for characterising the statistical structure of neural activity and its changes during learning. Second, I'll introduce a top-down framework for studying memory recall at the level of neural circuits, which provides computationally well-founded links across scales: from single synapses, to the integration of signals in the dendritic arbor, to circuit dynamics, to the system-level architecture of the brain, and to behavior.
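To give a flavour of the maximum entropy approach mentioned in the abstract, the sketch below fits a pairwise maximum entropy (Ising-style) model to binary spike patterns. This is a generic illustration, not the speaker's actual method: the synthetic data, the population size (small enough for exact enumeration of all states), and the fitting parameters are all assumptions made for this example.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: binary spike patterns, trials x neurons.
N = 3
data = (rng.random((500, N)) < np.array([0.2, 0.5, 0.7])).astype(float)

# Empirical statistics the maximum entropy model is constrained to match:
mean_emp = data.mean(axis=0)            # firing rates <s_i>
corr_emp = (data.T @ data) / len(data)  # pairwise correlations <s_i s_j>

# All 2^N binary patterns, enumerated exactly (feasible only for small N).
patterns = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

h = np.zeros(N)       # biases ("fields"), one per neuron
J = np.zeros((N, N))  # pairwise couplings

def model_stats(h, J):
    """Exact means and correlations under P(s) proportional to
    exp(h.s + 0.5 * s.J.s), computed by summing over all patterns."""
    energy = patterns @ h + 0.5 * np.einsum("ki,ij,kj->k", patterns, J, patterns)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    mean = p @ patterns
    corr = patterns.T @ (p[:, None] * patterns)
    return mean, corr

# Gradient ascent on the log-likelihood: nudge parameters until the
# model's statistics match the empirical ones.
for _ in range(2000):
    mean_m, corr_m = model_stats(h, J)
    h += 0.1 * (mean_emp - mean_m)
    J += 0.1 * (corr_emp - corr_m)
    np.fill_diagonal(J, 0.0)  # self-couplings are redundant with h
```

For realistically sized populations the exact enumeration above is infeasible, and practical work relies on sampling-based or approximate inference; this toy version only shows the structure of the model and the moment-matching logic.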