



Since November 2013, I have been an IST fellow at the Institute of Science and Technology (IST) Austria, based in the lab of Gasper Tkacik. Next June I will be moving to the Center for Neural Science (CNS) and the Center for Data Science (CDS) at NYU as an Assistant Professor. Previously, I obtained a Ph.D. in Computational Neuroscience at the Frankfurt Institute for Advanced Studies (in the lab of Jochen Triesch). As a postdoc, I worked in the Computational and Biological Learning Lab at Cambridge University (with Mate Lengyel) and, briefly, in the Group for Neural Theory at ENS, Paris (with Sophie Deneve).

In broad terms, my research focuses on learning and memory at the level of neural circuits in the brain. I use a combination of theoretical modelling, computer simulations, and data analysis to study how different plasticity mechanisms subserve these functions. On the theory side, I construct probabilistic models that describe biologically relevant computations and then use techniques borrowed from machine learning to work out how neural circuits could approximate the optimal solution to these tasks. On the data analysis side, I build statistical models describing the joint activity of neurons recorded experimentally, then use information-theoretic measures to assess how this activity is shaped by learning. The results of this work follow several major themes:
Recent and ongoing projects:

- Statistical description (maximum entropy models) of the activity of neural populations in area CA1 of the hippocampus; in collaboration with Gasper Tkacik and Jozsef Csicsvari.
- Circuit- and systems-level solutions for effective autoassociative memory recall; in collaboration with Mate Lengyel and Peter Dayan; see our NIPS 2011 and 2013 papers and [Savin et al, 2014].
- Distributed codes for sampling from multidimensional, real-valued distributions; in collaboration with Sophie Deneve; contributed talk at Cosyne 2014 and NIPS 2014 spotlight.
- Signatures of statistically optimal learning in neural activity; in collaboration with Jozsef Fiser and Mate Lengyel; see our recent technical report on arXiv, our Letter to the Editor in reply to Okun et al, 2012, and the abstracts for our two recent SfN talks.
- The role of homeostatic mechanisms in learning efficient representations of sensory inputs [Savin et al, 2010], [Keck et al, 2012].
- Reward-dependent learning in PFC: how task constraints shape neural representations in working memory circuits; contributed talk at Cosyne 2009, [Savin and Triesch, 2014].
For more details, have a look at my publications. 

Last modified: 20 June 2016 