Comparison of objective functions for estimating linear-nonlinear models - Quantitative Biology > Neurons and Cognition





Abstract: This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of a family of objective functions, Rényi divergences of different orders. We show that maximizing one of them, the Rényi divergence of order 2, is equivalent to least-square fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with the Rényi divergence of order 1, also known as the Kullback-Leibler divergence. This corresponds to finding relevant dimensions by maximizing mutual information. We numerically test how these optimization schemes perform in the regime of low signal-to-noise ratio (small number of spikes and increasing neural noise) for model visual neurons. We find that optimization schemes based on either least-square fitting or information maximization perform well even when the number of spikes is small. Information maximization provides slightly, but significantly, better reconstructions than least-square fitting. This makes the problem of finding relevant dimensions, together with the problem of lossy compression, one of the examples where information-theoretic measures are no more data limited than those derived from least squares.
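For concreteness, the quantities the abstract refers to can be sketched in standard notation (the symbols r, f, v_k, s, x, and alpha below are illustrative assumptions, not taken verbatim from the paper). In the linear-nonlinear model, the firing rate depends on the stimulus s only through its projections onto a small set of relevant dimensions:

\[
r(s) = f(v_1 \cdot s, \ldots, v_K \cdot s),
\]

and the objective functions are Rényi divergences between the spike-conditioned and prior distributions of a projection x = v \cdot s:

\[
D_\alpha = \frac{1}{\alpha - 1} \, \log \int P(x) \left[ \frac{P(x \mid \mathrm{spike})}{P(x)} \right]^{\alpha} dx .
\]

Under this sketch, alpha = 2 yields the objective equivalent to least-square fitting, while the limit alpha -> 1 recovers the Kullback-Leibler divergence, i.e., finding the relevant dimensions by maximizing mutual information.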



Author: Tatyana O. Sharpee

Source: https://arxiv.org/






