TY - JOUR

T1 - Multiplicative neural noise can favor an independent components representation of sensory input

AU - Gottschalk, Allan

AU - Sexton, Matthew G.

AU - Roschke, Guilherme

N1 - Funding Information:
The authors wish to thank Robert Jones for his assistance with some of the preliminary computer calculations and Max Mintz for several useful discussions. This work was supported by the National Institutes of Health under grant EY10915.

PY - 2004/11

Y1 - 2004/11

N2 - Arguments have been advanced to support the role of principal components (e.g., Karhunen-Loève, eigenvector) and independent components transformations in early sensory processing, particularly for color and spatial vision. Although the concept of redundancy reduction has been used to justify a principal components transformation, these transformations per se do not necessarily confer benefits with respect to information transmission in information channels with additive, independent, identically distributed Gaussian noise. Here, it is shown that when a more realistic source of multiplicative neural noise is present in the information channel, there are quantitative benefits to a principal components or independent components representation for Gaussian and non-Gaussian inputs, respectively. Such a representation can convey a larger quantity of information while using fewer spikes. The nature and extent of this benefit depend primarily on the probability distribution of the inputs and their relative power. In the case of Gaussian input, the greater the disparity in power between dimensions, the greater the advantage of a principal components representation. For non-Gaussian input distributions with super-Gaussian kurtosis, an independent components representation is similarly advantageous. This advantage holds even for input distributions with equal power, since the resulting density is still rotationally asymmetric. However, sub-Gaussian input distributions can lead to situations where maximally correlated inputs are the most advantageous with respect to transmitting the greatest quantity of information with the fewest spikes.

UR - http://www.scopus.com/inward/record.url?scp=9744221224&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=9744221224&partnerID=8YFLogxK

U2 - 10.1088/0954-898X_15_4_004

DO - 10.1088/0954-898X_15_4_004

M3 - Article

C2 - 15600235

AN - SCOPUS:9744221224

VL - 15

SP - 291

EP - 311

JO - Network: Computation in Neural Systems

JF - Network: Computation in Neural Systems

SN - 0954-898X

IS - 4

ER -