Nonlinear modeling of auditory-nerve rate responses to wideband stimuli

Eric D. Young, Barbara M. Calhoun

Research output: Contribution to journal › Article › peer-review



The spectral selectivity of auditory-nerve fibers was characterized by a method based on responses to random-spectral-shape stimuli. The method models the average discharge rate of fibers for steady stimuli and is based on responses to ≈100 noise-like stimuli with pseudorandom spectral levels in 1/8- or 1/16-octave frequency bins. The model assumes that rate is determined by a linear weighting of the spectrum plus a second-order weighting of all pairs of spectrum values within a certain frequency range of best frequency. The method allows prediction of rate responses to stimuli with arbitrary wideband spectral shapes, thus providing a direct test of the degree of linearity of spectral processing. Auditory-nerve fibers are shown to rely mainly on linear weighting of the stimulus spectrum; however, significant second-order terms are present and are important in predicting responses to random-spectral-shape stimuli, although not for predicting responses to noise filtered with cat head-related transfer functions. The second-order terms weight the products of levels at identical frequencies positively and the products of levels at different frequencies negatively. As such, they model both curvature in the rate-versus-level function and suppressive interactions between different frequency components. The first- and second-order characterizations derived with this method provide a measure of higher-order nonlinearities in neurons, albeit without providing information about temporal characteristics.
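The rate model summarized in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the baseline rate `r0`, the first-order weight vector `w1`, and the second-order weight matrix `w2` are hypothetical placeholders standing in for quantities the method would estimate from the ≈100 random-spectral-shape responses.

```python
import numpy as np

def predicted_rate(spectrum_db, r0, w1, w2):
    """Second-order model of average discharge rate for a steady stimulus.

    spectrum_db : spectral levels (dB) in 1/8- or 1/16-octave frequency bins
    r0          : baseline rate (hypothetical reference value)
    w1          : first-order (linear) weight per frequency bin
    w2          : second-order weights over all pairs of bins; positive
                  diagonal terms model curvature in the rate-versus-level
                  function, negative off-diagonal terms model suppressive
                  interactions between different frequency components
    """
    s = np.asarray(spectrum_db, dtype=float)
    linear = w1 @ s          # linear weighting of the spectrum
    second = s @ w2 @ s      # weighting of all pairs of spectrum values
    return r0 + linear + second

# Illustrative weights (assumed, not fitted): linear weights peaked near a
# nominal "best frequency" bin; second-order matrix with positive diagonal
# and negative off-diagonal entries, matching the sign pattern reported.
n = 8
w1 = np.exp(-0.5 * ((np.arange(n) - 3) / 1.5) ** 2)
w2 = 0.02 * np.eye(n) - 0.005 * (np.ones((n, n)) - np.eye(n))

# A flat reference spectrum (0 dB in every bin) returns the baseline rate.
rate = predicted_rate(np.zeros(n), r0=50.0, w1=w1, w2=w2)
```

Predicting the response to an arbitrary wideband spectrum then reduces to one evaluation of this quadratic form, which is what allows the linearity of spectral processing to be tested directly against measured rates.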

Original language: English (US)
Pages (from-to): 4441-4454
Number of pages: 14
Journal: Journal of Neurophysiology
Issue number: 6
State: Published - Dec 1 2005

ASJC Scopus subject areas

  • Neuroscience (all)
  • Physiology

