Dynamics and architecture for neural computation

Fernando J. Pineda

Research output: Contribution to journal › Article › peer-review

104 Scopus citations


Useful computation can be performed by systematically exploiting the phenomenology of nonlinear dynamical systems. Two dynamical phenomena are isolated into primitive architectural components that perform the operations of continuous nonlinear transformation and autoassociative recall. Backpropagation techniques for programming the architectural components are presented in a formalism appropriate for a collective nonlinear dynamical system. It is shown that conventional recurrent backpropagation cannot store multiple patterns in an associative memory that starts out with an insufficient number of point attractors, and that a modified algorithm solves this problem by introducing new attractors near the to-be-stored patterns. Two primitive components are assembled into an elementary machine and trained to perform pattern recognition that is invariant with respect to sufficiently small arbitrary transformations of the input pattern. The machine realizes modular learning, since error signals do not propagate across the boundaries of the components.
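The recurrent backpropagation referred to in the abstract trains a network whose output is a fixed point of its dynamics: the forward pass relaxes the state to a fixed point of x = g(Wx + I), and the error gradient is obtained by relaxing a linearized adjoint system to its own fixed point. The following is a minimal numerical sketch of that idea, not the paper's exact formulation: the paper works with continuous-time dynamics, whereas this sketch uses simple discrete relaxation, and the network size, logistic nonlinearity, learning rate, and target vector are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(u):
    return 1.0 / (1.0 + np.exp(-u))      # logistic nonlinearity (an assumed choice)

def relax_forward(W, I, iters=300):
    """Relax x <- g(W x + I) toward a fixed point x* (discrete stand-in
    for the continuous-time settling dynamics)."""
    x = np.zeros(len(I))
    for _ in range(iters):
        x = g(W @ x + I)
    return x

def relax_error(W, gp, J, iters=300):
    """Relax the adjoint system z <- W^T (g'(u) * z) + J to its fixed
    point; J is the output error injected at the units."""
    z = np.zeros_like(J)
    for _ in range(iters):
        z = W.T @ (gp * z) + J
    return z

def rbp_step(W, I, target, lr=0.2):
    """One recurrent-backpropagation weight update; returns (W, error)."""
    x = relax_forward(W, I)
    u = W @ x + I
    gp = g(u) * (1.0 - g(u))             # g'(u) for the logistic
    J = target - x
    z = relax_error(W, gp, J)
    # At the fixed point, -dE/dW_rs = z_r g'(u_r) x_s, so gradient
    # descent on E = 0.5 * sum(J^2) is the outer-product update below.
    W = W + lr * np.outer(z * gp, x)
    return W, float(0.5 * np.sum(J**2))

n = 4
W = 0.1 * rng.standard_normal((n, n))    # small initial weights keep relaxation stable
I = rng.standard_normal(n)               # fixed external input
target = np.array([0.9, 0.1, 0.8, 0.2])  # illustrative target fixed point

errs = []
for _ in range(500):
    W, e = rbp_step(W, I, target)
    errs.append(e)
print(f"error: {errs[0]:.4f} -> {errs[-1]:.6f}")
```

Note that the relaxations only converge while the linearized dynamics are contracting; the paper's point about insufficient attractors concerns exactly the regime where naive fixed-point training of this kind breaks down.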

Original language: English (US)
Pages (from-to): 216-245
Number of pages: 30
Journal: Journal of Complexity
Issue number: 3
State: Published - Sep 1988

ASJC Scopus subject areas

  • Algebra and Number Theory
  • Statistics and Probability
  • Numerical Analysis
  • Mathematics (all)
  • Control and Optimization
  • Applied Mathematics


