Generalization of back-propagation to recurrent neural networks

Research output: Contribution to journal › Article

Abstract

An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the back-propagation rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network resembles the master/slave network of Lapedes and Farber but is architecturally simpler.
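The recurrent back-propagation scheme the abstract describes can be sketched as follows. This is a minimal NumPy illustration under assumed dynamics dx/dt = -x + W g(x) + I: the network relaxes to a fixed point, an adjoint error system relaxes through the transposed weights, and the weights take a gradient step. The unit count, sigmoid choice, single-output error, and learning rate are illustrative assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relax(f, x0, steps=200, dt=0.1):
    """Euler-integrate dx/dt = f(x) until it settles near a fixed point."""
    x = x0
    for _ in range(steps):
        x = x + dt * f(x)
    return x

# Illustrative network: n graded units, asymmetric weights W, external input I.
rng = np.random.default_rng(0)
n = 5
W = 0.1 * rng.standard_normal((n, n))  # asymmetric in general
I = rng.standard_normal(n)

# Forward relaxation to a fixed point x* of dx/dt = -x + W g(x) + I.
x_star = relax(lambda x: -x + W @ sigmoid(x) + I, np.zeros(n))

# Error on a single (assumed) output unit: drive x*[0] toward a target.
target = 0.5
e = np.zeros(n)
e[0] = target - x_star[0]

# Adjoint relaxation: the error propagates backward through the transposed
# weights, dz/dt = -z + g'(x*) * (W^T z) + e, settling at z*.
gp = sigmoid(x_star) * (1.0 - sigmoid(x_star))  # g'(x*)
z_star = relax(lambda z: -z + gp * (W.T @ z) + e, np.zeros(n))

# Gradient-descent weight update: dE/dW_rs = -z*_r g(x*_s).
eta = 0.2
W += eta * np.outer(z_star, sigmoid(x_star))
```

After this update, re-relaxing the network with the new weights moves the output unit's fixed-point value closer to the target; the adjoint relaxation plays the role that layer-by-layer error propagation plays in the feed-forward rule.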

Original language: English (US)
Pages (from-to): 2229-2232
Number of pages: 4
Journal: Physical Review Letters
Volume: 59
Issue number: 19
DOIs
State: Published - Jan 1 1987

ASJC Scopus subject areas

  • Physics and Astronomy (all)
