Fast learning in feedforward neural networks by migrating hidden unit outputs

Isaac N. Bankman, David W. Aha

Research output: Contribution to conference › Paper › peer-review

Abstract

We introduce the MIGRATION algorithm, a fast learning algorithm that provides target values for the outputs of hidden units in pattern classification applications. Classification in feedforward artificial neural networks requires separating the outputs of the hidden units with the output layer's perceptron. To do this, the standard BACKPROPAGATION training algorithm employs a general-purpose gradient descent procedure to assign values to the network's weights. Competing algorithms increase convergence rates by replacing gradient descent with a faster algorithm or by using a localized weight-updating scheme. The MIGRATION algorithm instead forcefully migrates the values of the hidden units' outputs to facilitate separation by the output perceptron. It then reassigns the network's weights by applying the Wiener solution to obtain this separation. We compare the computational complexity of our algorithm with that of BACKPROPAGATION, and present preliminary empirical results showing that MIGRATION requires significantly less time to converge than does BACKPROPAGATION for some datasets.
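The closed-form weight reassignment step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes the hidden units' outputs have already been migrated to linearly separable positions, and it uses a least-squares fit (the Wiener solution) to recover the output perceptron's weights. The data values are invented for the example.

```python
import numpy as np

# Rows: migrated hidden-unit output vectors for six training patterns
# (illustrative values, not from the paper).
H = np.array([
    [0.90, 0.10],
    [0.80, 0.20],
    [0.85, 0.15],   # class +1 patterns
    [0.10, 0.90],
    [0.20, 0.80],
    [0.15, 0.85],   # class -1 patterns
])
t = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])  # target class labels

# Augment with a bias input and solve for the output weights in closed
# form via least squares (the Wiener solution).
H_aug = np.hstack([H, np.ones((H.shape[0], 1))])
w, *_ = np.linalg.lstsq(H_aug, t, rcond=None)

# The output perceptron classifies by the sign of its activation; with
# separable migrated outputs, the fitted weights classify all patterns.
predictions = np.sign(H_aug @ w)
```

Because the migrated outputs are linearly separable by construction, the single least-squares solve replaces the iterative gradient-descent updates that BACKPROPAGATION would need for the output layer.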

Original language: English (US)
Pages: 179-184
Number of pages: 6
State: Published - Dec 1 1992
Event: Proceedings of the 1992 Artificial Neural Networks in Engineering, ANNIE'92 - St. Louis, MO, USA
Duration: Nov 15 1992 - Nov 18 1992


ASJC Scopus subject areas

  • Software

