## Abstract

We introduce the MIGRATION algorithm, a fast learning algorithm that provides target values for the outputs of hidden units in pattern classification applications. Classification in feedforward artificial neural networks requires separating the outputs of the hidden units with the output layer's perceptron. To do this, the standard BACKPROPAGATION training algorithm employs a general-purpose gradient descent procedure to assign values to the network's weights. Competing algorithms increase convergence rates by replacing gradient descent with a faster algorithm or by using a localized weight-updating scheme. The MIGRATION algorithm instead forcefully migrates the values of the hidden units' outputs to facilitate separation by the output perceptron. It then reassigns the network's weights by applying the Wiener solution to obtain this separation. We compare the computational complexity of our algorithm with that of BACKPROPAGATION, and present preliminary empirical results showing that MIGRATION requires significantly less time to converge than BACKPROPAGATION on some datasets.
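The Wiener-solution step mentioned in the abstract can be sketched as a least-squares fit of the output-layer weights to the migrated hidden-unit outputs. The following is a minimal illustrative sketch, not the authors' implementation; all names, shapes, and encodings (e.g. the ±1 class targets and the bias column) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 8 patterns, 3 hidden units whose outputs have been
# "migrated" to values intended to be linearly separable.
H = rng.uniform(-1.0, 1.0, size=(8, 3))
H = np.hstack([H, np.ones((8, 1))])   # append a bias column (an assumption)

# Two-class targets encoded as +/-1 for a single output perceptron.
Y = np.sign(rng.standard_normal((8, 1)))

# Wiener (least-squares) solution for the output-layer weights:
# W minimizes || H @ W - Y ||^2, computed here via the pseudoinverse.
W = np.linalg.pinv(H) @ Y

residual = np.linalg.norm(H @ W - Y)
```

In this sketch the gradient-descent weight update of BACKPROPAGATION is replaced, for the output layer only, by a closed-form linear regression against the migrated hidden-unit targets.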

| Original language | English (US) |
|---|---|
| Pages | 179-184 |
| Number of pages | 6 |
| State | Published - Dec 1 1992 |
| Event | Proceedings of the 1992 Artificial Neural Networks in Engineering, ANNIE'92 - St. Louis, MO, USA. Duration: Nov 15 1992 → Nov 18 1992 |

### Other

| Other | Proceedings of the 1992 Artificial Neural Networks in Engineering, ANNIE'92 |
|---|---|
| City | St. Louis, MO, USA |
| Period | 11/15/92 → 11/18/92 |

## ASJC Scopus subject areas

- Software