Neural networks that learn state space trajectories by 'Hebbian' rule

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Summary form only given, as follows. A neural network structure has been proposed which can learn state space trajectories (sequential state transitions) by a Hebbian-like rule, but without resorting to time-delayed synaptic connections. The main idea is to use two Hopfield networks, each of which stabilizes its own memories while it drives the other network into state transition. The dynamics of the network was considered. As an emergent property, the state transitions of all individual neurons are synchronous. The learning rate of the network was estimated.
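
The summary gives no equations or parameters, so the following is only a minimal numerical sketch of the architecture it describes, assuming one plausible reading: two Hopfield-style networks, A and B, each store a sequence of random patterns with the standard Hebbian outer-product rule (which makes the patterns attractors), and Hebbian cross-connections, shifted by one step on the B-to-A side, let each network's current pattern push the other network toward its next stored pattern. The network size, sequence length, and coupling gain lam below are illustrative choices, not values from the paper.

# Minimal sketch (an interpretation only; the abstract gives no equations or parameters).
# Two Hopfield-style networks, A and B. Hebbian autoassociative weights stabilize each
# network's own stored patterns; Hebbian cross-connections (with a one-step shift on the
# B -> A side) make each network drive the other through the stored sequence, without
# any time-delayed synaptic connections.
import numpy as np

rng = np.random.default_rng(0)
N = 200                                  # neurons per network (assumed)
P = 4                                    # patterns per sequence (assumed)

XA = rng.choice([-1, 1], size=(P, N))    # A's sequence of +/-1 patterns
XB = rng.choice([-1, 1], size=(P, N))    # B's sequence of +/-1 patterns

# Within-network Hebbian (outer-product) weights: make each stored pattern an attractor.
WA = XA.T @ XA / N
WB = XB.T @ XB / N
np.fill_diagonal(WA, 0.0)
np.fill_diagonal(WB, 0.0)

# Cross-network Hebbian weights: A's k-th pattern pushes B toward its k-th pattern,
# while B's k-th pattern pushes A toward its (k+1)-th pattern, so transitions alternate.
C_ab = XB.T @ XA / N                     # input to B from A
C_ba = XA[1:].T @ XB[:-1] / N            # input to A from B, shifted one step ahead
lam = 1.5                                # cross-coupling gain (assumed)

a, b = XA[0].copy(), XB[0].copy()        # start both networks on their first pattern
for t in range(10):
    # Synchronous threshold updates: each network feels its own attractor plus the
    # other network's push toward the next pattern in the sequence.
    ha = WA @ a + lam * (C_ba @ b)
    hb = WB @ b + lam * (C_ab @ a)
    a = np.where(ha >= 0, 1, -1)
    b = np.where(hb >= 0, 1, -1)
    oa, ob = XA @ a / N, XB @ b / N      # overlaps with the stored patterns
    print(f"t={t}: A~pattern {oa.argmax()} ({oa.max():.2f}), "
          f"B~pattern {ob.argmax()} ({ob.max():.2f})")

With these assumed couplings the two networks take turns switching patterns, and all neurons of the switching network flip within the same synchronous update, which is one way to read the abstract's remark about synchronous state transitions; whether this matches the paper's actual dynamics cannot be confirmed from the summary alone.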

Original language: English (US)
Title of host publication: Proceedings. IJCNN - International Joint Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Number of pages: 1
ISBN (Print): 0780301641
State: Published - Jan 1 1992
Externally published: Yes
Event: International Joint Conference on Neural Networks - IJCNN-91-Seattle - Seattle, WA, USA
Duration: Jul 8 1991 - Jul 12 1991

Publication series

Name: Proceedings. IJCNN - International Joint Conference on Neural Networks

Other

Other: International Joint Conference on Neural Networks - IJCNN-91-Seattle
City: Seattle, WA, USA
Period: 7/8/91 - 7/12/91

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Zhang, K. (1992). Neural networks that learn state space trajectories by 'Hebbian' rule. In Anon (Ed.), Proceedings. IJCNN - International Joint Conference on Neural Networks (Proceedings. IJCNN - International Joint Conference on Neural Networks). Publ by IEEE.

@inproceedings{d1b9feeb4351405591783f57db866f2d,
  title     = "Neural networks that learn state space trajectories by 'Hebbian' rule",
  abstract  = "Summary form only given, as follows. A neural network structure has been proposed which can learn state space trajectories (sequential state transitions) by a Hebbian-like rule, but without resorting to time-delayed synaptic connections. The main idea is to use two Hopfield networks, each of which stabilizes its own memories while it drives the other network into state transition. The dynamics of the network was considered. As an emergent property, the state transitions of all individual neurons are synchronous. The learning rate of the network was estimated.",
  author    = "Kechen Zhang",
  year      = "1992",
  month     = "1",
  day       = "1",
  language  = "English (US)",
  isbn      = "0780301641",
  series    = "Proceedings. IJCNN - International Joint Conference on Neural Networks",
  publisher = "Publ by IEEE",
  editor    = "Anon",
  booktitle = "Proceedings. IJCNN - International Joint Conference on Neural Networks",
}

TY - GEN

T1 - Neural networks that learn state space trajectories by 'Hebbian' rule

AU - Zhang, Kechen

PY - 1992/1/1

Y1 - 1992/1/1

N2 - Summary form only given, as follows. A neural network structure has been proposed which can learn state space trajectories (sequential state transitions) by a Hebbian-like rule, but without resorting to time-delayed synaptic connections. The main idea is to use two Hopfield networks, each of which stabilizes its own memories while it drives the other network into state transition. The dynamics of the network was considered. As an emergent property, the state transitions of all individual neurons are synchronous. The learning rate of the network was estimated.

AB - Summary form only given, as follows. A neural network structure has been proposed which can learn state space trajectories (sequential state transitions) by a Hebbian-like rule, but without resorting to time-delayed synaptic connections. The main idea is to use two Hopfield networks, each of which stabilizes its own memories while it drives the other network into state transition. The dynamics of the network was considered. As an emergent property, the state transitions of all individual neurons are synchronous. The learning rate of the network was estimated.

UR - http://www.scopus.com/inward/record.url?scp=0026745693&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0026745693&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0026745693

SN - 0780301641

T3 - Proceedings. IJCNN - International Joint Conference on Neural Networks

BT - Proceedings. IJCNN - International Joint Conference on Neural Networks

A2 - Anon

PB - Publ by IEEE

ER -