TY - JOUR
T1 - How to compress sequential memory patterns into periodic oscillations
T2 - General reduction rules
AU - Zhang, Kechen
N1 - Publisher Copyright:
© 2014 Massachusetts Institute of Technology.
PY - 2014/8/13
Y1 - 2014/8/13
N2 - A neural network with symmetric reciprocal connections always admits a Lyapunov function, whose minima correspond to the memory states stored in the network. Networks with suitable asymmetric connections can store and retrieve a sequence of memory patterns, but the dynamics of these networks cannot be characterized as readily as those of the symmetric networks due to the lack of established general methods. Here, a reduction method is developed for a class of asymmetric attractor networks that store sequences of activity patterns as associative memories, as in a Hopfield network. The method projects the original activity pattern of the network to a low-dimensional space such that sequential memory retrievals in the original network correspond to periodic oscillations in the reduced system. The reduced system is self-contained and provides quantitative information about the stability and speed of sequential memory retrievals in the original network. The time evolution of the overlaps between the network state and the stored memory patterns can also be determined from extended reduced systems. The reduction procedure can be summarized by a few reduction rules, which are applied to several network models, including coupled networks and networks with time-delayed connections, and the analytical solutions of the reduced systems are confirmed by numerical simulations of the original networks. Finally, a local learning rule that provides an approximation to the connection weights involving the pseudoinverse is also presented.
AB - A neural network with symmetric reciprocal connections always admits a Lyapunov function, whose minima correspond to the memory states stored in the network. Networks with suitable asymmetric connections can store and retrieve a sequence of memory patterns, but the dynamics of these networks cannot be characterized as readily as those of the symmetric networks due to the lack of established general methods. Here, a reduction method is developed for a class of asymmetric attractor networks that store sequences of activity patterns as associative memories, as in a Hopfield network. The method projects the original activity pattern of the network to a low-dimensional space such that sequential memory retrievals in the original network correspond to periodic oscillations in the reduced system. The reduced system is self-contained and provides quantitative information about the stability and speed of sequential memory retrievals in the original network. The time evolution of the overlaps between the network state and the stored memory patterns can also be determined from extended reduced systems. The reduction procedure can be summarized by a few reduction rules, which are applied to several network models, including coupled networks and networks with time-delayed connections, and the analytical solutions of the reduced systems are confirmed by numerical simulations of the original networks. Finally, a local learning rule that provides an approximation to the connection weights involving the pseudoinverse is also presented.
UR - http://www.scopus.com/inward/record.url?scp=84929247749&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84929247749&partnerID=8YFLogxK
U2 - 10.1162/NECO_a_00618
DO - 10.1162/NECO_a_00618
M3 - Article
C2 - 24877729
AN - SCOPUS:84929247749
SN - 0899-7667
VL - 26
SP - 1542
EP - 1599
JO - Neural Computation
JF - Neural Computation
IS - 8
ER -