Memory and Learning of Sequential Patterns by Nonmonotone Neural Networks
Masahiko Morita
Abstract:
Conventional neural network models for temporal association generally
do not work well in the absence of synchronizing neurons, because
their dynamical properties are fundamentally unsuited to storing
sequential patterns, no matter what storage or learning algorithm is
used. The present article describes a nonmonotone neural network
(NNN) model in which sequential patterns are stored by being embedded
in a trajectory attractor of the dynamical system and are recalled
stably and smoothly without synchronization: the network state moves
successively along the trajectory. A simple and natural learning
algorithm for the NNN is also presented, in which one need only vary
the input pattern gradually and modify the synaptic weights according
to a kind of covariance rule; the network state then follows slightly
behind the input pattern, and its trajectory grows into an attractor
after a small number of repetitions.
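
The abstract describes the scheme only in words; the following is a
minimal numerical sketch of it, not the paper's actual equations. It
assumes an illustrative nonmonotone output function, simple Euler
dynamics tau du/dt = -u + W f(u) + z, a symmetric Hebbian product
standing in for the "kind of covariance rule" (reasonable here
because the random bipolar patterns are zero-mean), and arbitrary
parameter values.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 300        # neurons
    P = 5          # patterns in the stored sequence
    STEPS = 50     # input-morphing steps between consecutive patterns
    TAU = 5.0      # time constant of the state dynamics (in steps)
    TAU_W = 200.0  # slower time constant of the weight dynamics
    ALPHA = 2.0    # input / Hebbian strength (illustrative)
    EPOCHS = 10    # "small number of repetitions" of the sequence

    def f(u, c=50.0, cp=15.0, h=0.5, kappa=-1.0):
        # Nonmonotone output: steep tanh-like rise, then driven back
        # down (toward kappa = -1) once |u| exceeds h. The shape and
        # constants are illustrative stand-ins, not the paper's.
        g = np.exp(np.clip(cp * (np.abs(u) - h), -30.0, 30.0))
        return np.tanh(0.5 * c * u) * (1.0 + kappa * g) / (1.0 + g)

    S = rng.choice([-1.0, 1.0], size=(P, N))   # random bipolar sequence
    W = np.zeros((N, N))

    # Learning: drag the state with a gradually morphing input while
    # applying a Hebbian/covariance-style update with decay (assumed
    # form of the rule); the state trails slightly behind the input.
    for epoch in range(EPOCHS):
        u = 0.5 * S[0]
        for k in range(P - 1):
            for t in range(STEPS):
                lam = (t + 1.0) / STEPS
                z = (1.0 - lam) * S[k] + lam * S[k + 1]  # gradual change
                y = f(u)
                u += (1.0 / TAU) * (-u + W @ y + ALPHA * z)
                W += (1.0 / TAU_W) * ((ALPHA / N) * np.outer(y, y) - W)

    # Recall: cue briefly with the first pattern, then run with no
    # input and watch the overlap m_k = (1/N) S[k].f(u) sweep through
    # the sequence.
    u = 0.5 * S[0]
    for t in range(30):                                  # brief cue
        u += (1.0 / TAU) * (-u + W @ f(u) + ALPHA * S[0])
    best = np.zeros(P)
    for t in range(P * STEPS * 2):                       # free run
        u += (1.0 / TAU) * (-u + W @ f(u))
        best = np.maximum(best, S @ f(u) / N)
    print("peak overlap with each pattern during free recall:",
          np.round(best, 2))

How well the trained trajectory attracts nearby states depends on the
nonmonotone shape and the parameter choices above; the paper's point
is that the nonmonotonicity, not the particular learning rule, is
what makes such trajectory attractors possible.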
Keywords:
Sequential pattern memory, Temporal association, Nonmonotone neural
networks, Nonmonotone dynamics, Trajectory attractors, Spatiotemporal
pattern learning, Covariance rule.