Conventional neural networks have a limited ability to memorize
temporal patterns, unless they are provided with a synchronizing
mechanism or some special structure such as delay circuits.
The present paper shows that, by employing nonmonotone dynamics,
continuous-time neural networks can memorize almost arbitrary
pattern sequences.
In this model, the state of the network changes gradually
from pattern to pattern.
Numerical experiments show that the trajectory along the stored
sequence can be regarded as a dynamic attractor with a large basin.
How such attractors are formed is also discussed.
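The kind of model described above can be illustrated with a toy simulation. The sketch below is a hedged illustration under stated assumptions, not the paper's exact model: the activation function `nonmonotone`, the asymmetric Hebb rule in `sequence_weights`, and all parameters (`h`, `tau`, `dt`, pattern count and size) are illustrative choices, chosen only to show the general scheme of continuous-time dynamics driven by a nonmonotone output function.

```python
import numpy as np

def nonmonotone(u, h=1.0):
    # Illustrative nonmonotone activation (an assumption, not the paper's
    # exact function): behaves like tanh(u) for |u| < h, then the output
    # reverses sign for |u| > h, which makes the function nonmonotone.
    return np.tanh(u) * (1.0 - 2.0 / (1.0 + np.exp(-4.0 * (np.abs(u) - h))))

def sequence_weights(patterns):
    # Asymmetric Hebb rule: each stored pattern is associated with its
    # (cyclic) successor, so the sequence is encoded in the weights.
    n = patterns.shape[1]
    succ = np.roll(patterns, -1, axis=0)
    return succ.T @ patterns / n

def recall(W, u0, steps=400, dt=0.05, tau=1.0):
    # Euler integration of the continuous-time dynamics
    #     tau du/dt = -u + W f(u),
    # under which the state changes gradually from pattern to pattern.
    u = u0.copy()
    for _ in range(steps):
        u += (dt / tau) * (-u + W @ nonmonotone(u))
    return u

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(4, 200))  # 4 random +/-1 patterns
W = sequence_weights(patterns)
u = recall(W, patterns[0].copy())
print(np.all(np.isfinite(u)))  # the trajectory stays bounded
```

Because the activation is bounded and the `-u` leak term damps the state, the trajectory remains bounded while the asymmetric weights push it along the stored order; this is only a structural sketch of the scheme, not a demonstration of the paper's capacity results.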
Keywords: Neural networks, Associative memory, Nonmonotone dynamics,
Sequential pattern, Dynamic attractor.