Abstract. Gradient recurrent high-order neural networks (GRHONNs) have found wide applicability in optimization, associative memory, adaptive signal processing, and system identification. In this paper we show that these neural networks are asymptotically stable dynamical systems and, moreover, that their solutions remain stable under either deterministic or stochastic disturbances. We also prove that GRHONNs are dense in the space of continuous dynamical systems, even though their vector fields do not satisfy the hypotheses of the Stone–Weierstrass theorem. The significance of these theoretical results for associative memory and optimization, and for the fabrication of general-purpose, large-scale hardware implementations, is discussed.