On global asymptotic stability of fully connected recurrent neural networks
conference contribution
posted on 2010-01-15, 09:50, authored by Danilo P. Mandic, Jonathon Chambers, Milorad M. Bozic

Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. Existence, convergence, and robustness of such a process are analyzed. This is undertaken based upon the contraction mapping theorem (CMT) and the corresponding fixed point iteration (FPI). Upper bounds for such a process are shown to be the conditions of convergence for a commonly analyzed RNN with a linear state dependence.
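The relaxation process described in the abstract can be sketched as a fixed point iteration whose convergence follows from the contraction mapping theorem. Below is a minimal, hypothetical illustration (not the paper's actual formulation): the state update x ← tanh(Wx + b), where a sufficient contraction condition is that the norm of W times the maximum slope of the nonlinearity (1 for tanh) is below one. The function name `fpi_relaxation` and the chosen W, b values are assumptions for demonstration only.

```python
import numpy as np

def fpi_relaxation(W, b, x0, tol=1e-10, max_iter=1000):
    """Fixed point iteration x <- tanh(W x + b), stopping when
    successive iterates differ by less than tol (Euclidean norm)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_next = np.tanh(W @ x + b)
        if np.linalg.norm(x_next - x) < tol:
            return x_next, k + 1
        x = x_next
    return x, max_iter

# Example weights: spectral norm of W is below 1, so the map is a
# contraction and the CMT guarantees a unique, globally attractive
# fixed point, independent of the initial state.
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])
b = np.array([0.5, -0.1])

x_star, iters = fpi_relaxation(W, b, x0=np.zeros(2))
x_star2, _ = fpi_relaxation(W, b, x0=np.array([5.0, -5.0]))
```

Starting from two very different initial states, both runs converge to the same fixed point, which is the GAS behaviour the paper establishes conditions for.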
History
School
- Mechanical, Electrical and Manufacturing Engineering
Citation
MANDIC, D.P., CHAMBERS, J.A. and BOZIC, M.M., 2000. On global asymptotic stability of fully connected recurrent neural networks. IN: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '00), Istanbul, 5-9 June, Vol. 6, pp. 3406-3409.

Publisher
© IEEE

Version
- VoR (Version of Record)
Publication date
2000

Notes
This is a conference paper [© IEEE]. It is also available at: http://ieeexplore.ieee.org/ Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE.

ISBN
0780362934

Language
- en