A nonlinear black-box structure represented by an affine three-layered state-space neural network with exogenous inputs is considered for nonlinear system identification. Global stability conditions are derived based on Lyapunov's stability theory and on the contraction mapping theorem. The problem of structural complexity is also addressed by defining an upper bound on the number of hidden-layer neurons, expressed as the number of dominant singular values of the oblique subspace projection of data-driven Hankel matrices. Rank degeneracy stemming from nonlinearities, colored noise, or finite data sets is dealt with either in terms of the sensitivity of the singular values or by comparing the relevance of including additional coordinates in a given realization. Two examples illustrate the feasibility of the derived global stability conditions and of the proposed approach for dealing with the complexity problem.
- Affine state-space neural networks
- Complexity management
- Global stability conditions
- Oblique projections
- Singular value decomposition
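The order-selection idea summarized above — counting the dominant singular values of a data-driven Hankel matrix — can be sketched as follows. This is a minimal illustration only: the oblique projection step is omitted, and the function names (`hankel_matrix`, `dominant_order`) and the relative threshold `tol` are hypothetical choices, not part of the original method.

```python
import numpy as np

def hankel_matrix(y, rows):
    """Build a Hankel matrix with `rows` block rows from 1-D output data y."""
    cols = len(y) - rows + 1
    return np.array([y[i:i + cols] for i in range(rows)])

def dominant_order(H, tol=0.05):
    """Estimate model order as the number of dominant singular values of H.

    A singular value counts as 'dominant' if it exceeds `tol` times the
    largest one; this simple relative threshold is an illustrative stand-in
    for the sensitivity-based criteria discussed in the abstract.
    """
    s = np.linalg.svd(H, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# Example: a damped sinusoid obeys a second-order linear recurrence,
# so its Hankel matrix has (numerical) rank 2.
t = np.arange(200)
y = np.exp(-0.05 * t) * np.sin(0.3 * t)
H = hankel_matrix(y, rows=20)
print(dominant_order(H))  # -> 2
```

In the noiseless case the singular values drop sharply at the true order; with colored noise or short data records the gap blurs, which is precisely the rank-degeneracy issue the abstract addresses.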