Neural Comput & Applic (2014) 25:743–749 DOI 10.1007/s00521-014-1547-7

ORIGINAL ARTICLE

Robustness of globally exponential stability of delayed neural networks in the presence of random disturbances

Song Zhu · Weiwei Luo · Jinyu Li · Yi Shen



Received: 11 October 2012 / Accepted: 4 January 2014 / Published online: 28 January 2014
© Springer-Verlag London 2014

Abstract This paper analyzes the robustness of the global exponential stability of time-varying delayed neural networks (NNs) subjected to random disturbances. Given a globally exponentially stable delayed NN, we quantify how much noise intensity the network can withstand while remaining globally exponentially stable, and we characterize the corresponding upper bounds on the noise intensity. These upper bounds are determined by a transcendental equation. A numerical example is provided to illustrate the theoretical results.

Keywords Delayed neural networks · Globally exponential stability · Random disturbances · Robustness

S. Zhu (✉) · W. Luo · J. Li (✉)
College of Sciences, China University of Mining and Technology, Xuzhou 221116, China
e-mail: [email protected]
J. Li, e-mail: [email protected]
W. Luo, e-mail: [email protected]

Y. Shen
College of Automation and Engineering, Key Laboratory of Ministry of Education for Image Processing and Intelligent Control, Huazhong University of Science and Technology, Wuhan 430074, China
e-mail: [email protected]

1 Introduction

Neural networks (NNs) are nonlinear dynamic systems with some resemblance to biological NNs in the brain. In recent decades, many NNs have been developed and applied extensively in many fields, such as associative memories, image processing, pattern recognition, signal processing, robotics, and control. For most successful applications of NNs, stability is usually a prerequisite, and the stability of NNs depends mainly on their parametric configuration.

In biological neural systems, signal transmission via synapses is usually a noisy process influenced by random fluctuations from the release of neurotransmitters and other disturbances [1]. Moreover, in the implementation of NNs, external random disturbances and time delays in signal transmission are common and hard to avoid. It is known that random disturbances and time delays in the neuron activations may result in oscillation or instability of NNs [2]. The stability analysis of delayed NNs (DNNs) and stochastic NNs (SNNs) with external random disturbances has been widely investigated in recent years (see, e.g., [3-26] and the references cited therein).

It is well known that noise and time delays can destabilize otherwise stable DNNs if they exceed certain limits, and that the loss of stability depends on the intensity of the noise. For a stable DNN, if the noise intensity is low, the network may remain stable. It is therefore of interest to determine how much random disturbance a stable DNN can withstand without losing its global exponential stability. Although various stability properties of DNNs have been extensively investigated using Lyapunov stability theory [3, 5, 6, 12-16, 21], linear matrix inequality methods [4, 9-11, 17], and matrix norm theory [7, 8], the robustness of the global stability of DNNs is rarely analyzed directly by estimating the upper bounds of the noise level from the coefficients of a global exponential stability condition.

Motivated by the above discussion, our aim in this paper is to quantify the admissible noise level for


stable DNNs. Differently from the traditional Lyapunov stability theory and the matrix norm theory, we investigate robust stability directly through the coefficients that a DNN must satisfy in a global exponential stability condition. We characterize the robustness of DNNs with additive noise by deriving upper bounds on the noise intensity under which global exponential stability is preserved; these upper bounds are determined by a transcendental equation. We prove that, for any globally exponentially stable DNN, if the additive noise is smaller than the upper bounds derived here, then the perturbed network is guaranteed to remain globally exponentially stable.

2 Problem formulation

Throughout this paper, unless otherwise specified, $\mathbb{R}^n$ and $\mathbb{R}^{n \times m}$ denote the $n$-dimensional Euclidean space and the set of $n \times m$ real matrices, respectively. Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions (i.e., the filtration contains all $P$-null sets and is right continuous), and let $\omega(t)$ be a scalar Brownian motion defined on this probability space. If $A$ is a matrix, its operator norm is denoted by $\|A\| = \sup\{|Ax| : |x| = 1\}$, where $|\cdot|$ is the Euclidean norm. Denote by $L^2_{\mathcal{F}_0}([-\bar{\tau}, 0]; \mathbb{R}^n)$ the family of all $\mathcal{F}_0$-measurable $C([-\bar{\tau}, 0]; \mathbb{R}^n)$-valued random variables $\psi = \{\psi(\theta) : -\bar{\tau} \le \theta \le 0\}$ such that $\sup_{-\bar{\tau} \le \theta \le 0} E|\psi(\theta)|^2 < \infty$, where $E\{\cdot\}$ stands for the mathematical expectation operator with respect to the given probability measure $P$.

Consider a DNN model

$$dz(t) = [-Az(t) + Bg(z(t)) + Dg(z(t - \tau(t))) + I]\,dt, \qquad z(t) = \psi(t - t_0) \in C([t_0 - \bar{\tau}, t_0]; \mathbb{R}^n), \quad t_0 - \bar{\tau} \le t \le t_0, \tag{1}$$

where $z(t) = (z_1(t), \ldots, z_n(t))^T \in \mathbb{R}^n$ is the state vector of the neurons, $t_0 \in \mathbb{R}_+$ and $\psi = \{\psi(s) : -\bar{\tau} \le s \le 0\} \in C([-\bar{\tau}, 0]; \mathbb{R}^n)$ are the initial data, $A = \mathrm{diag}\{a_1, \ldots, a_n\} \in \mathbb{R}^{n \times n}$ with $a_i > 0$ is the self-feedback connection weight matrix, $B = (b_{kl})_{n \times n} \in \mathbb{R}^{n \times n}$ and $D = (d_{kl})_{n \times n} \in \mathbb{R}^{n \times n}$ are connection weight matrices, $\tau(t) : [t_0, +\infty) \to [0, \bar{\tau}]$ is a time-varying delay satisfying $\tau'(t) \le \mu < 1$, $\bar{\tau}$ is the maximum of the delay, and $g(\cdot) \in \mathbb{R}^n$ is a continuous bounded vector-valued activation function satisfying the Lipschitz condition

$$|g(u) - g(v)| \le k|u - v|, \quad \forall u, v \in \mathbb{R}^n, \qquad g(0) = 0,$$

where $k$ is a known constant.

As usual, a vector $z^* = [z_1^*, z_2^*, \ldots, z_n^*]^T$ is said to be an equilibrium point of system (1) if it satisfies


$$Az^* = (B + D)g(z^*) + I.$$

For notational convenience, we shift an intended equilibrium point $z^*$ of system (1) to the origin by letting $x = z - z^*$ and $f(x) = g(x + z^*) - g(z^*)$. It is easy to transform system (1) into the following form:

$$dx(t) = [-Ax(t) + Bf(x(t)) + Df(x(t - \tau(t)))]\,dt, \qquad x(t) = \psi(t - t_0) \in C([t_0 - \bar{\tau}, t_0]; \mathbb{R}^n), \quad t_0 - \bar{\tau} \le t \le t_0. \tag{2}$$

In addition, the function $f$ in (2) satisfies the following Lipschitz condition with $f(0) = 0$.

Assumption 1 The activation function $f(\cdot)$ satisfies the Lipschitz condition

$$|f(u) - f(v)| \le k|u - v|, \quad \forall u, v \in \mathbb{R}^n, \qquad f(0) = 0, \tag{3}$$

where $k$ is a known constant. Based on Assumption 1, the origin is an equilibrium point of (2), and DNN (2) has a unique state $x(t; t_0, \psi)$ on $t \ge t_0$ for any initial data $(t_0, \psi)$. We now define the global exponential stability of the state of DNN (2).

Definition 1 The state of DNN (2) is globally exponentially stable if, for any $t_0, \psi$, there exist $\alpha > 0$ and $\beta > 0$ such that

$$|x(t; t_0, \psi)| \le \alpha \|\psi\| \exp(-\beta(t - t_0)), \quad \forall t \ge t_0, \tag{4}$$

where $x(t; t_0, \psi)$ is the state of the model in (2). Numerous criteria for ascertaining the global exponential stability of DNN (2) have been developed; see, e.g., [5, 12, 13, 16, 22] and the references therein.
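As a concrete illustration of model (2), the following sketch integrates a small two-neuron DNN with a forward Euler scheme and a discretized delay buffer. All numerical values here (the matrices $A$, $B$, $D$, the tanh activation, the delay profile, and the step size) are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Illustrative two-neuron instance of DNN (2); all values are assumptions.
A = np.diag([1.0, 1.0])
B = np.array([[0.2, -0.1], [0.1, 0.2]])
D = np.array([[0.1, 0.0], [0.0, 0.1]])
f = np.tanh                                       # Lipschitz with k = 1, f(0) = 0 (Assumption 1)
tau_bar = 0.1
tau = lambda t: tau_bar * (np.sin(t) + 1.0) / 2   # bounded time-varying delay in [0, tau_bar]

dt, T = 1e-3, 10.0
n_steps = int(T / dt)
hist = int(tau_bar / dt) + 1                      # history buffer long enough to cover tau_bar
x = np.zeros((n_steps + hist, 2))
x[:hist] = np.array([0.5, -0.3])                  # constant initial function psi

for i in range(hist - 1, n_steps + hist - 1):
    t = (i - hist + 1) * dt
    x_tau = x[i - int(round(tau(t) / dt))]        # delayed state x(t - tau(t))
    x[i + 1] = x[i] + dt * (-A @ x[i] + B @ f(x[i]) + D @ f(x_tau))

# For a globally exponentially stable network, log|x(t)| decays roughly
# linearly in t, consistent with the bound (4).
print(np.linalg.norm(x[hist]), np.linalg.norm(x[-1]))
```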

3 Main results

The question now is: given a globally exponentially stable DNN, how much noise intensity can it bear? We consider the noise-perturbed DNN described by the Itô stochastic differential equation (SDNN)

$$\begin{aligned} dy(t) &= [-Ay(t) + Bf(y(t)) + Df(y(t - \tau(t)))]\,dt + G(y(t), y(t - \tau(t)), t)\,d\omega(t), \quad t > t_0,\\ y(t) &= \psi(t - t_0) \in L^2_{\mathcal{F}_0}([t_0 - \bar{\tau}, t_0]; \mathbb{R}^n), \quad t_0 - \bar{\tau} \le t \le t_0, \end{aligned} \tag{5}$$

where the notations are the same as in Sect. 2, $f$ satisfies Assumption 1, and $G(y(t), y(t - \tau(t)), t)$ is a noise function satisfying the following assumption.

Assumption 2 The noise function $G(\cdot)$ satisfies

$$|G(y(t), y(t - \tau(t)), t)|^2 \le \sigma^2 |y(t)|^2 + \lambda^2 |y(t - \tau(t))|^2, \tag{6}$$

where $\sigma$ and $\lambda$ are constants.
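Under Assumption 2, the effect of the noise intensities can be observed by integrating SDNN (5) with the Euler-Maruyama method. The sketch below reuses the illustrative network above together with a simple linear noise term; the specific choice of $G$ and of all numerical values is an assumption for demonstration, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Same illustrative deterministic part as before (assumed values).
A = np.diag([1.0, 1.0])
B = np.array([[0.2, -0.1], [0.1, 0.2]])
D = np.array([[0.1, 0.0], [0.0, 0.1]])
f = np.tanh
tau_bar = 0.1
tau = lambda t: tau_bar * (np.sin(t) + 1.0) / 2

# Linear noise G = sigma*y(t) + lam*y(t - tau(t)); since |a+b|^2 <= 2|a|^2 + 2|b|^2,
# it satisfies a bound of the form (6) with intensities (sqrt(2)*sigma, sqrt(2)*lam).
sigma, lam = 0.1, 0.05

dt, T = 1e-3, 10.0
n_steps = int(T / dt)
hist = int(tau_bar / dt) + 1
y = np.zeros((n_steps + hist, 2))
y[:hist] = np.array([0.5, -0.3])

for i in range(hist - 1, n_steps + hist - 1):
    t = (i - hist + 1) * dt
    y_tau = y[i - int(round(tau(t) / dt))]
    drift = -A @ y[i] + B @ f(y[i]) + D @ f(y_tau)
    dW = np.sqrt(dt) * rng.standard_normal()       # scalar Brownian increment
    y[i + 1] = y[i] + dt * drift + (sigma * y[i] + lam * y_tau) * dW

print(np.linalg.norm(y[-1]))
```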


Under Assumptions 1 and 2, SDNN (5) has a unique state for any initial data $(t_0, \psi)$, and the origin is an equilibrium point. We will characterize how much noise SDNN (5) can bear while remaining globally exponentially stable. For SDNN (5), we use the following definition of global exponential stability.

Definition 2 [27] SDNN (5) is said to be almost surely globally exponentially stable if for any $t_0 \in \mathbb{R}_+$, $\psi \in L^2_{\mathcal{F}_0}([-\bar{\tau}, 0]; \mathbb{R}^n)$, there exist $\alpha > 0$ and $\beta > 0$ such that $|y(t; t_0, \psi)| \le \alpha\|\psi\|\exp(-\beta(t - t_0))$ holds almost surely for all $t \ge t_0$; i.e., the Lyapunov exponent $\limsup_{t\to\infty}(\ln|y(t; t_0, \psi)|/t) < 0$ almost surely, where $y(t; t_0, \psi)$ is the state of SDNN (5). SDNN (5) is said to be mean-square globally exponentially stable if for any $t_0 \in \mathbb{R}_+$, $\psi \in L^2_{\mathcal{F}_0}([-\bar{\tau}, 0]; \mathbb{R}^n)$, there exist $\alpha > 0$ and $\beta > 0$ such that $E|y(t; t_0, \psi)|^2 \le \alpha\|\psi\|^2\exp(-\beta(t - t_0))$ holds for all $t \ge t_0$; i.e., $\limsup_{t\to\infty}(\ln(E|y(t; t_0, \psi)|^2)/t) < 0$.

If Assumption 1 holds, we have the following lemma [28].

Lemma 1 Let Assumptions 1 and 2 hold. Then mean-square global exponential stability of SDNN (5) implies almost sure global exponential stability of SDNN (5).

Theorem 1 Let Assumptions 1 and 2 hold and let DNN (2) be globally exponentially stable. Then SDNN (5) is mean-square globally exponentially stable, and also almost surely globally exponentially stable, if $(\sigma, \lambda)$ lies in the interior of the closed curve determined by the transcendental equation

$$2c_2 \exp(3\bar{\tau} c_1) + 2\alpha^2 \exp(-2\beta\bar{\tau}) = 1, \tag{7}$$

where

$$c_1 = 18\bar{\tau}(\|A\|^2 + \|B\|^2 k^2 + 3\|D\|^2 k^2) + 4\sigma^2 + 6\lambda^2 + (108\bar{\tau}\|D\|^2 k^2 + 12\lambda^2)\big[6\bar{\tau}^2\big(\|A\|^2 + \|B\|^2 k^2 + \|D\|^2 k^2 (1-\mu)^{-1}\big) + 2\bar{\tau}\sigma^2 + 2\bar{\tau}\lambda^2(1-\mu)^{-1}\big],$$

$$c_2 = (108\bar{\tau}\|D\|^2 k^2 + 12\lambda^2)\bigg\{\bar{\tau} + \bar{\tau}(1-\mu)^{-1} + \frac{3\bar{\tau}^3\|D\|^2 k^2 + \bar{\tau}^2\lambda^2}{1-\mu} + \Big[3\bar{\tau}^2\Big(\|A\|^2 + \|B\|^2 k^2 + \frac{\|D\|^2 k^2}{1-\mu}\Big) + \bar{\tau}\sigma^2 + \frac{\bar{\tau}\lambda^2}{1-\mu}\Big]\frac{\alpha^2}{\beta}\bigg\} + \big[2\sigma^2 + 3\lambda^2 + 54\bar{\tau}\|D\|^2 k^2(1 + \exp(2\beta\bar{\tau}))\big]\frac{\alpha^2}{\beta}.$$
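The condition of Theorem 1 can be checked numerically: evaluate $\tilde{G}(\sigma, \lambda) = 2c_2\exp(3\bar{\tau}c_1) + 2\alpha^2\exp(-2\beta\bar{\tau})$ and locate the level set $\tilde{G} = 1$, for example by bisection along a ray. The sketch below does this for $\lambda = 0$. The function names are ours, and the formulas transcribe $c_1$ and $c_2$ as reconstructed above, so the output should be treated as indicative.

```python
import numpy as np

def G_tilde(sig, lam, nA, nB, nD, k, tau, mu, alpha, beta):
    """Left-hand side of (7): 2*c2*exp(3*tau*c1) + 2*alpha^2*exp(-2*beta*tau).
    nA, nB, nD are the operator norms ||A||, ||B||, ||D||; tau is tau_bar."""
    r = 1.0 / (1.0 - mu)
    c1 = (18*tau*(nA**2 + nB**2*k**2 + 3*nD**2*k**2) + 4*sig**2 + 6*lam**2
          + (108*tau*nD**2*k**2 + 12*lam**2)
            * (6*tau**2*(nA**2 + nB**2*k**2 + nD**2*k**2*r)
               + 2*tau*sig**2 + 2*tau*lam**2*r))
    c2 = ((108*tau*nD**2*k**2 + 12*lam**2)
          * (tau + tau*r + (3*tau**3*nD**2*k**2 + tau**2*lam**2)*r
             + (3*tau**2*(nA**2 + nB**2*k**2 + nD**2*k**2*r)
                + tau*sig**2 + tau*lam**2*r) * alpha**2/beta)
          + (2*sig**2 + 3*lam**2
             + 54*tau*nD**2*k**2*(1 + np.exp(2*beta*tau))) * alpha**2/beta)
    return 2*c2*np.exp(3*tau*c1) + 2*alpha**2*np.exp(-2*beta*tau)

def sigma_bound(lam, args, hi=10.0, tol=1e-10):
    """Bisection for the sigma at which G_tilde crosses 1 (lam held fixed)."""
    lo = 0.0
    if G_tilde(lo, lam, *args) >= 1.0:
        return 0.0          # (0, lam) already lies outside the region
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if G_tilde(mid, lam, *args) < 1.0 else (lo, mid)
    return lo

# Example-style values (assumed): ||A||=1, ||B||=||D||=2, k=1, tau_bar=5e-4,
# mu=0, alpha=beta=0.5.
args = (1.0, 2.0, 2.0, 1.0, 5e-4, 0.0, 0.5, 0.5)
print(sigma_bound(0.0, args))
```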

Proof Fix $t_0$ and $\psi = \{\psi(s) : -\bar{\tau} \le s \le 0\}$; for simplicity, write $x(t; t_0, \psi)$ and $y(t; t_0, \psi)$ as $x(t)$ and $y(t)$, respectively. From (2) and (5), we have

$$x(t) - y(t) = \int_{t_0}^{t} [-A(x(s) - y(s)) + B(f(x(s)) - f(y(s))) + D(f(x(s-\tau(s))) - f(y(s-\tau(s))))]\,ds - \int_{t_0}^{t} G(y(s), y(s-\tau(s)), s)\,d\omega(s).$$

When $t \le t_0 + 3\bar{\tau}$, from Assumptions 1 and 2 and the Hölder inequality [27], we have

$$\begin{aligned} E|x(t) - y(t)|^2 &\le 2E\bigg|\int_{t_0}^{t} [-A(x(s)-y(s)) + B(f(x(s))-f(y(s))) + D(f(x(s-\tau(s)))-f(y(s-\tau(s))))]\,ds\bigg|^2 \\ &\quad + 2E\bigg|\int_{t_0}^{t} G(y(s), y(s-\tau(s)), s)\,d\omega(s)\bigg|^2 \\ &\le [18\bar{\tau}(\|A\|^2 + \|B\|^2 k^2 + 3\|D\|^2 k^2) + 4\sigma^2 + 6\lambda^2] \int_{t_0}^{t} E|x(s)-y(s)|^2\,ds \\ &\quad + (54\bar{\tau}\|D\|^2 k^2 + 6\lambda^2) \int_{t_0}^{t} E|y(s) - y(s-\tau(s))|^2\,ds \\ &\quad + (4\sigma^2 + 6\lambda^2) \int_{t_0}^{t} E|x(s)|^2\,ds + 54\bar{\tau}\|D\|^2 k^2 \int_{t_0}^{t} E|x(s) - x(s-\tau(s))|^2\,ds \\ &\le [18\bar{\tau}(\|A\|^2 + \|B\|^2 k^2 + 3\|D\|^2 k^2) + 4\sigma^2 + 6\lambda^2] \int_{t_0}^{t} E|x(s)-y(s)|^2\,ds \\ &\quad + (54\bar{\tau}\|D\|^2 k^2 + 6\lambda^2) \int_{t_0}^{t} E|y(s) - y(s-\tau(s))|^2\,ds \\ &\quad + [4\sigma^2 + 6\lambda^2 + 108\bar{\tau}\|D\|^2 k^2(1 + \exp(2\beta\bar{\tau}))] \int_{t_0}^{t} E|x(s)|^2\,ds. \end{aligned} \tag{8}$$

In addition, when $t \ge t_0 + \bar{\tau}$, from (5) and Assumptions 1 and 2,


$$\int_{t_0+\bar{\tau}}^{t} E|y(s) - y(s-\tau(s))|^2\,ds \le \int_{t_0+\bar{\tau}}^{t} ds \int_{s-\bar{\tau}}^{s} \big\{[6\bar{\tau}(\|A\|^2 + \|B\|^2 k^2) + 2\sigma^2]\,E|y(r)|^2 + (6\bar{\tau}\|D\|^2 k^2 + 2\lambda^2)\,E|y(r-\tau(r))|^2\big\}\,dr. \tag{9}$$

By reversing the order of integration, we have

$$\int_{t_0+\bar{\tau}}^{t} ds \int_{s-\bar{\tau}}^{s} E|y(r)|^2\,dr = \int_{t_0}^{t} dr \int_{\max(t_0+\bar{\tau},\,r)}^{\min(r+\bar{\tau},\,t)} E|y(r)|^2\,ds \le \bar{\tau} \int_{t_0}^{t} E|y(r)|^2\,dr. \tag{10}$$

For the same reason, we have

$$\int_{t_0+\bar{\tau}}^{t} ds \int_{s-\bar{\tau}}^{s} E|y(r-\tau(r))|^2\,dr \le \frac{\bar{\tau}\big[\bar{\tau}\,\sup_{t_0-\bar{\tau}\le s\le t_0} E|y(s)|^2 + \int_{t_0}^{t} E|y(u)|^2\,du\big]}{1-\mu}. \tag{11}$$

So, when $t \ge t_0 + \bar{\tau}$, by substituting (10) and (11) into (9), we have

$$\begin{aligned} \int_{t_0+\bar{\tau}}^{t} E|y(s) - y(s-\tau(s))|^2\,ds &\le \frac{6\bar{\tau}^3\|D\|^2 k^2 + 2\bar{\tau}^2\lambda^2}{1-\mu}\,\sup_{t_0-\bar{\tau}\le s\le t_0} E|y(s)|^2 \\ &\quad + \big\{6\bar{\tau}^2\big[\|A\|^2 + \|B\|^2 k^2 + \|D\|^2 k^2(1-\mu)^{-1}\big] + 2\bar{\tau}\sigma^2 + 2\bar{\tau}\lambda^2(1-\mu)^{-1}\big\} \int_{t_0}^{t} E|y(s)|^2\,ds. \end{aligned} \tag{12}$$

Substituting (12) into (8), when $t \ge t_0 + \bar{\tau}$, we have

$$\begin{aligned} E|x(t)-y(t)|^2 &\le [18\bar{\tau}(\|A\|^2 + \|B\|^2 k^2 + 3\|D\|^2 k^2) + 4\sigma^2 + 6\lambda^2] \int_{t_0}^{t} E|x(s)-y(s)|^2\,ds \\ &\quad + (108\bar{\tau}\|D\|^2 k^2 + 12\lambda^2)\big[\bar{\tau} + \bar{\tau}(1-\mu)^{-1}\big] \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2 \\ &\quad + (108\bar{\tau}\|D\|^2 k^2 + 12\lambda^2)\,\frac{3\bar{\tau}^3\|D\|^2 k^2 + \bar{\tau}^2\lambda^2}{1-\mu}\,\sup_{t_0-\bar{\tau}\le s\le t_0} E|y(s)|^2 \\ &\quad + (54\bar{\tau}\|D\|^2 k^2 + 6\lambda^2)\big\{6\bar{\tau}^2\big[\|A\|^2 + \|B\|^2 k^2 + \|D\|^2 k^2(1-\mu)^{-1}\big] + 2\bar{\tau}\sigma^2 + 2\bar{\tau}\lambda^2(1-\mu)^{-1}\big\} \int_{t_0}^{t} E|y(s) - x(s) + x(s)|^2\,ds \\ &\quad + \big[2\sigma^2 + 3\lambda^2 + 54\bar{\tau}\|D\|^2 k^2(1 + \exp(2\beta\bar{\tau}))\big]\frac{\alpha^2}{\beta} \sup_{t_0-\bar{\tau}\le s\le t_0} E|y(s)|^2. \end{aligned} \tag{13}$$

From (13), noting that $E|y(s)-x(s)+x(s)|^2 \le 2E|x(s)-y(s)|^2 + 2E|x(s)|^2$, that $\int_{t_0}^{t} E|x(s)|^2\,ds \le \alpha^2\|\psi\|^2/(2\beta)$ by (4), and that $\|\psi\|^2 \le \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2$, we further have

$$E|x(t)-y(t)|^2 \le c_1 \int_{t_0}^{t} E|x(s)-y(s)|^2\,ds + c_2 \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2, \tag{14}$$

where $c_1$ and $c_2$ are as defined in Theorem 1.

When $t_0 + \bar{\tau} \le t \le t_0 + 3\bar{\tau}$, applying the Gronwall inequality [27], we obtain

$$E|x(t)-y(t)|^2 \le c_2 \exp(3\bar{\tau} c_1) \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2. \tag{15}$$

Therefore,

$$E|y(t)|^2 \le 2E|x(t)-y(t)|^2 + 2E|x(t)|^2 \le [2c_2 \exp(3\bar{\tau} c_1) + 2\alpha^2 \exp(-2\beta(t-t_0))] \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2. \tag{16}$$

Thus, when $t_0 + \bar{\tau} \le t \le t_0 + 3\bar{\tau}$,


$$E|y(t)|^2 \le [2c_2 \exp(3\bar{\tau} c_1) + 2\alpha^2 \exp(-2\beta\bar{\tau})] \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2 =: \tilde{G}(\sigma, \lambda) \sup_{t_0-\bar{\tau}\le s\le t_0+\bar{\tau}} E|y(s)|^2, \tag{17}$$

where $\tilde{G}(\sigma, \lambda) = 2c_2 \exp(3\bar{\tau} c_1) + 2\alpha^2 \exp(-2\beta\bar{\tau})$. From (7), we can conclude that $\tilde{G}(\sigma, \lambda)$ is symmetric with respect to the four quadrants. Since $\tilde{G}(0, 0) < 1$, $\tilde{G}(\infty, 0) > 1$, and $\tilde{G}(\sigma, 0)$ is strictly increasing in $\sigma$, there exists a unique $\tilde{\sigma}$ such that $\tilde{G}(\tilde{\sigma}, 0) = 1$; the same holds for $\lambda$. Hence (7) defines a closed curve in the $(\sigma, \lambda)$ plane, and $(0, 0)$ is an interior point of this curve. From (7), when $(\sigma, \lambda)$ lies within the region enclosed by the curve, $\tilde{G}(\sigma, \lambda) < 1$.

Let $\gamma = -\ln \tilde{G}(\sigma, \lambda)/\bar{\tau}$; from (7), $\gamma > 0$, and from (17) we can conclude

$$\sup_{t_0+\bar{\tau}\le t\le t_0+3\bar{\tau}} E|y(t; t_0, \psi)|^2 \le \exp(-\gamma\bar{\tau}) \sup_{t_0-\bar{\tau}\le t\le t_0+\bar{\tau}} E|y(t; t_0, \psi)|^2. \tag{18}$$

Then, for any positive integer $m = 1, 2, \ldots$, we denote

$$y(t_0 + (2m-1)\bar{\tau}; t_0, \psi) := \{y(t_0 + (2m-1)\bar{\tau} + s; t_0, \psi) : -\bar{\tau} \le s \le 0\} \in C([-\bar{\tau}, 0]; \mathbb{R}^n).$$

From the existence and uniqueness of the state of SDNN (5), we have $y(t; t_0, \psi) = y(t; t_0 + (2m-1)\bar{\tau}, y(t_0 + (2m-1)\bar{\tau}; t_0, \psi))$. When $t \ge t_0 + (2m-1)\bar{\tau}$, from (18),

$$\begin{aligned} \sup_{t_0+(2m-1)\bar{\tau}\le t\le t_0+(2m+1)\bar{\tau}} E|y(t; t_0, \psi)|^2 &\le \exp(-\gamma\bar{\tau}) \sup_{t_0+(2m-3)\bar{\tau}\le t\le t_0+(2m-1)\bar{\tau}} E|y(t; t_0, \psi)|^2 \\ &\le \cdots \le \exp(-\gamma m\bar{\tau}) \sup_{t_0-\bar{\tau}\le t\le t_0+\bar{\tau}} E|y(t; t_0, \psi)|^2 = c_0 \exp(-\gamma m\bar{\tau}), \end{aligned}$$

where $c_0 = \sup_{t_0-\bar{\tau}\le t\le t_0+\bar{\tau}} E|y(t; t_0, \psi)|^2$. For any $t > t_0 + \bar{\tau}$, there is a positive integer $m$ such that $t_0 + (2m-1)\bar{\tau} \le t \le t_0 + (2m+1)\bar{\tau}$, and therefore

$$E|y(t; t_0, \psi)|^2 \le c_0 \exp\Big(\frac{\gamma\bar{\tau}}{2}\Big) \exp\Big(-\frac{\gamma}{2}(t - t_0)\Big). \tag{19}$$

Condition (19) also holds when $t_0 - \bar{\tau} \le t \le t_0 + \bar{\tau}$. So SDNN (5) is mean-square globally exponentially stable, and according to Lemma 1 it is also almost surely globally exponentially stable. This completes the proof.

Remark 1 Theorem 1 shows that when a DNN is globally exponentially stable, the corresponding noise-perturbed SDNN remains mean-square globally exponentially stable and almost surely globally exponentially stable, provided that the noise is below the given upper bounds.

Remark 2 From the proof of Theorem 1, the upper bounds on the noise intensity are derived via subtle inequalities and can be estimated by solving a transcendental equation. Since transcendental equations can be solved with software such as MATLAB, the derived conditions can be verified easily.

Remark 3 The proof of Theorem 1 needs only the coefficients of a stability condition, from which a robust stability bound for the perturbed system is obtained. The result may be conservative, however, because the Gronwall inequality and norm inequalities enlarge the right-hand sides of the estimates. Reducing this conservatism is left for future work.
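The mean-square property in Definition 2 can also be probed empirically: simulate many sample paths of an SDNN, average $|y(t)|^2$ across paths, and check that $\log E|y(t)|^2$ decays roughly linearly in $t$, which corresponds to a negative mean-square Lyapunov exponent. The sketch below does this for the illustrative network used earlier; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte-Carlo estimate of E|y(t)|^2 for the illustrative SDNN (assumed values);
# a roughly linear decay of log E|y(t)|^2 is the signature of mean-square
# global exponential stability (Definition 2).
A = np.diag([1.0, 1.0]); B = np.array([[0.2, -0.1], [0.1, 0.2]])
D = np.array([[0.1, 0.0], [0.0, 0.1]]); f = np.tanh
tau_bar, sigma, lam = 0.1, 0.1, 0.05
dt, T, n_paths = 1e-3, 8.0, 200
n, hist = int(T / dt), int(tau_bar / dt) + 1

y = np.zeros((n_paths, n + hist, 2))
y[:, :hist] = np.array([0.5, -0.3])
for i in range(hist - 1, n + hist - 1):
    t = (i - hist + 1) * dt
    lag = int(round(tau_bar * (np.sin(t) + 1.0) / 2 / dt))
    y_tau = y[:, i - lag]
    drift = -y[:, i] @ A.T + f(y[:, i]) @ B.T + f(y_tau) @ D.T
    dW = np.sqrt(dt) * rng.standard_normal((n_paths, 1))
    y[:, i + 1] = y[:, i] + dt * drift + (sigma * y[:, i] + lam * y_tau) * dW

ms = (y[:, hist:] ** 2).sum(axis=2).mean(axis=0)      # estimate of E|y(t)|^2
t_grid = np.arange(n) * dt
slope = np.polyfit(t_grid[n//2:], np.log(ms[n//2:] + 1e-300), 1)[0]
print("estimated mean-square decay rate:", -slope)
```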

4 Illustrative example

Example 1 Consider the two-state DNN

$$\frac{dx(t)}{dt} = -Ax(t) + Bf(x(t)) + Df(x(t - \tau(t))). \tag{20}$$

The parameters are as follows:

$$A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad B = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} -1 & -1 \\ -1 & -1 \end{pmatrix},$$

$f(x_j) = (|x_j + 1| - |x_j - 1|)/2$, $\tau(t) = 0.00025(\sin t + 1)$, and $x(0) = [0.6, -0.2]^T$. Hence, according to Theorem 2 in [16], DNN (20) is globally exponentially stable with $\alpha = 0.5$, $\beta = 0.5$; its transient state is shown in Fig. 1.

Fig. 1 The transient state of DNN (20) with $\tau(t) = 0.0005$
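A minimal simulation of DNN (20) with the example's parameters reproduces the qualitative behavior reported in Fig. 1, with both state components decaying to the origin. The Euler scheme, the step size, and the sign reconstruction of $B$ and $D$ used below are our assumptions.

```python
import numpy as np

# DNN (20) with the example's parameters; Euler scheme and step size assumed.
A = np.eye(2)
B = np.array([[1.0, 1.0], [1.0, 1.0]])        # entry signs as reconstructed above
D = np.array([[-1.0, -1.0], [-1.0, -1.0]])
f = lambda x: (np.abs(x + 1) - np.abs(x - 1)) / 2   # Lipschitz constant k = 1
tau = lambda t: 0.00025 * (np.sin(t) + 1.0)         # tau_bar = 0.0005

dt, T = 1e-4, 5.0
n, hist = int(T / dt), int(0.0005 / dt) + 1
x = np.zeros((n + hist, 2))
x[:hist] = np.array([0.6, -0.2])                    # x(0) = [0.6, -0.2]^T
for i in range(hist - 1, n + hist - 1):
    t = (i - hist + 1) * dt
    x_tau = x[i - int(round(tau(t) / dt))]
    x[i + 1] = x[i] + dt * (-A @ x[i] + B @ f(x[i]) + D @ f(x_tau))
print(x[-1])   # both components decay toward the origin, as in Fig. 1
```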

In the presence of noise, the DNN becomes an SDNN:

$$dy(t) = [-Ay(t) + Bf(y(t)) + Df(y(t - \tau(t)))]\,dt + G(y(t), y(t - \tau(t)))\,d\omega(t), \tag{21}$$



where $\tau(t)$ is the time-varying delay, and we select $G(y(t), y(t - \tau(t)), t) = 0.7\sigma y(t) + 0.5\lambda y(t - \tau(t))$. According to Theorem 1, with $\mu = 0$, Eq. (7) becomes

$$\big[(0.216 + 12\lambda^2)(2.00975 \times 10^{-3} + 1.0005 \times 10^{-3}\lambda^2 + 5 \times 10^{-4}\sigma^2) + (2\sigma^2 + 3\lambda^2 + 0.217)\big] \times \exp\big\{[0.230 + 6\sigma^2 + 9\lambda^2 + (0.324 + 18\lambda^2)(1.35 \times 10^{-5} + 10^{-3}\sigma^2 + 10^{-3}\lambda^2)] \times 10^{-3}\big\} + 0.49975 = 1, \tag{22}$$

whose solution curve is shown in Fig. 2. Figure 3 shows the corresponding circuit implementation of DNN (20). Figure 4 depicts the transient states of SDNN (21) with $\tau(t) = 0.0005$, $\sigma = 0.2$, $\lambda = 0.1$: SDNN (21) is also globally exponentially stable, since the parameters $(\sigma, \lambda)$ lie inside the curve defined by the transcendental equation (22).
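As a quick numerical check of (22) as reconstructed here, the left-hand side can be evaluated directly: at $(\sigma, \lambda) = (0.2, 0.1)$ it is about 0.83 < 1, so the point lies inside the stability region, consistent with Fig. 4. The helper name lhs_22 is ours.

```python
import numpy as np

def lhs_22(sig, lam):
    """Left-hand side of the transcendental equation (22) (example values, mu = 0)."""
    s2, l2 = sig**2, lam**2
    bracket = ((0.216 + 12*l2) * (2.00975e-3 + 1.0005e-3*l2 + 5e-4*s2)
               + (2*s2 + 3*l2 + 0.217))
    expo = (0.230 + 6*s2 + 9*l2
            + (0.324 + 18*l2) * (1.35e-5 + 1e-3*s2 + 1e-3*l2)) * 1e-3
    return bracket * np.exp(expo) + 0.49975

print(lhs_22(0.2, 0.1))   # ~0.83 < 1: inside the region bounded by curve (22)
print(lhs_22(0.0, 0.0))   # ~0.72 < 1: (0, 0) is an interior point
```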


Fig. 3 The corresponding circuit implementation of DNN (20)

Fig. 4 The transient state of SDNN (21) with $\tau(t) = 0.0005$, $\sigma = 0.2$, $\lambda = 0.1$

Fig. 2 The stability region for $(\sigma, \lambda)$ in SDNN (21)


5 Conclusions

In this paper, the robustness of DNNs with additive noise is analyzed. Upper bounds on the noise intensity are derived under which the networks retain global exponential stability. The results obtained here provide a theoretical basis for the design and application of DNNs in the presence of random disturbances. Further investigations may aim at improving the upper bounds to allow larger stability margins for withstanding random disturbances, and at reducing the conservatism of the present results.

Acknowledgments The authors would like to thank the associate editor and the referees for their detailed comments and valuable suggestions, which considerably improved the presentation of this paper. This work was supported by the Key Program of the National Natural Science Foundation of China under Grant No. 61134012, the National Natural Science Foundation of China under Grant Nos. 61203055 and 11271146, and the Fundamental Research Funds for the Central Universities under Grant No. 2013XK03.

References

1. Haykin S (1994) Neural networks. Prentice Hall, New York
2. Pham J, Pakdaman K, Vibert J (1998) Noise-induced coherent oscillations in randomly connected neural networks. Phys Rev E 58(3):3610-3622
3. Arik S (2002) An analysis of global asymptotic stability of delayed cellular neural networks. IEEE Trans Neural Netw 13(5):1239-1242

4. Cao J, Yuan K, Li H (2006) Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays. IEEE Trans Neural Netw 17(6):1646-1651
5. Chen T (2001) Global exponential stability of delayed Hopfield neural networks. Neural Netw 14(8):977-980
6. Chua LO, Yang L (1988) Cellular neural networks: theory. IEEE Trans Circuits Syst 35(10):1257-1272
7. Faydasicok O, Arik S (2012) Robust stability analysis of a class of neural networks with discrete time delays. Neural Netw 29-30:52-59
8. Faydasicok O, Arik S (2013) A new robust stability criterion for dynamical neural networks with multiple time delays. Neurocomputing 99:290-297
9. Hu L, Gao H, Zheng W (2008) Novel stability of cellular neural networks with interval time-varying delay. Neural Netw 21(10):1458-1463
10. Huang H, Ho DWC, Lam J (2005) Stochastic stability analysis of fuzzy Hopfield neural networks with time-varying delays. IEEE Trans Circuits Syst II Exp Briefs 52(5):251-255
11. Liao X, Chen G, Sanchez EN (2002) Delay-dependent exponential stability of delayed neural networks: an LMI approach. Neural Netw 15(7):855-866
12. Meng X, Tian M, Hu S (2011) Stability analysis of stochastic recurrent neural networks with unbounded time-varying delays. Neurocomputing 74(6):949-953
13. Liao X, Wang J (2003) Algebraic criteria for global exponential stability of cellular neural networks with multiple time delays. IEEE Trans Circuits Syst I 50(2):268-275
14. Shen Y, Wang J (2007) Noise-induced stabilization of the recurrent neural networks with mixed time-varying delays and Markovian-switching parameters. IEEE Trans Neural Netw 18(6):1857-1862
15. Shen Y, Wang J (2008) An improved algebraic criterion for global exponential stability of recurrent neural networks with time-varying delays. IEEE Trans Neural Netw 19(3):528-531
16. Shen Y, Wang J (2012) Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. IEEE Trans Neural Netw 23(1):87-96

17. Wang Z, Liu Y, Li M, Liu X (2006) Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays. IEEE Trans Neural Netw 17(3):814-820
18. Zhu S, Shen Y, Liu L (2010) Exponential stability of uncertain stochastic neural networks with Markovian switching. Neural Process Lett 32:293-309
19. Zhu S, Shen Y (2013) Robustness analysis for connection weight matrix of global exponential stability recurrent neural networks. Neurocomputing 101:370-374
20. Zhu S, Shen Y, Chen G (2010) Exponential passivity of neural networks with time-varying delay and uncertainty. Phys Lett A 375(2):136-142
21. Zeng Z, Wang J (2006) Complete stability of cellular neural networks with time-varying delays. IEEE Trans Circuits Syst I Reg Pap 53(4):944-955
22. Zeng Z, Huang T, Zheng W (2010) Multistability of recurrent neural networks with time-varying delays and the piecewise linear activation function. IEEE Trans Neural Netw 21(8):1371-1377
23. Zhang H, Wang Z, Liu D (2009) Global asymptotic stability and robust stability of a class of Cohen-Grossberg neural networks with mixed delays. IEEE Trans Circuits Syst I 56(3):616-629
24. Chen H, Zhang Y, Hu P (2010) Novel delay-dependent robust stability criteria for neutral stochastic delayed neural networks. Neurocomputing 73(13-15):2554-2561
25. Zhu S, Shen Y (2012) Robustness analysis of global exponential stability of neural networks with Markovian switching in the presence of time-varying delays or noises. Neural Comput Appl 23(6):1563-1671
26. Zhu S, Shen Y (2013) Two algebraic criteria for input-to-state stability of recurrent neural networks with time-varying delays. Neural Comput Appl 22:1163-1169
27. Mao X (2007) Stochastic differential equations and applications, 2nd edn. Horwood, Chichester
28. Mao X (2007) Stability and stabilization of stochastic differential delay equations. IET Control Theory Appl 1(6):1551-1566
