Hindawi, Mathematical Problems in Engineering, Volume 2018, Article ID 6489517, 15 pages. https://doi.org/10.1155/2018/6489517

Research Article

Exponential Lagrange Stability for Markovian Jump Uncertain Neural Networks with Leakage Delay and Mixed Time-Varying Delays via Impulsive Control

J. Yogambigai,1 M. Syed Ali,1 Quanxin Zhu,2,3 and Jingwei Cai2,4

1 Department of Mathematics, Thiruvalluvar University, Vellore, Tamil Nadu 632115, India
2 School of Mathematical Sciences and Institute of Finance and Statistics, Nanjing Normal University, Nanjing 210023, China
3 School of Mathematical Sciences, Qufu Normal University, Qufu 273165, China
4 Basis Department, Jiangsu Polytechnic College of Agriculture and Forestry, Zhenjiang, Jiangsu 212400, China

Correspondence should be addressed to Quanxin Zhu; [email protected]

Received 15 April 2018; Accepted 14 May 2018; Published 20 June 2018

Academic Editor: Qingling Zhang

Copyright © 2018 J. Yogambigai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The problem of exponential Lagrange stability analysis of Markovian jump neural networks with leakage delay and mixed time-varying delays is studied in this paper. By utilizing the Lyapunov functional method and employing the free-weighting-matrix approach and inequality techniques in matrix form, we establish several novel stability criteria such that, for all admissible parameter uncertainties, the suggested neural network is exponentially stable in the Lagrange sense. The derived criteria are expressed in terms of linear matrix inequalities (LMIs). A numerical example is provided to demonstrate the validity of the proposed results.

1. Introduction

It is well known that stability is an important research topic in the mathematical field [1–13]. As a special class of mathematical models whose structure resembles the synaptic connections of the brain, neural networks exhibit multiple dynamic behaviors [14]. For these reasons, neural networks have received considerable attention owing to their intensive applications in optimization, associative memory, pattern classification, and other areas [15–18]. Since axonal signal transmission delays often occur in neural networks and may cause undesirable dynamic behaviors such as oscillation and instability, it is important to study the stability of delayed neural networks. Many practical systems, such as image processing, communication, fault diagnosis, fixed-point computations, parallel computations, and industrial automation, can be modeled as neural networks (NNs) with time delays [19, 20]. Time-varying delays reflect the fact that the propagation speed of signals is finite and uncertain [21–28]. Note that, while signal propagation is sometimes instantaneous and can be modeled with discrete delays, it may also be distributed over a certain time period, so distributed delays are incorporated into the model [29]. Thus, the stability analysis of neural networks with time-varying delays has been investigated via the LMI technique by many authors [30–34]. Very recently, the leakage delay, which is the time delay in the leakage term of a system and a considerable factor degrading its dynamics, has been employed in studying the stability of neural networks [35–37]. In fact, the leakage term has a great impact on the dynamical behavior of neural networks. The authors in [38] investigated the stability problem of neural networks with leakage delays and impulses via a piecewise delay method. When modeling real nervous systems, random disturbances and parameter uncertainties are inevitable and must be taken into account. Indeed, the connection weights of the neurons depend on certain resistance and capacitance values that involve uncertainties [34]. Therefore, it is of great importance to consider the effect of parameter uncertainties on the stability of neural networks [39–41]. On the other hand, Markovian jump neural networks can be regarded as a special class of hybrid systems, which can model dynamic systems whose structures are subject to random abrupt parameter changes resulting from component or interconnection failures, sudden environmental changes, changing subsystem interconnections, and so forth [42–45]. A neural network has finitely many modes, which may jump from one to another at various times; thus, neural networks with such a jumping character switch from one mode to another according to a Markovian chain [44, 46]. Applications of this kind of neural network can be found in modeling production systems, economic systems, and other practical systems. In practical applications, however, the state of an electronic network is often subject to instantaneous changes at certain instants; this is the impulsive phenomenon. Impulsive neural network models belong to a category of dynamical systems that are neither purely continuous nor purely discrete. Examples of impulsive phenomena can also be found in automatic control, artificial intelligence, robotics, and other fields. As is well known, neural networks can be stabilized or destabilized by impulsive effects [44, 47]; the presence of impulses means that the state trajectory does not preserve its basic properties. One of the most desirable properties of a neural network is Lyapunov global stability. From a dynamical-systems point of view, globally stable networks in the Lyapunov sense are monostable systems, which have a unique equilibrium attracting all trajectories asymptotically. However, such an equilibrium sometimes does not exist in real physical systems, and when a neural network is used as an associative memory or for pattern recognition, the existence of many equilibria is necessary. Moreover, monostable neural networks have been found to be computationally restrictive, and multistable dynamics are essential for the neural computations desired.
For example, in a winner-take-all network only the neuron receiving the strongest external input should remain active [48, 49]; this is possible only if there are multiple equilibria, some of which are unstable. As is well known, Lagrange stability is one of the most important properties in the multistability analysis of a total system, since it does not require information about the equilibria [50–54]. The boundedness of solutions together with the existence of globally attractive sets leads to a total-system concept of stability: (asymptotic) Lagrange stability [55, 56]. Hence we concentrate on the Lagrange stability of neural networks with mixed time-varying delays and leakage delay in the presence of uncertainties.

Motivated by the above discussion, it is necessary to study stability in the Lagrange sense for impulsive neural networks with parameter uncertainties. To the best of our knowledge, there are no published papers on the Lagrange stability analysis of uncertain Markovian jump neural networks with mixed time-varying delays and leakage delay. To fill this gap, we perform an exponential Lagrange stability analysis of the considered neural networks with mixed time-varying delays and parameter uncertainties. By using novel Lyapunov-Krasovskii functionals together with zero equalities, we establish the Lagrange stability of the neural networks with parameter uncertainties. In particular, the obtained sufficient conditions are expressed in terms of LMIs that can be solved numerically. Finally, a numerical example is given to demonstrate the effectiveness of the obtained results.

Notation. Rⁿ denotes the n-dimensional Euclidean space, and R^{m×n} is the set of all m × n real matrices. The superscript T denotes matrix transposition; for symmetric matrices A and B, A ≥ B (respectively, A > B) means that A − B is positive semidefinite (respectively, positive definite); ‖·‖ denotes the Euclidean norm in Rⁿ. For a square matrix Q, λ_max(Q) (respectively, λ_min(Q)) denotes its largest (respectively, smallest) eigenvalue. Moreover, let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e., the filtration contains all P-null sets and is right-continuous). The asterisk ∗ in a symmetric matrix denotes a term induced by symmetry; diag(·) stands for a diagonal matrix; I and 0 denote the identity and zero matrices of appropriate dimensions, respectively.

2. Problem Formulation and Preliminaries

Given a complete probability space (Ω, F, {F_t}_{t≥0}, P) with a natural filtration {F_t}_{t≥0} satisfying the usual conditions, where Ω is the sample space, F is the σ-algebra of events, and P is the probability measure defined on F, let {r_t, t ≥ 0} be a right-continuous Markovian chain on (Ω, F, {F_t}_{t≥0}, P) taking values in the finite space S = {1, 2, ..., m} with generator Φ = (φ_{kj})_{m×m} (k, j ∈ S) given by

P{r_{t+Δ} = j | r_t = k} = φ_{kj}Δ + o(Δ) if k ≠ j,  and  1 + φ_{kk}Δ + o(Δ) if k = j.   (1)

Here Δ > 0, and φ_{kj} ≥ 0 is the transition rate from k to j if j ≠ k, while φ_{kk} = −Σ_{j≠k} φ_{kj}.

We consider the following Markovian jump uncertain neural network with time-varying discrete and distributed delays as well as leakage delay and impulsive perturbations:

η̇(t) = −B̄(r_t)η(t − ρ) + D̄₀(r_t)f(η(t)) + D̄₁(r_t)f(η(t − τ(t))) + D̄₂(r_t)∫_{t−d(t)}^{t} f(η(s)) ds + J,  t ≠ t_h,

Δη(t_h) = −H_h(r_t){η(t_h⁻) − B(r_t)∫_{t_h−ρ}^{t_h} η(s) ds},  t = t_h, h ∈ Z⁺,   (2)

where η(t) ∈ Rⁿ is the neuron state vector; f(·) ∈ Rⁿ is the nonlinear activation function; J = (J₁, J₂, ..., J_n)ᵀ ∈ Rⁿ is an external input vector; B(r_t) = diag(b₁(r_t), b₂(r_t), ..., b_n(r_t)) is the self-feedback parameter matrix of the neurons, with b_i(r_t) > 0; and D₀(r_t), D₁(r_t), D₂(r_t) ∈ R^{n×n} are connection weight matrices. H_h(r_t) is the impulse gain matrix at the impulse instant t_h. The discrete set {t_h} satisfies 0 = t₀ < t₁ < ⋯ < t_h < ⋯ with lim_{h→∞} t_h = ∞; η(t_h⁻) and η(t_h⁺) denote the left-hand and right-hand limits at t_h, respectively. Here ρ is the leakage delay, τ(t) is the discrete time-varying delay, and d(t) is the distributed time-varying delay. The initial condition associated with model (2) is η(s) = ψ(s), s ∈ [−β, 0], where ψ is differentiable on [−β, 0] and β = max{ρ, τ, d}. The time-varying delays τ(t) and d(t) satisfy

0 ≤ τ(t) ≤ τ,  0 ≤ d(t) ≤ d,   (3)

where τ and d are constants.
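The generator Φ in (1) specifies exponential holding times and jump probabilities for the mode process r_t. As a quick illustration of this mechanism, here is a minimal simulation sketch; the two-mode generator below is arbitrary and is not taken from the paper.

```python
import random

def simulate_ctmc(phi, start, t_end, seed=0):
    """Simulate a right-continuous Markov chain r_t with generator phi.

    phi[k][j] (k != j) is the transition rate from mode k to mode j;
    the holding time in mode k is exponential with rate -phi[k][k].
    Returns a list of (jump_time, new_mode) pairs starting with (0.0, start).
    """
    rng = random.Random(seed)
    t, mode, path = 0.0, start, [(0.0, start)]
    while True:
        rate = -phi[mode][mode]
        if rate <= 0:          # absorbing mode
            break
        t += rng.expovariate(rate)
        if t >= t_end:
            break
        # choose the next mode j != mode with probability phi[mode][j] / rate
        u, acc = rng.random() * rate, 0.0
        for j, q in enumerate(phi[mode]):
            if j == mode:
                continue
            acc += q
            if u <= acc:
                mode = j
                break
        path.append((t, mode))
    return path

# Illustrative generator for S = {0, 1}: rows sum to zero,
# off-diagonal entries (the transition rates) are nonnegative.
PHI = [[-0.6, 0.6],
       [ 0.9, -0.9]]
path = simulate_ctmc(PHI, start=0, t_end=10.0)
```

The sampled `path` is exactly the piecewise-constant switching signal under which system (2) evolves between jumps.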

Here B̄(r_t) = B(r_t) + ΔB(r_t), D̄₀(r_t) = D₀(r_t) + ΔD₀(r_t), D̄₁(r_t) = D₁(r_t) + ΔD₁(r_t), and D̄₂(r_t) = D₂(r_t) + ΔD₂(r_t), where ΔB(r_t), ΔD₀(r_t), ΔD₁(r_t), and ΔD₂(r_t) are real-valued unknown matrices representing time-varying parameter uncertainties, assumed to be of the form

[ΔB(r_t) ΔD₀(r_t) ΔD₁(r_t) ΔD₂(r_t)] = M(r_t)F(r_t)[N_b(r_t) N₀(r_t) N₁(r_t) N₂(r_t)],   (4)

where M(r_t), N_b(r_t), N₀(r_t), N₁(r_t), and N₂(r_t) are known real constant matrices for all r_t ∈ S, and F(r_t) is the uncertain time-varying matrix satisfying Fᵀ(r_t)F(r_t) = I for all r_t ∈ S.

Assumption (A). There is a positive diagonal matrix L = diag{l₁, l₂, ..., l_n} such that f(0) = 0 and

|f(β₁) − f(β₂)| ≤ L|β₁ − β₂|   (5)

for β₁ ≠ β₂.

For convenience, each possible value of r_t is denoted by k, k ∈ S, in the sequel; we then write B_k = B(r_t), D₀ₖ = D₀(r_t), D₁ₖ = D₁(r_t), D₂ₖ = D₂(r_t), and H_hk = H_h(r_t).

Definition 4 (see [44]). The function V : [t₀, ∞) × Rⁿ × S → R⁺ belongs to class ψ₀ if (a) V is continuous on each of the sets [t_{h−1}, t_h) × Rⁿ × S and V(t, 0, i) ≡ 0 for all t ≥ t₀ and i ∈ S; (b) V(t, x, i) is locally Lipschitzian in x ∈ Rⁿ for each i ∈ S; (c) for each h = 1, 2, ... and i, j ∈ S, there exist the finite limits

lim_{(t,z,j)→(t_h⁻,x,i)} V(t, z, j) = V(t_h⁻, x, i),  lim_{(t,z,j)→(t_h⁺,x,j)} V(t, z, j) = V(t_h⁺, x, j),   (6)

with V(t_h⁺, x, j) = V(t_h, x, j).

Lemma 5 (see [58]). Let η, μ ∈ Rⁿ and let R be a positive definite matrix; then the following inequality holds:

ηᵀμ + μᵀη ≤ ηᵀR⁻¹η + μᵀRμ.   (7)

Lemma 6 (see [59]). For any constant matrix R > 0, any scalars α and β with α < β, and a vector function η(t) : [α, β] → Rⁿ such that the integrals concerned are well defined, the following inequality holds:

[∫_α^β η(s) ds]ᵀ R [∫_α^β η(s) ds] ≤ (β − α) ∫_α^β ηᵀ(s) R η(s) ds.   (8)

Lemma 7 (see [59]). The LMI

R = [R₁₁ R₁₂; ∗ R₂₂] < 0,   (9)

with R₁₁ = R₁₁ᵀ and R₂₂ = R₂₂ᵀ, is equivalent to either of the following conditions:

(i) R₂₂ < 0, R₁₁ − R₁₂R₂₂⁻¹R₁₂ᵀ < 0;
(ii) R₁₁ < 0, R₂₂ − R₁₂ᵀR₁₁⁻¹R₁₂ < 0.
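Lemma 6 is the matrix Jensen inequality used later to bound the integral terms arising from the functionals V₂ and V₃. A discretized numeric sanity check of (8), with an arbitrary test function and an arbitrary positive definite weight matrix:

```python
import numpy as np

# Discretized check of (int eta)^T R (int eta) <= (b - a) * int eta^T R eta
a, b, n = 0.0, 2.0, 20000
s = np.linspace(a, b, n)
ds = (b - a) / (n - 1)

# arbitrary vector-valued test function eta(s) in R^2
eta = np.stack([np.sin(3 * s), np.exp(-s)], axis=1)
R = np.array([[2.0, 0.5], [0.5, 1.0]])        # positive definite weight

integral_eta = eta.sum(axis=0) * ds           # approximates int_a^b eta(s) ds
lhs = integral_eta @ R @ integral_eta
# (b - a) * int_a^b eta(s)^T R eta(s) ds, same rectangle-rule discretization
rhs = (b - a) * ds * np.einsum("ij,jk,ik->", eta, R, eta)
```

For any smooth nonconstant test function the gap between `lhs` and `rhs` is strict, which is why the lemma gives useful (nontrivial) bounds in the proof.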

Definition 1 (see [57]). The neural network (2) is said to be uniformly stable in the Lagrange sense if, for any δ > 0, there is a positive constant H = H(δ) > 0 such that ‖η(t, ψ)‖ < H for all t ≥ 0 whenever ψ ∈ R_δ = {ψ ∈ C([−β, 0], Rⁿ) : ‖ψ‖ < δ}.

Definition 2 (see [57]). If there exist a radially unbounded and positive definite function V(·), a nonnegative continuous function H(·), and two positive constants γ and ε such that, for any solution η(t) of neural network (2), V(η(t)) > γ implies V(η(t)) − γ ≤ H(ψ)e^{−εt} for any t ≥ 0 and ψ ∈ R_δ, then the neural network (2) is said to be globally exponentially attractive (GEA) with respect to V(η(t)), and the compact set Ω = {η(t) ∈ Rⁿ : V(η(t)) ≤ γ} is said to be a GEA set of (2).

Definition 3 (see [57]). The neural network (2) is globally exponentially stable (GES) in the Lagrange sense if it is both uniformly stable in the Lagrange sense and GEA. When there is a need to emphasize the Lyapunov-like function, the neural network is called GES in the Lagrange sense with respect to V.

Lemma 8 (see [57]). Let V(t) ∈ C([0, +∞), R), and suppose there are two positive constants δ and ρ such that

D⁺V(t) ≤ −δV(t) + ρ,  t ≥ 0.   (10)

Then

V(t) − ρ/δ ≤ (V(0) − ρ/δ)e^{−δt},  t ≥ 0.   (11)

In particular, if V(t) ≥ ρ/δ for t ≥ 0, then V(t) exponentially approaches ρ/δ as t increases.

Lemma 9 (see [60]). Let D, E, and F(t) be real matrices of appropriate dimensions with Fᵀ(t)F(t) = I. Then, the following inequality holds for any constant ε > 0:

DF(t)E + EᵀFᵀ(t)Dᵀ ≤ εDDᵀ + ε⁻¹EᵀE.   (12)
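Lemma 8 is the comparison result that turns the differential inequality (10) into the exponential estimate (11). A quick Euler-discretized check that a trajectory of the worst-case dynamics V′ = −δV + ρ stays at or below the comparison bound; the constants δ, ρ, and V(0) are arbitrary illustrative choices.

```python
import math

delta, rho, v0 = 0.8, 0.3, 5.0   # arbitrary illustrative constants
dt, T = 1e-3, 10.0

v, t = v0, 0.0
while t < T:
    v += dt * (-delta * v + rho)   # V' = -delta*V + rho, worst case of (10)
    t += dt
    bound = rho / delta + (v0 - rho / delta) * math.exp(-delta * t)
    # the discretized trajectory must stay at or below the comparison bound
    assert v <= bound + 1e-9, (t, v, bound)

# as t grows, V(t) approaches rho/delta, matching the remark after (11)
assert abs(v - rho / delta) < 1e-2
```

Forward Euler decays at least as fast as the exact flow here (since 1 − δ·dt ≤ e^{−δ·dt}), so the assertion holds deterministically.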

3. Main Results

The following theorem presents a Lagrange stability condition for the Markovian jump neural network (2) with uncertainties (i.e., ΔB(r_t) ≠ 0, ΔD₀(r_t) ≠ 0, ΔD₁(r_t) ≠ 0, ΔD₂(r_t) ≠ 0).

Theorem 10. Under assumption (A), for given constants τ, d, δ > 0, the Markovian jump uncertain neural network (2) is globally exponentially stable in the Lagrange sense if there exist positive definite matrices W₁, W₂, W₃, W₄, R₁ₖ, R₂, R₃, positive diagonal matrices Q, S, scalars ε₁ > 0, ε₂ > 0, and appropriately dimensioned matrices P₁, P₂, X₁₁, X₁₂, X₁₃, X₂₂, X₂₃, X₃₃ such that the following linear matrix inequalities (LMIs) hold:

X = [X₁₁ X₁₂ X₁₃; ∗ X₂₂ X₂₃; ∗ ∗ X₃₃] > 0,   (13)

[R₁ₖ (I − H_hk)ᵀR₁ₗ; ∗ R₁ₗ] ≥ 0,  h ∈ Z⁺ (here r_{t_h} = l),   (14)

Ψ = (Ψ_{(r,s)}) < 0  (r, s = 1, 2, ..., 14),   (15)

where

Ψ₁,₁ = −R₁ₖB_k − B_kR₁ₖ + δR₁ₖ + Σ_{j=1}^{m} φ_{kj}R₁ⱼ + δR₂ + τX₁₁ + X₁₃ + X₁₃ᵀ + LQL + ε₂N_bkᵀN_bk,
Ψ₁,₂ = τX₁₂ − X₁₃ + X₂₃ᵀ,
Ψ₁,₃ = R₁ₖD₀ₖ − ε₂N_bkᵀN₀ₖ,
Ψ₁,₄ = R₁ₖD₁ₖ − ε₂N_bkᵀN₁ₖ,
Ψ₁,₅ = R₁ₖD₂ₖ − ε₂N_bkᵀN₂ₖ,
Ψ₁,₆ = B_kR₁ₖB_k − δR₁ₖB_k − Σ_{j=1}^{m} φ_{kj}R₁ⱼB_k,
Ψ₁,₁₀ = R₁ₖᵀM_k,  Ψ₁,₁₁ = R₁ₖ,
Ψ₂,₂ = τX₂₂ − X₂₃ − X₂₃ᵀ + LSL,
Ψ₃,₃ = dR₃ − Q + ε₁N₀ₖᵀN₀ₖ + ε₂N₀ₖᵀN₀ₖ,
Ψ₃,₄ = ε₁N₀ₖᵀN₁ₖ + ε₂N₀ₖᵀN₁ₖ,
Ψ₃,₅ = ε₁N₀ₖᵀN₂ₖ + ε₂N₀ₖᵀN₂ₖ,
Ψ₃,₆ = −D₀ₖᵀR₁ₖB_k,  Ψ₃,₇ = D₀ₖᵀP₁ᵀ,  Ψ₃,₈ = D₀ₖᵀP₂ᵀ − ε₁N₀ₖᵀN_bk,
Ψ₄,₄ = −S + ε₁N₁ₖᵀN₁ₖ + ε₂N₁ₖᵀN₁ₖ,
Ψ₄,₅ = ε₁N₁ₖᵀN₂ₖ + ε₂N₁ₖᵀN₂ₖ,
Ψ₄,₆ = −D₁ₖᵀR₁ₖB_k,  Ψ₄,₇ = D₁ₖᵀP₁ᵀ,  Ψ₄,₈ = D₁ₖᵀP₂ᵀ − ε₁N₁ₖᵀN_bk,
Ψ₅,₅ = −(e^{−δd}/d)R₃ + ε₁N₂ₖᵀN₂ₖ + ε₂N₂ₖᵀN₂ₖ,
Ψ₅,₆ = −D₂ₖᵀR₁ₖB_k,  Ψ₅,₇ = D₂ₖᵀP₁ᵀ,  Ψ₅,₈ = D₂ₖᵀP₂ᵀ − ε₁N₂ₖᵀN_bk,
Ψ₆,₆ = δB_kR₁ₖB_k + Σ_{j=1}^{m} φ_{kj}B_kR₁ⱼB_k − (e^{−δρ}/ρ)R₂,
Ψ₆,₁₀ = −R₁ₖᵀM_kB_k,  Ψ₆,₁₂ = B_kR₁ₖ,
Ψ₇,₇ = τe^{δτ}X₃₃ − P₁ − P₁ᵀ,  Ψ₇,₈ = −P₁B_k − P₂ᵀ,  Ψ₇,₉ = P₁ᵀM_k,  Ψ₇,₁₃ = P₁,
Ψ₈,₈ = −P₂B_k − B_kP₂ᵀ + ε₁N_bkᵀN_bk,  Ψ₈,₉ = P₂ᵀM_k,  Ψ₈,₁₄ = P₂,
Ψ₉,₉ = −ε₁I,  Ψ₁₀,₁₀ = −ε₂I,
Ψ₁₁,₁₁ = −W₁,  Ψ₁₂,₁₂ = −W₂,  Ψ₁₃,₁₃ = −W₃,  Ψ₁₄,₁₄ = −W₄,   (16)

and the other Ψ_{(r,s)} are equal to zero. Moreover, the set

Ω = {η(t) ∈ Rⁿ : ‖η(t)‖ ≤ (Jᵀ(W₁ + W₂ + W₃ + W₄)J / (δλ_min(R₁)))^{1/2} e^{ρ‖B_k‖}}   (17)

is an estimate of the globally exponentially attractive set of (2).
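Conditions such as (15) are affine in the decision variables only after nonlinear terms like R₁ₖW₁⁻¹R₁ₖ are absorbed via the Schur complement (Lemma 7); LMI solvers then test feasibility numerically. Below is a small numpy sanity check of the Schur-complement equivalence itself, with arbitrary illustrative blocks (not the paper's matrices):

```python
import numpy as np

def is_neg_def(M):
    """True if the symmetric matrix M is negative definite."""
    return bool(np.all(np.linalg.eigvalsh(M) < 0))

# Arbitrary symmetric blocks with R22 negative definite.
R11 = np.array([[-4.0, 1.0], [1.0, -3.0]])
R12 = np.array([[0.5, 0.2], [0.1, 0.3]])
R22 = np.array([[-2.0, 0.4], [0.4, -5.0]])

# Full block matrix R = [[R11, R12], [R12^T, R22]].
R = np.block([[R11, R12], [R12.T, R22]])

# Schur complement of R22 in R.
schur = R11 - R12 @ np.linalg.inv(R22) @ R12.T

# Lemma 7(i): R < 0  <=>  R22 < 0 and R11 - R12 R22^{-1} R12^T < 0.
lhs = is_neg_def(R)
rhs = is_neg_def(R22) and is_neg_def(schur)
assert lhs == rhs
```

The same mechanism converts, e.g., the W₁⁻¹ term of Θ₁,₁ in the proof below into the extra rows and columns (Ψ₁,₁₁, Ψ₁₁,₁₁) of the 14-block LMI (15).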

Proof. Consider the following Lyapunov-Krasovskii functional candidate for model (2):

V(r_t, t) = Σ_{a=1}^{5} V_a(r_t, t),   (18)

where

V₁(r_t, t) = (η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ₖ (η(t) − B_k∫_{t−ρ}^{t} η(s) ds),
V₂(r_t, t) = ∫_{−ρ}^{0}∫_{t+θ}^{t} e^{δ(s−t)} ηᵀ(s)R₂η(s) ds dθ,
V₃(r_t, t) = ∫_{−d}^{0}∫_{t+θ}^{t} e^{δ(s−t)} fᵀ(η(s))R₃f(η(s)) ds dθ,
V₄(r_t, t) = ∫_{−τ}^{0}∫_{t+θ}^{t} e^{δ(τ+s−t)} η̇ᵀ(s)X₃₃η̇(s) ds dθ,
V₅(r_t, t) = ∫_{0}^{t}∫_{θ−τ(θ)}^{θ} e^{δ(s−t)} ϑᵀ(θ, s)Xϑ(θ, s) ds dθ,   (19)

with ϑ(θ, s) = [ηᵀ(θ)  ηᵀ(θ − τ(θ))  η̇ᵀ(s)]ᵀ. Computing the time derivative of V₁(r_t, t) along the trajectories of model (2) and using Lemma 5, we obtain

V̇₁(r_t, t) = 2(η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ₖ (−B_kη(t) + D₀ₖf(η(t)) + D₁ₖf(η(t − τ(t))) + D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + J) + Σ_{j=1}^{m} φ_{kj} (η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ⱼ (η(t) − B_k∫_{t−ρ}^{t} η(s) ds)

≤ −δV₁(r_t, t) + δ(η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ₖ (η(t) − B_k∫_{t−ρ}^{t} η(s) ds) + 2(η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ₖ (−B_kη(t) + D₀ₖf(η(t)) + D₁ₖf(η(t − τ(t))) + D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + J) + Σ_{j=1}^{m} φ_{kj} (η(t) − B_k∫_{t−ρ}^{t} η(s) ds)ᵀ R₁ⱼ (η(t) − B_k∫_{t−ρ}^{t} η(s) ds)

≤ −δV₁(r_t, t) + ηᵀ(t)(−R₁ₖB_k − B_kR₁ₖ + δR₁ₖ + R₁ₖW₁⁻¹R₁ₖ + Σ_{j=1}^{m} φ_{kj}R₁ⱼ)η(t) + 2ηᵀ(t)R₁ₖD₀ₖf(η(t)) + 2ηᵀ(t)R₁ₖD₁ₖf(η(t − τ(t))) + 2ηᵀ(t)R₁ₖD₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + 2ηᵀ(t)(B_kR₁ₖB_k − δR₁ₖB_k − Σ_{j=1}^{m} φ_{kj}R₁ⱼB_k)∫_{t−ρ}^{t} η(s) ds − 2fᵀ(η(t))D₀ₖᵀR₁ₖB_k∫_{t−ρ}^{t} η(s) ds − 2fᵀ(η(t − τ(t)))D₁ₖᵀR₁ₖB_k∫_{t−ρ}^{t} η(s) ds − 2(∫_{t−d(t)}^{t} f(η(s)) ds)ᵀD₂ₖᵀR₁ₖB_k∫_{t−ρ}^{t} η(s) ds + (∫_{t−ρ}^{t} η(s) ds)ᵀ(δB_kR₁ₖB_k + B_kR₁ₖW₂⁻¹R₁ₖB_k + Σ_{j=1}^{m} φ_{kj}B_kR₁ⱼB_k)∫_{t−ρ}^{t} η(s) ds + Jᵀ(W₁ + W₂)J.   (20)

Calculating the time derivatives of V_b(r_t, t) (b = 2, 3, 4, 5) and using Lemma 6 for the integral terms, we have

V̇₂(r_t, t) ≤ −δV₂(r_t, t) + ρηᵀ(t)R₂η(t) − (e^{−δρ}/ρ)(∫_{t−ρ}^{t} η(s) ds)ᵀ R₂ (∫_{t−ρ}^{t} η(s) ds),   (21)

V̇₃(r_t, t) ≤ −δV₃(r_t, t) + d fᵀ(η(t))R₃f(η(t)) − (e^{−δd}/d)(∫_{t−d(t)}^{t} f(η(s)) ds)ᵀ R₃ (∫_{t−d(t)}^{t} f(η(s)) ds),   (22)

V̇₄(r_t, t) ≤ −δV₄(r_t, t) + τe^{δτ}η̇ᵀ(t)X₃₃η̇(t) − ∫_{t−τ}^{t} η̇ᵀ(s)X₃₃η̇(s) ds,   (23)

V̇₅(r_t, t) ≤ −δV₅(r_t, t) + ∫_{t−τ(t)}^{t} ϑᵀ(t, s)Xϑ(t, s) ds
= −δV₅(r_t, t) + τ(t)[ηᵀ(t)X₁₁η(t) + ηᵀ(t)X₁₂η(t − τ(t)) + ηᵀ(t − τ(t))X₁₂ᵀη(t) + ηᵀ(t − τ(t))X₂₂η(t − τ(t))] + ηᵀ(t)(X₁₃ + X₁₃ᵀ)η(t) + ηᵀ(t)(−X₁₃ + X₂₃ᵀ)η(t − τ(t)) + ηᵀ(t − τ(t))(−X₁₃ᵀ + X₂₃)η(t) − ηᵀ(t − τ(t))(X₂₃ + X₂₃ᵀ)η(t − τ(t)) + ∫_{t−τ(t)}^{t} η̇ᵀ(s)X₃₃η̇(s) ds.   (24)

From assumption (A), it follows that

0 ≤ ηᵀ(t)LQLη(t) − fᵀ(η(t))Qf(η(t)),   (25)

0 ≤ ηᵀ(t − τ(t))LSLη(t − τ(t)) − fᵀ(η(t − τ(t)))Sf(η(t − τ(t))).   (26)

Left-multiplying both sides of (2) by η̇ᵀ(t)P₁, we obtain

η̇ᵀ(t)P₁η̇(t) = −η̇ᵀ(t)P₁B_kη(t − ρ) + η̇ᵀ(t)P₁D₀ₖf(η(t)) + η̇ᵀ(t)P₁D₁ₖf(η(t − τ(t))) + η̇ᵀ(t)P₁D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + η̇ᵀ(t)P₁J.   (27)

Taking the transpose of both sides of (27), we have

η̇ᵀ(t)P₁ᵀη̇(t) = −ηᵀ(t − ρ)B_kP₁ᵀη̇(t) + fᵀ(η(t))D₀ₖᵀP₁ᵀη̇(t) + fᵀ(η(t − τ(t)))D₁ₖᵀP₁ᵀη̇(t) + (∫_{t−d(t)}^{t} f(η(s)) ds)ᵀD₂ₖᵀP₁ᵀη̇(t) + JᵀP₁ᵀη̇(t).   (28)

Adding (27) to (28) and using Lemma 5 to bound the terms containing J, we get

0 ≤ η̇ᵀ(t)(−P₁ − P₁ᵀ + P₁W₃⁻¹P₁)η̇(t) − η̇ᵀ(t)P₁B_kη(t − ρ) − ηᵀ(t − ρ)B_kP₁ᵀη̇(t) + η̇ᵀ(t)P₁D₀ₖf(η(t)) + fᵀ(η(t))D₀ₖᵀP₁ᵀη̇(t) + η̇ᵀ(t)P₁D₁ₖf(η(t − τ(t))) + fᵀ(η(t − τ(t)))D₁ₖᵀP₁ᵀη̇(t) + η̇ᵀ(t)P₁D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + (∫_{t−d(t)}^{t} f(η(s)) ds)ᵀD₂ₖᵀP₁ᵀη̇(t) + JᵀW₃J.   (29)

Similarly, multiplying (2) by ηᵀ(t − ρ)P₂ and using Lemma 5, we obtain the zero inequality

0 ≤ −ηᵀ(t − ρ)P₂η̇(t) − η̇ᵀ(t)P₂ᵀη(t − ρ) + ηᵀ(t − ρ)(−P₂B_k − B_kP₂ᵀ + P₂W₄⁻¹P₂)η(t − ρ) + ηᵀ(t − ρ)P₂D₀ₖf(η(t)) + fᵀ(η(t))D₀ₖᵀP₂ᵀη(t − ρ) + ηᵀ(t − ρ)P₂D₁ₖf(η(t − τ(t))) + fᵀ(η(t − τ(t)))D₁ₖᵀP₂ᵀη(t − ρ) + ηᵀ(t − ρ)P₂D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + (∫_{t−d(t)}^{t} f(η(s)) ds)ᵀD₂ₖᵀP₂ᵀη(t − ρ) + JᵀW₄J.   (30)

Combining (20)–(26), (29), and (30), we obtain

V̇(r_t, t) ≤ −δV(r_t, t) + ξᵀ(t)Θξ(t) + Jᵀ(W₁ + W₂ + W₃ + W₄)J,   (31)

where ξ(t) = [ηᵀ(t)  ηᵀ(t − τ(t))  fᵀ(η(t))  fᵀ(η(t − τ(t)))  (∫_{t−d(t)}^{t} f(η(s)) ds)ᵀ  (∫_{t−ρ}^{t} η(s) ds)ᵀ  η̇ᵀ(t)  ηᵀ(t − ρ)]ᵀ and

Θ = (Θ_{ij})₈ₓ₈,   (32)

in which Θ₁,₁ = −R₁ₖB_k − B_kR₁ₖ + δR₁ₖ + R₁ₖW₁⁻¹R₁ₖ + Σ_{j=1}^{m} φ_{kj}R₁ⱼ + δR₂ + τX₁₁ + X₁₃ + X₁₃ᵀ + LQL, Θ₁,₂ = τX₁₂ − X₁₃ + X₂₃ᵀ, Θ₁,₃ = R₁ₖD₀ₖ, Θ₁,₄ = R₁ₖD₁ₖ, Θ₁,₅ = R₁ₖD₂ₖ, Θ₁,₆ = B_kR₁ₖB_k − δR₁ₖB_k − Σ_{j=1}^{m} φ_{kj}R₁ⱼB_k, Θ₂,₂ = τX₂₂ − X₂₃ − X₂₃ᵀ + LSL, Θ₃,₃ = dR₃ − Q, Θ₃,₆ = −D₀ₖᵀR₁ₖB_k, Θ₃,₇ = D₀ₖᵀP₁ᵀ, Θ₃,₈ = D₀ₖᵀP₂ᵀ, Θ₄,₄ = −S, Θ₄,₆ = −D₁ₖᵀR₁ₖB_k, Θ₄,₇ = D₁ₖᵀP₁ᵀ, Θ₄,₈ = D₁ₖᵀP₂ᵀ, Θ₅,₅ = −(e^{−δd}/d)R₃, Θ₅,₆ = −D₂ₖᵀR₁ₖB_k, Θ₅,₇ = D₂ₖᵀP₁ᵀ, Θ₅,₈ = D₂ₖᵀP₂ᵀ, Θ₆,₆ = δB_kR₁ₖB_k + B_kR₁ₖW₂⁻¹R₁ₖB_k + Σ_{j=1}^{m} φ_{kj}B_kR₁ⱼB_k − (e^{−δρ}/ρ)R₂, Θ₇,₇ = τe^{δτ}X₃₃ − P₁ − P₁ᵀ + P₁W₃⁻¹P₁, Θ₇,₈ = −P₁B_k − P₂ᵀ, Θ₈,₈ = −P₂B_k − B_kP₂ᵀ + P₂W₄⁻¹P₂, and the rest of the Θ_{ij} are zero.

Now we consider the change of V(r_t, t) at the impulse times t = t_h, h ∈ Z⁺. By (2), we have

η(t_h) − B_k∫_{t_h−ρ}^{t_h} η(s) ds = η(t_h⁻) − H_hk[η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds] − B_k∫_{t_h−ρ}^{t_h} η(s) ds = (I − H_hk)[η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds].   (33)

Moreover, it follows from (14) that

[R₁ₖ (I − H_hk)ᵀR₁ₗ; ∗ R₁ₗ] ≥ 0
⇔ [I −(I − H_hk)ᵀ; 0 I][R₁ₖ (I − H_hk)ᵀR₁ₗ; ∗ R₁ₗ][I 0; −(I − H_hk) I] ≥ 0
⇔ [R₁ₖ − (I − H_hk)ᵀR₁ₗ(I − H_hk) 0; ∗ R₁ₗ] ≥ 0
⇔ R₁ₖ − (I − H_hk)ᵀR₁ₗ(I − H_hk) ≥ 0.   (34)

Combining (33) and (34), and recalling that r_{t_h} = l while r_{t_h⁻} = k, we obtain

V₁(r_{t_h}, t_h) = [η(t_h) − B_k∫_{t_h−ρ}^{t_h} η(s) ds]ᵀ R₁ₗ [η(t_h) − B_k∫_{t_h−ρ}^{t_h} η(s) ds]
= [η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds]ᵀ (I − H_hk)ᵀR₁ₗ(I − H_hk) [η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds]
≤ [η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds]ᵀ R₁ₖ [η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds] = V₁(r_{t_h}⁻, t_h⁻).   (35)

Since the remaining terms V_a (a = 2, ..., 5) are continuous in t, it follows that

V(r_{t_h}, t_h) ≤ V(r_{t_h}⁻, t_h⁻),  h ∈ Z⁺.   (36)

By employing Lemma 9 (to absorb the parameter uncertainties) and Lemma 7, we see that Ψ < 0 in (15) implies Θ < 0. Thus, it follows from (31) that

V̇(r_t, t) ≤ −δV(r_t, t) + Jᵀ(W₁ + W₂ + W₃ + W₄)J.   (37)

Combining Lemma 8 and (37), we get

V(r_t, t) − Jᵀ(W₁ + W₂ + W₃ + W₄)J/δ ≤ (V(r₀, 0) − Jᵀ(W₁ + W₂ + W₃ + W₄)J/δ)e^{−δt}.   (38)

On the other hand,

V(r₀, 0) = (η(0) − B_k∫_{−ρ}^{0} η(s) ds)ᵀ R₁ₖ (η(0) − B_k∫_{−ρ}^{0} η(s) ds) + ∫_{−ρ}^{0}∫_{θ}^{0} e^{δs} ηᵀ(s)R₂η(s) ds dθ + ∫_{−d}^{0}∫_{θ}^{0} e^{δs} fᵀ(η(s))R₃f(η(s)) ds dθ + ∫_{−τ}^{0}∫_{θ}^{0} e^{δ(τ+s)} η̇ᵀ(s)X₃₃η̇(s) ds dθ

≤ [(1 + ρ‖B_k‖)²‖R₁ₖ‖ + (ρ/δ)‖R₂‖ + (d/δ)‖R₃‖ max_{1≤i≤n}{l_i²}] sup_{s∈[−β,0]} ‖ψ(s)‖² + (τ/δ)e^{δτ}‖X₃₃‖ sup_{s∈[−β,0]} ‖ψ̇(s)‖².   (39)

Let

H = [(1 + ρ‖B_k‖)²‖R₁ₖ‖ + (ρ/δ)‖R₂‖ + (d/δ)‖R₃‖ max_{1≤i≤n}{l_i²}] sup_{s∈[−β,0]} ‖ψ(s)‖² + (τ/δ)e^{δτ}‖X₃₃‖ sup_{s∈[−β,0]} ‖ψ̇(s)‖².   (40)

From (38)–(40), we have

V(r_t, t) − Jᵀ(W₁ + W₂ + W₃ + W₄)J/δ ≤ He^{−δt}.   (41)

Hence, the Markovian jump neural network (2) is globally exponentially attractive.
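The quadratic certificate driving (37)–(41) is the delay-and-jump analogue of a classical Lyapunov matrix certificate. For intuition, the following numpy-only sketch computes such a certificate for a delay-free linear comparison system ẋ = Ax (A and Q below are illustrative, not taken from the paper): it solves AᵀP + PA = −Q via the Kronecker vec identity and checks P > 0.

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A^T P + P A = -Q for P, using vec(AXB) = (B^T kron A) vec(X)."""
    n = A.shape[0]
    I = np.eye(n)
    # vec(A^T P) = (I kron A^T) vec(P),  vec(P A) = (A^T kron I) vec(P)
    K = np.kron(I, A.T) + np.kron(A.T, I)
    vecP = np.linalg.solve(K, -Q.reshape(-1, order="F"))
    return vecP.reshape(n, n, order="F")

A = np.array([[-2.0, 1.0], [0.0, -3.0]])   # Hurwitz (eigenvalues -2, -3)
Q = np.eye(2)
P = solve_lyapunov(A, Q)

# P must be symmetric positive definite, certifying exponential stability.
assert np.allclose(A.T @ P + P @ A, -Q)
assert np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)
```

In the paper's setting the unknowns (R₁ₖ, R₂, R₃, X, P₁, P₂, ...) play the role of P and are found by an LMI feasibility solver rather than a linear solve, but the certificate logic is the same.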

From the definition of V(r_t, t), we have

V(r_t, t) ≥ λ_min(R₁ₖ)‖η(t) − B_k∫_{t−ρ}^{t} η(s) ds‖².   (42)

It follows from (41) and (42) that

‖η(t) − B_k∫_{t−ρ}^{t} η(s) ds‖ ≤ (H/λ_min(R₁ₖ) + Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁ₖ)))^{1/2}.   (43)

Furthermore,

‖η(t)‖ ≤ (H/λ_min(R₁ₖ) + Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁ₖ)))^{1/2} + ‖B_k‖∫_{t−ρ}^{t} ‖η(s)‖ ds.   (44)

From the well-known Gronwall inequality, we get

‖η(t)‖ ≤ (H/λ_min(R₁ₖ) + Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁ₖ)))^{1/2} e^{ρ‖B_k‖}.   (45)

Therefore, the Markovian jump neural network (2) is uniformly stable in the Lagrange sense, and hence globally exponentially stable in the Lagrange sense. Moreover, we know from (41) and (42) that a globally exponentially attractive set of (2) is

Ω = {η(t) ∈ Rⁿ | V(r_t, t) ≤ Jᵀ(W₁ + W₂ + W₃ + W₄)J/δ}   (46)

⊆ {η(t) ∈ Rⁿ | ‖η(t)‖ ≤ (Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁ₖ)))^{1/2} e^{ρ‖B_k‖}}.   (47)

The proof is completed.

Theorem 10 provides a Lagrange stability criterion for Markovian jump neural networks with uncertainties (i.e., ΔB(r_t) ≠ 0, ΔD₀(r_t) ≠ 0, ΔD₁(r_t) ≠ 0, ΔD₂(r_t) ≠ 0). In the following result we derive Lagrange stability conditions for Markovian jump neural networks without uncertainties (i.e., ΔB(r_t) = ΔD₀(r_t) = ΔD₁(r_t) = ΔD₂(r_t) = 0).

Theorem 11. Under assumption (A), for given constants τ, d, δ > 0, the Markovian jump neural network (2) (with ΔB(r_t) = ΔD₀(r_t) = ΔD₁(r_t) = ΔD₂(r_t) = 0) is globally exponentially stable in the Lagrange sense if there exist positive definite matrices W₁, W₂, W₃, W₄, R₁ₖ, R₂, R₃, positive diagonal matrices Q, S, and appropriately dimensioned matrices P₁, P₂, X₁₁, X₁₂, X₁₃, X₂₂, X₂₃, X₃₃ such that the following LMIs hold:

X = [X₁₁ X₁₂ X₁₃; ∗ X₂₂ X₂₃; ∗ ∗ X₃₃] > 0,
[R₁ₖ (I − H_hk)ᵀR₁ₗ; ∗ R₁ₗ] ≥ 0,  h ∈ Z⁺ (here r_{t_h} = l),   (48)

Ψ̆ = (Ψ̆_{(r,s)}) < 0  (r, s = 1, 2, ..., 12),   (49)

where

Ψ̆₁,₁ = −R₁ₖB_k − B_kR₁ₖ + δR₁ₖ + Σ_{j=1}^{m} φ_{kj}R₁ⱼ + δR₂ + τX₁₁ + X₁₃ + X₁₃ᵀ + LQL,
Ψ̆₁,₂ = τX₁₂ − X₁₃ + X₂₃ᵀ,  Ψ̆₁,₃ = R₁ₖD₀ₖ,  Ψ̆₁,₄ = R₁ₖD₁ₖ,  Ψ̆₁,₅ = R₁ₖD₂ₖ,
Ψ̆₁,₆ = B_kR₁ₖB_k − δR₁ₖB_k − Σ_{j=1}^{m} φ_{kj}R₁ⱼB_k,  Ψ̆₁,₉ = R₁ₖ,
Ψ̆₂,₂ = τX₂₂ − X₂₃ − X₂₃ᵀ + LSL,  Ψ̆₃,₃ = dR₃ − Q,
Ψ̆₃,₆ = −D₀ₖᵀR₁ₖB_k,  Ψ̆₃,₇ = D₀ₖᵀP₁ᵀ,  Ψ̆₃,₈ = D₀ₖᵀP₂ᵀ,  Ψ̆₄,₄ = −S,

Ψ̆₄,₆ = −D₁ₖᵀR₁ₖB_k,  Ψ̆₄,₇ = D₁ₖᵀP₁ᵀ,  Ψ̆₄,₈ = D₁ₖᵀP₂ᵀ,
Ψ̆₅,₅ = −(e^{−δd}/d)R₃,  Ψ̆₅,₆ = −D₂ₖᵀR₁ₖB_k,  Ψ̆₅,₇ = D₂ₖᵀP₁ᵀ,  Ψ̆₅,₈ = D₂ₖᵀP₂ᵀ,
Ψ̆₆,₆ = δB_kR₁ₖB_k + Σ_{j=1}^{m} φ_{kj}B_kR₁ⱼB_k − (e^{−δρ}/ρ)R₂,  Ψ̆₆,₁₀ = B_kR₁ₖ,
Ψ̆₇,₇ = τe^{δτ}X₃₃ − P₁ − P₁ᵀ,  Ψ̆₇,₈ = −P₁B_k − P₂ᵀ,  Ψ̆₇,₁₁ = P₁,
Ψ̆₈,₈ = −P₂B_k − B_kP₂ᵀ,  Ψ̆₈,₁₂ = P₂,
Ψ̆₉,₉ = −W₁,  Ψ̆₁₀,₁₀ = −W₂,  Ψ̆₁₁,₁₁ = −W₃,  Ψ̆₁₂,₁₂ = −W₄,   (50)

and the other Ψ̆_{(r,s)} are zero. Moreover, the set

Ω = {η(t) ∈ Rⁿ | ‖η(t)‖ ≤ (Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁)))^{1/2} e^{ρ‖B_k‖}}   (51)

is an estimate of the globally exponentially attractive set of (2).

Proof. Define the Lyapunov-Krasovskii functional candidate V(r_t, t) as in Theorem 10. By employing the same method as in Theorem 10, the desired result follows.

Remark 12. Now we consider the following neural network, a special case of neural network (2) that reduces to a delayed neural network without Markovian jump parameters:

η̇(t) = −B_kη(t − ρ) + D₀ₖf(η(t)) + D₁ₖf(η(t − τ(t))) + D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + J,  t ≠ t_h,

Δη(t_h) = −H_hk{η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds},  t = t_h, h ∈ Z⁺.   (52)

Using the same method as in Theorem 10, we can obtain the following result.

Corollary 13. Under assumption (A), for given constants τ, d, δ > 0, the uncertain neural network (52) is globally exponentially stable in the Lagrange sense if there exist positive definite matrices W₁, W₂, W₃, W₄, R₁ₖ, R₂, R₃, positive diagonal matrices Q, S, scalars ε₁ > 0, ε₂ > 0, and appropriately dimensioned matrices P₁, P₂, X₁₁, X₁₂, X₁₃, X₂₂, X₂₃, X₃₃ such that the following LMIs hold:

X = [X₁₁ X₁₂ X₁₃; ∗ X₂₂ X₂₃; ∗ ∗ X₃₃] > 0,   (53)

[R₁ₖ (I − H_hk)ᵀR₁ₖ; ∗ R₁ₖ] ≥ 0,  h ∈ Z⁺,

Ψ̂ = (Ψ̂_{(r,s)}) < 0  (r, s = 1, 2, ..., 14),   (54)

where

Ψ̂₁,₁ = −R₁ₖB_k − B_kR₁ₖ + δR₁ₖ + δR₂ + τX₁₁ + X₁₃ + X₁₃ᵀ + LQL + ε₂N_bkᵀN_bk,
Ψ̂₁,₂ = τX₁₂ − X₁₃ + X₂₃ᵀ,
Ψ̂₁,₃ = R₁ₖD₀ₖ − ε₂N_bkᵀN₀ₖ,
Ψ̂₁,₄ = R₁ₖD₁ₖ − ε₂N_bkᵀN₁ₖ,
Ψ̂₁,₅ = R₁ₖD₂ₖ − ε₂N_bkᵀN₂ₖ,
Ψ̂₁,₆ = B_kR₁ₖB_k − δR₁ₖB_k,
Ψ̂₁,₁₀ = R₁ₖᵀM_k,  Ψ̂₁,₁₁ = R₁ₖ,
Ψ̂₂,₂ = τX₂₂ − X₂₃ − X₂₃ᵀ + LSL,
Ψ̂₃,₃ = dR₃ − Q + ε₁N₀ₖᵀN₀ₖ + ε₂N₀ₖᵀN₀ₖ,
Ψ̂₃,₄ = ε₁N₀ₖᵀN₁ₖ + ε₂N₀ₖᵀN₁ₖ,   (55)

Ψ̂₃,₅ = ε₁N₀ₖᵀN₂ₖ + ε₂N₀ₖᵀN₂ₖ,
Ψ̂₃,₆ = −D₀ₖᵀR₁ₖB_k,  Ψ̂₃,₇ = D₀ₖᵀP₁ᵀ,  Ψ̂₃,₈ = D₀ₖᵀP₂ᵀ − ε₁N₀ₖᵀN_bk,
Ψ̂₄,₄ = −S + ε₁N₁ₖᵀN₁ₖ + ε₂N₁ₖᵀN₁ₖ,
Ψ̂₄,₅ = ε₁N₁ₖᵀN₂ₖ + ε₂N₁ₖᵀN₂ₖ,
Ψ̂₄,₆ = −D₁ₖᵀR₁ₖB_k,  Ψ̂₄,₇ = D₁ₖᵀP₁ᵀ,  Ψ̂₄,₈ = D₁ₖᵀP₂ᵀ − ε₁N₁ₖᵀN_bk,
Ψ̂₅,₅ = −(e^{−δd}/d)R₃ + ε₁N₂ₖᵀN₂ₖ + ε₂N₂ₖᵀN₂ₖ,
Ψ̂₅,₆ = −D₂ₖᵀR₁ₖB_k,  Ψ̂₅,₇ = D₂ₖᵀP₁ᵀ,  Ψ̂₅,₈ = D₂ₖᵀP₂ᵀ − ε₁N₂ₖᵀN_bk,
Ψ̂₆,₆ = δB_kR₁ₖB_k − (e^{−δρ}/ρ)R₂,
Ψ̂₆,₁₀ = −R₁ₖᵀM_kB_k,  Ψ̂₆,₁₂ = B_kR₁ₖ,
Ψ̂₇,₇ = τe^{δτ}X₃₃ − P₁ − P₁ᵀ,  Ψ̂₇,₈ = −P₁B_k − P₂ᵀ,  Ψ̂₇,₉ = P₁ᵀM_k,  Ψ̂₇,₁₃ = P₁,
Ψ̂₈,₈ = −P₂B_k − B_kP₂ᵀ + ε₁N_bkᵀN_bk,  Ψ̂₈,₉ = P₂ᵀM_k,  Ψ̂₈,₁₄ = P₂,
Ψ̂₉,₉ = −ε₁I,  Ψ̂₁₀,₁₀ = −ε₂I,
Ψ̂₁₁,₁₁ = −W₁,  Ψ̂₁₂,₁₂ = −W₂,  Ψ̂₁₃,₁₃ = −W₃,  Ψ̂₁₄,₁₄ = −W₄,   (56)

and the other Ψ̂_{(r,s)} are zero. Then the uncertain neural network (52) is globally exponentially stable in the Lagrange sense. Moreover, the set

Ω = {η(t) ∈ Rⁿ | ‖η(t)‖ ≤ (Jᵀ(W₁ + W₂ + W₃ + W₄)J/(δλ_min(R₁ₖ)))^{1/2} e^{ρ‖B_k‖}}   (57)

is an estimate of the globally exponentially attractive set of (52).

Remark 14. In this paper we have studied exponential Lagrange stability for Markovian jump uncertain neural networks with leakage delay and mixed time-varying delays via impulsive control. The stability of neural networks has been studied by applying differential inequalities and the Lyapunov method [30–34]; those works consider Lyapunov stability, and the Lyapunov stability of an equilibrium point can be regarded as a special case of Lagrange stability. In addition, we have considered model uncertainties, Markovian jumping parameters, and leakage delay in the concerned networks, since they affect the stability of the systems. Thus our results are more general than those in the literature.

4. Numerical Example

This section provides a numerical result to demonstrate the effectiveness of the presented strategy.

Example 1. Consider the following delayed uncertain Markovian jump neural network with two modes:

η̇(t) = −B_kη(t − ρ) + D₀ₖf(η(t)) + D₁ₖf(η(t − τ(t))) + D₂ₖ∫_{t−d(t)}^{t} f(η(s)) ds + J,  t ≠ t_h,

Δη(t_h) = −H_hk{η(t_h⁻) − B_k∫_{t_h−ρ}^{t_h} η(s) ds},  t = t_h, h ∈ Z⁺,   (58)

with

B₁ = [1 0; 0 1],  D₀₁ = [0.2 −1; 0 −3],  D₁₁ = [−0.2 −0.2; 0.2 −0.3],  D₂₁ = [−0.3 −0.8; 0.4 −0.1],  B₂ = [1 0; 0 1],

12

Mathematical Problems in Engineering 2 1 D02 = [ ], −1 0.2

515.2687 −0.0025 ], W4 = [ −0.0025 515.1238

−0.3 −0.4 D12 = [ ], 0.4 −0.5

8.4321 0 Q=[ ], 0 8.4321

0.4 −0.2 D22 = [ ], −0.2 0.4

6.0806 0 S=[ ], 0 6.0806

0.4 0 Hℎ1 = Hℎ2 = [ ], 0 0.4

1.1608 0.0506 P1 = [ ], 0.0506 0.7860

0.2 0 M1 = [ ], 0 0.2

1.1608 0.0506 P2 = [ ], 0.0506 0.7860

0.1 0 N𝑏1 = N01 = N11 = N21 = [ ], 0 0.1

2.2685 −0.0457 X11 = [ ], −0.0457 2.3587

0.5 0 ], M2 = [ 0 0.5

−3.5666 −0.0549 ], X12 = [ −0.0549 −3.6697

0.2 0 N𝑏2 = N02 = N12 = N22 = [ ], 0 0.2

−2.1441 −0.0331 X13 = [ ], −0.0331 −2.2075

0.5 J = [ ], 0.2

2.5925 0.0542 X22 = [ ], 0.0542 2.6978

𝐿 = diag {0.5, 0.5} .

2.1427 0.0326 X23 = [ ], 0.0326 2.2061

(59) In order to obtain Lagrange stability, we take 𝜏 = 0.8, 𝑑 = 0.5, 𝛿 = 0.5, and 𝜌 = 0.5 and apply Theorem 10; we get feasible solutions to LMIs (14) and (15) as follows: 0.1899 −0.1535 ], R11 = [ −0.1535 0.2065 0.6008 0.0496 R12 = [ ], 0.0496 0.6771

0.3626 0.0263 X33 = [ ]. 0.0263 0.4137 (60) From Theorem 10, we know that the uncertain neural networks concerned are globally exponentially stable in Lagrange sense, and 󵄩 󵄩 Ω = {𝜂 (𝑡) ∈ R𝑛 | 󵄩󵄩󵄩𝜂 (𝑡)󵄩󵄩󵄩 ≤ 12.0212}

(61)

1.2472 −0.8560 R2 = [ ], −0.8560 1.2001

is an estimation of globally exponentially attractive set of (2).

8.2198 −0.7129 R3 = [ ], −0.7129 6.6828

5. Conclusion

514.8650 0.0785 W1 = [ ], 0.0785 514.9600 514.6668 0.0266 W2 = [ ], 0.0266 514.6908 514.8894 −0.0147 ], W3 = [ −0.0147 514.8409

The problem of Lagrange stability analysis has been investigated in this paper for uncertain Markovian jump neural networks with mixed time-varying delays and leakage delay. By choosing a suitable Lyapunov-Krasovskii functional, improved delay-dependent criteria have been derived via the Lagrange stability approach with an attractive set. The Lagrange stability criterion is obtained by solving a set of LMIs, which guarantee the global exponential stability of the concerned system. A numerical example is given to illustrate the validity and feasibility of the obtained theoretical results.
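As a supplementary check of the numerical example in Section 4, the radius expression in (57) can be evaluated mechanically from the solved matrices. The following Python sketch does this for mode k = 1 using plain 2×2 arithmetic (no external libraries); it only illustrates how the printed formula is assembled, and is not expected to reproduce the sharper bound 12.0212 in (61), which follows from the full construction of Theorem 10.

```python
import math

# Solved matrices from (60), mode k = 1, as nested lists.
W = [[2059.6899, 0.0879],        # entrywise sum W1 + W2 + W3 + W4
     [0.0879, 2059.6155]]
R11 = [[0.1899, -0.1535],
       [-0.1535, 0.2065]]
J = [0.5, 0.2]                    # constant input vector from (59)
delta, rho = 0.5, 0.5             # parameters chosen in the example
norm_B1 = 1.0                     # spectral norm of B1 = I

def quad_form(v, M):
    """Compute v^T M v for a 2-vector v and a 2x2 matrix M."""
    return sum(v[i] * M[i][j] * v[j] for i in range(2) for j in range(2))

def lambda_min_sym2(M):
    """Smaller eigenvalue of a symmetric 2x2 matrix, in closed form."""
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return (tr - math.sqrt(tr * tr - 4.0 * det)) / 2.0

# Radius expression from (57):
#   ( J^T (W1+...+W4) J / (delta * lambda_min(R_1k)) )^(1/2) * e^(rho * ||B_k||)
radius = math.sqrt(quad_form(J, W) / (delta * lambda_min_sym2(R11))) \
         * math.exp(rho * norm_B1)
print(f"radius of the attractive-set estimate, mode 1: {radius:.4f}")
```

Note that `lambda_min_sym2(R11) > 0` confirms that the solved R₁₁ is positive definite, as feasibility of the LMIs requires.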

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China (61773217, 61374080), CSIR (25(0274)/17/EMR-II, dated 27/04/2017), the Natural Science Foundation of Jiangsu Province (BK20161552), the Qing Lan Project of Jiangsu Province, the Priority Academic Program Development of Jiangsu Higher Education Institutions, and the Advanced Study and Training of Professional Leaders of Higher Vocational Colleges in Jiangsu (2017GRFX019).

References

[1] G. Wang, "Existence-stability theorems for strong vector set-valued equilibrium problems in reflexive Banach spaces," Journal of Inequalities and Applications, 2015:239, 14 pages, 2015.
[2] Y. Bai and X. Mu, "Global asymptotic stability of a generalized SIRS epidemic model with transfer from infectious to susceptible," Journal of Applied Analysis and Computation, vol. 8, no. 2, pp. 402–412, 2018.
[3] L. Gao, D. Wang, and G. Wang, "Further results on exponential stability for impulsive switched nonlinear time-delay systems with delayed impulse effects," Applied Mathematics and Computation, vol. 268, pp. 186–200, 2015.
[4] Y. Li, Y. Sun, and F. Meng, "New criteria for exponential stability of switched time-varying systems with delays and nonlinear disturbances," Nonlinear Analysis: Hybrid Systems, vol. 26, pp. 284–291, 2017.
[5] L. G. Wang, K. P. Xu, and Q. W. Liu, "On the stability of a mixed functional equation deriving from additive, quadratic and cubic mappings," Acta Mathematica Sinica, vol. 30, no. 6, pp. 1033–1049, 2014.
[6] L. G. Wang and B. Liu, "Fuzzy stability of a functional equation deriving from additive, quadratic, cubic and quartic functions," Acta Mathematica Sinica, vol. 55, no. 5, pp. 841–854, 2012.
[7] F. Li and Y. Bao, "Uniform stability of the solution for a memory-type elasticity system with nonhomogeneous boundary control condition," Journal of Dynamical and Control Systems, vol. 23, no. 2, pp. 301–315, 2017.
[8] Y.-H. Feng and C.-M. Liu, "Stability of steady-state solutions to Navier-Stokes-Poisson systems," Journal of Mathematical Analysis and Applications, vol. 462, no. 2, pp. 1679–1694, 2018.
[9] J. Hu and A. Xu, "On stability of F-Gorenstein flat categories," Algebra Colloquium, vol. 23, no. 2, pp. 251–262, 2016.
[10] W. W. Sun, "Stabilization analysis of time-delay Hamiltonian systems in the presence of saturation," Applied Mathematics and Computation, vol. 217, no. 23, pp. 9625–9634, 2011.
[11] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of solitary waves of the coupled Klein-Gordon-Zakharov equations," Mathematical Methods in the Applied Sciences, vol. 40, no. 7, pp. 2623–2633, 2017.
[12] C. Liu and Y.-J. Peng, "Stability of periodic steady-state solutions to a non-isentropic Euler-Maxwell system," Zeitschrift für angewandte Mathematik und Physik, vol. 68, no. 5, art. 105, 17 pages, 2017.
[13] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of periodic traveling wave solutions to the generalized Zakharov equations," Acta Mathematica Scientia B, vol. 37, no. 4, pp. 998–1018, 2017.
[14] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice-Hall, NJ, USA, 1998.
[15] Y. Guo, "Global asymptotic stability analysis for integro-differential systems modeling neural networks with delays," Zeitschrift für Angewandte Mathematik und Physik, vol. 61, no. 6, pp. 971–978, 2010.
[16] Y. Guo, "Global stability analysis for a class of Cohen-Grossberg neural network models," Bulletin of the Korean Mathematical Society, vol. 49, no. 6, pp. 1193–1198, 2012.
[17] A. Abdurahman, H. Jiang, and K. Rahman, "Function projective synchronization of memristor-based Cohen-Grossberg neural networks with time-varying delays," Cognitive Neurodynamics, vol. 9, no. 6, pp. 603–613, 2015.
[18] K. Shi, S. Zhong, H. Zhu, X. Liu, and Y. Zeng, "New delay-dependent stability criteria for neutral-type neural networks with mixed random time-varying delays," Neurocomputing, vol. 168, pp. 896–907, 2015.
[19] Y. Guo, "Mean square exponential stability of stochastic delay cellular neural networks," Electronic Journal of Qualitative Theory of Differential Equations, no. 34, 10 pages, 2013.
[20] Y. Guo, "Exponential stability analysis of travelling waves solutions for nonlinear delayed cellular neural networks," Dynamical Systems: An International Journal, vol. 32, no. 4, pp. 490–503, 2017.
[21] W. Sun and L. Peng, "Observer-based robust adaptive control for uncertain stochastic Hamiltonian systems with state and input delays," Nonlinear Analysis: Modelling and Control, vol. 19, no. 4, pp. 626–645, 2014.
[22] L. G. Wang and B. Liu, "The Hyers-Ulam stability of a functional equation deriving from quadratic and cubic functions in quasi-β-normed spaces," Acta Mathematica Sinica, vol. 26, no. 12, pp. 2335–2348, 2010.
[23] Y. Guo, "Nontrivial periodic solutions of nonlinear functional differential systems with feedback control," Turkish Journal of Mathematics, vol. 34, no. 1, pp. 35–44, 2010.
[24] L. Li, F. Meng, and P. Ju, "Some new integral inequalities and their applications in studying the stability of nonlinear integro-differential equations with time delay," Journal of Mathematical Analysis and Applications, vol. 377, no. 2, pp. 853–862, 2011.
[25] M. Li and J. Wang, "Exploring delayed Mittag-Leffler type matrix functions to study finite time stability of fractional delay differential equations," Applied Mathematics and Computation, vol. 324, pp. 254–265, 2018.
[26] G. Liu, S. Xu, Y. Wei, Z. Qi, and Z. Zhang, "New insight into reachable set estimation for uncertain singular time-delay systems," Applied Mathematics and Computation, vol. 320, pp. 769–780, 2018.
[27] W. Sun, Y. Wang, and R. Yang, "L2 disturbance attenuation for a class of time delay Hamiltonian systems," Journal of Systems Science & Complexity, vol. 24, no. 4, pp. 672–682, 2011.
[28] W. Qi, Y. Kao, X. Gao, and Y. Wei, "Controller design for time-delay system with stochastic disturbance and actuator saturation via a new criterion," Applied Mathematics and Computation, vol. 320, pp. 535–546, 2018.
[29] Y. Guo, "Mean square global asymptotic stability of stochastic recurrent neural networks with distributed delays," Applied Mathematics and Computation, vol. 215, no. 2, pp. 791–795, 2009.
[30] S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, SIAM, Philadelphia, Pa, USA, 1994.
[31] R. Anbuvithya, K. Mathiyalagan, R. Sakthivel, and P. Prakash, "Passivity of memristor-based BAM neural networks with different memductance and uncertain delays," Cognitive Neurodynamics, vol. 10, no. 4, pp. 339–351, 2016.
[32] Y. Guo, "Globally robust stability analysis for stochastic Cohen-Grossberg neural networks with impulse and time-varying delays," Ukrainian Mathematical Journal, vol. 69, no. 8, pp. 1049–1060, 2017.
[33] R. Sathy and P. Balasubramaniam, "Stability analysis of fuzzy Markovian jumping Cohen-Grossberg BAM neural networks with mixed time-varying delays," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 4, pp. 2054–2064, 2011.
[34] M. Syed Ali, "Robust stability of stochastic uncertain recurrent neural networks with Markovian jumping parameters and time-varying delays," International Journal of Machine Learning and Cybernetics, vol. 5, no. 1, pp. 13–22, 2014.
[35] K. Gopalsamy, "Leakage delays in BAM," Journal of Mathematical Analysis and Applications, vol. 325, no. 2, pp. 1117–1132, 2007.
[36] Q. Zhu, J. Cao, T. Hayat, and F. Alsaadi, "Robust stability of Markovian jump stochastic neural networks with time delays in the leakage terms," Neural Processing Letters, vol. 41, pp. 1–27, 2015.
[37] M. Syed Ali, S. Arik, and M. Esther Rani, "Passivity analysis of stochastic neural networks with leakage delay and Markovian jumping parameters," Neurocomputing, vol. 218, pp. 139–145, 2016.
[38] R. S. Kumar, G. Sugumaran, R. Raja, Q. Zhu, and U. K. Raja, "New stability criterion of neural networks with leakage delays and impulses: a piecewise delay method," Cognitive Neurodynamics, vol. 10, no. 1, pp. 85–98, 2016.
[39] M. Syed Ali and M. Esther Rani, "Passivity analysis of uncertain stochastic neural networks with time-varying delays and Markovian jumping parameters," Network: Computation in Neural Systems, vol. 26, no. 3-4, pp. 73–96, 2015.
[40] O. M. Kwon, M. J. Park, J. H. Park, S. M. Lee, and E. J. Cha, "Passivity analysis of uncertain neural networks with mixed time-varying delays," Nonlinear Dynamics, vol. 73, no. 4, pp. 2175–2189, 2013.
[41] S. Zhu, Y. Shen, and G. Chen, "Exponential passivity of neural networks with time-varying delay and uncertainty," Physics Letters A, vol. 375, no. 2, pp. 136–142, 2010.
[42] Y. Liu, W. Liu, M. A. Obaid, and I. A. Abbas, "Exponential stability of Markovian jumping Cohen-Grossberg neural networks with mixed mode-dependent time-delays," Neurocomputing, vol. 177, pp. 409–415, 2016.
[43] Q. Zhu and J. Cao, "Exponential stability of stochastic neural networks with both Markovian jump parameters and mixed time delays," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 2, pp. 341–353, 2011.
[44] Q. Zhu and J. Cao, "Stability analysis of Markovian jump stochastic BAM neural networks with impulse control and mixed time delays," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 3, pp. 467–479, 2012.
[45] M. Syed Ali, "Stability of Markovian jumping recurrent neural networks with discrete and distributed time-varying delays," Neurocomputing, vol. 149, pp. 1280–1285, 2015.
[46] M. Syed Ali, N. Gunasekaran, and Q. Zhu, "State estimation of T-S fuzzy delayed neural networks with Markovian jumping parameters using sampled-data control," Fuzzy Sets and Systems, vol. 306, pp. 87–104, 2017.
[47] G. Chen, O. van Gaans, and S. Verduyn Lunel, "Fixed points and pth moment exponential stability of stochastic delayed recurrent neural networks with impulses," Applied Mathematics Letters, vol. 27, pp. 36–42, 2014.
[48] G. Seiler and J. A. Nossek, "Winner-take-all cellular neural networks," IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, vol. 40, no. 3, pp. 184–190, 1993.
[49] H. Wersing, W.-J. Beyn, and H. Ritter, "Dynamical stability conditions for recurrent neural networks with unsaturating piecewise linear transfer functions," Neural Computation, vol. 13, no. 8, pp. 1811–1825, 2001.
[50] Q. Luo, Z. Zeng, and X. Liao, "Global exponential stability in Lagrange sense for neutral type recurrent neural networks," Neurocomputing, vol. 74, no. 4, pp. 638–645, 2011.
[51] A. Wu, Z. Zeng, C. Fu, and W. Shen, "Global exponential stability in Lagrange sense for periodic neural networks with various activation functions," Neurocomputing, vol. 74, no. 5, pp. 831–837, 2011.
[52] X. Wang, M. Jiang, and S. Fang, "Stability analysis in Lagrange sense for a non-autonomous Cohen-Grossberg neural network with mixed delays," Nonlinear Analysis: Theory, Methods & Applications, vol. 70, no. 12, pp. 4294–4306, 2009.
[53] B. Wang, J. Jian, and M. Jiang, "Stability in Lagrange sense for Cohen-Grossberg neural networks with time-varying delays and finite distributed delays," Nonlinear Analysis: Hybrid Systems, vol. 4, no. 1, pp. 65–78, 2010.
[54] Z. Tu, J. Jian, and K. Wang, "Global exponential stability in Lagrange sense for recurrent neural networks with both time-varying delays and general activation functions via LMI approach," Nonlinear Analysis: Real World Applications, vol. 12, no. 4, pp. 2174–2182, 2011.
[55] Z. Tu, J. Jian, and B. Wang, "Positive invariant sets and global exponential attractive sets of a class of neural networks with unbounded time-delays," Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 9, pp. 3738–3745, 2011.
[56] X. Liao, Q. Luo, and Z. Zeng, "Positive invariant and global exponential attractive sets of neural networks with time-varying delays," Neurocomputing, vol. 71, no. 4-6, pp. 513–518, 2008.
[57] X. Liao, Q. Luo, Z. Zeng, and Y. Guo, "Global exponential stability in Lagrange sense for recurrent neural networks with time delays," Nonlinear Analysis: Real World Applications, vol. 9, no. 4, pp. 1535–1557, 2008.
[58] H. Wu, X. Liao, W. Feng, and S. Guo, "Mean square stability of uncertain stochastic BAM neural networks with interval time-varying delays," Cognitive Neurodynamics, vol. 6, no. 5, pp. 443–458, 2012.
[59] K. Gu, "An integral inequality in the stability problem of time-delay systems," in Proceedings of the 39th IEEE Conference on Decision and Control, vol. 3, pp. 2805–2810, Sydney, Australia, December 2000.
[60] B. Chen, X. Liu, C. Lin, and K. Liu, "Robust H∞ control of Takagi-Sugeno fuzzy systems with state and input time delays," Fuzzy Sets and Systems, vol. 160, no. 4, pp. 403–422, 2009.
