Hindawi Mathematical Problems in Engineering Volume 2018, Article ID 6289019, 11 pages https://doi.org/10.1155/2018/6289019

Research Article

Mean-Square Exponential Input-to-State Stability of Stochastic Fuzzy Recurrent Neural Networks with Multiproportional Delays and Distributed Delays

Tianyu Wang,¹,² Quanxin Zhu,¹ and Jingwei Cai³

¹ Key Laboratory of HPC-SIP (MOE), College of Mathematics and Statistics, Hunan Normal University, Changsha 410081, Hunan, China
² School of Mathematical Sciences and Institute of Finance and Statistics, Nanjing Normal University, Nanjing 210023, Jiangsu, China
³ Basis Department, Jiangsu Polytechnic College of Agriculture and Forestry, Zhenjiang 212400, Jiangsu, China

Correspondence should be addressed to Quanxin Zhu; [email protected]

Received 15 May 2018; Revised 21 July 2018; Accepted 11 September 2018; Published 4 October 2018

Academic Editor: Sabri Arik

Copyright © 2018 Tianyu Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We are interested in a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. By constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula, we derive novel sufficient conditions for the mean-square exponential input-to-state stability of the suggested system. Some remarks and discussions are given to show that our results extend and improve some previous results in the literature. Finally, two examples and their simulations are provided to illustrate the effectiveness of the theoretical results.

1. Introduction

Since the theory and application of cellular neural networks were proposed by L. O. Chua and L. Yang in 1988, neural networks have become a hot research topic. They can be applied to the analysis of static images, signal processing, optimization, pattern recognition, and image processing. Usually, a neural network is an information processing system whose characteristics are local connections between cells and piecewise linear output functions. Consequently, such networks can process large-scale nonlinear analog signals in real time and in parallel, which improves the running speed. As is well known, stability is an important theoretical problem in the field of dynamical systems (e.g., see [1–24]), so it is interesting to investigate the stability of nonlinear neural networks. On the one hand, the switching speed of amplifiers is limited and errors occur in electronic components. As a consequence, delays arise in dynamical systems, and they often destroy stability or even cause heavy oscillations (e.g., see [25–35]). It is therefore significant to study the stability of delayed neural networks. For example,

[36] discussed global stability analysis for a class of Cohen-Grossberg neural network models. A new comparison principle was first introduced to study the stability of stochastic delayed neural networks in [37]. Global asymptotic stability for integrodifferential systems modeling neural networks with delays was investigated in [38]. In [39], Zhu et al. considered the robust stability of Markovian jump stochastic neural networks with delays in the leakage terms. For more related results, we refer the reader to [25, 40–43] and the references therein. It is worth pointing out that all of the works mentioned above focused on traditional types of delays, such as constant delays, time-varying bounded delays, and bounded distributed delays. However, delays in real systems may be unbounded. In this case, the class of so-called proportional delays can be used to describe models of the human brain, in which delays carry information about the history and the entire history affects the present. Thus, it is interesting to study the stability of neural networks with proportional delays. On the other hand, all of the works mentioned above also focused on traditional neural network models,


which do not take fuzzy logic into account. In practical operations, however, we always encounter inconveniences such as complexity and uncertainty or vagueness. In fact, vagueness is the opposite of exactness, and it cannot be avoided in the human way of regarding the world; fuzzy theory is therefore regarded as the most suitable setting in which to take vagueness into consideration. Many results on the stability analysis of fuzzy neural networks have appeared in the literature. For example, Li and Zhu introduced a new way to study the stability of stochastic fuzzy delayed Cohen-Grossberg neural networks [44], using a Lyapunov functional, stochastic analysis techniques, and the nonnegative semimartingale convergence theorem. In [45], Balasubramaniam and Ali studied the robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays. However, to the best of our knowledge, there have so far been no works on the stability of fuzzy neural networks with proportional delays.

Motivated by the above discussion, in this paper we investigate the input-to-state stability of a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. Novel sufficient conditions are derived to ensure the mean-square exponential input-to-state stability of the suggested system by constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula. Several remarks and discussions are presented to show that our results extend and improve some previous results in the literature. Finally, two examples and their simulations are given to show the effectiveness of the obtained results.

The rest of the paper is organized as follows. In Section 2, we introduce the model, some necessary assumptions, and preliminaries. In Section 3, we investigate the mean-square exponential input-to-state stability of the considered model. In Section 4, we provide two examples to illustrate the effectiveness of the obtained results. Finally, we conclude the paper in Section 5.

2. Model Formulation and Preliminaries

Let $C([p,1];\mathbb{R}^n)$ denote the family of continuous functions $\phi$ from $[p,1]$ to $\mathbb{R}^n$ with the uniform norm $\|\phi\|=\sup_{p\le\theta\le 1}|\phi(\theta)|$. Denote by $L^2_{\mathcal{F}_0}([p,1];\mathbb{R}^n)$ the family of all $\mathcal{F}_0$-measurable, $C([p,1];\mathbb{R}^n)$-valued random variables $\phi=\{\phi(s): p\le s\le 1\}$ satisfying $\int_p^1 \mathbb{E}|\phi(s)|^2\,ds<\infty$, and by $L^2_{\mathcal{F}_0}([-\tau,0];\mathbb{R}^n)$ the family of all $C([-\tau,0];\mathbb{R}^n)$-valued random variables $\phi=\{\phi(s): -\tau\le s\le 0\}$ satisfying $\int_{-\tau}^0 \mathbb{E}|\phi(s)|^2\,ds<\infty$, where $\mathbb{E}$ stands for the expectation operator with respect to the given probability measure $P$. $l_\infty$ denotes the class of essentially bounded functions $u$ from $[1,\infty)$ to $\mathbb{R}$ with $\|u\|_\infty=\operatorname{ess\,sup}_{t\ge 1}|u(t)|<\infty$; $\mathbb{R}$ denotes the set of real numbers, and $\mathbb{R}^n$ the $n$-dimensional Euclidean space.

In this section, we consider the following class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t)={}&\Big[-d_ix_i(t)+\bigwedge_{j=1}^n a_{ij}f_j(x_j(t))+\bigwedge_{j=1}^n b_{ij}g_j(x_j(p_jt))+\bigwedge_{j=1}^n c_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds\\
&+\bigvee_{j=1}^n d_{ij}f_j(x_j(t))+\bigvee_{j=1}^n e_{ij}g_j(x_j(p_jt))+\bigvee_{j=1}^n f_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds+u_i(t)\Big]dt\\
&+\sum_{j=1}^n \sigma_{ij}(x_j(t),x_j(p_jt))\,dw_j(t),
\end{aligned}
\qquad (1)
$$

$$x_i(t)=\varphi_i(t),\qquad p\le t\le 1, \qquad (2)$$

for all $t\ge 1$, $i=1,2,\dots,n$, where $x_i(t)$ represents the state variable of the $i$th neuron at time $t$, and $d_i$ is the self-feedback connection weight strength. The constants $a_{ij}$, $b_{ij}$, $c_{ij}$, $d_{ij}$, $e_{ij}$, and $f_{ij}$ are the connection weights of the $j$th neuron to the $i$th neuron at time $t$ or $p_jt$; $f_j(x_j(t))$, $g_j(x_j(p_jt))$, and $h_j(x_j(t))$ are the activation functions of the $j$th neuron at time $t$ or $p_jt$. $u_i(t)$ is the control input of the $i$th neuron at time $t$, and $u=(u_1(t),u_2(t),\dots,u_n(t))\in l_\infty$. $\bigwedge$ and $\bigvee$ denote the fuzzy AND and fuzzy OR operations, respectively. The noise perturbation $\sigma_{ij}:\mathbb{R}\times\mathbb{R}\to\mathbb{R}$ is a Borel measurable function, and $w_j(t)$, $j=1,2,\dots,n$, are scalar standard Brownian motions defined on a complete probability space $(\Omega,\mathcal{F},P)$ with the natural filtration $\{\mathcal{F}_t\}_{t\ge 1}$. The constants $p_j$, $j=1,2,\dots,n$, are proportional delay factors satisfying $0<p_j\le 1$ with $p_j\ne 1$ and $p_jt=t-(1-p_j)t$, where $(1-p_j)t$ are time-varying continuous functions satisfying $(1-p_j)t\to+\infty$ as $t\to+\infty$. $\tau(t)$ is the time-varying delay, which satisfies $0\le\tau(t)\le\tilde\tau$ and $\dot\tau(t)\le\delta<1$.

Throughout this paper, we assume that the following conditions are satisfied.

Assumption 1. There exist positive constants $L_i$, $M_i$, and $N_i$ such that
$$|f_i(u)-f_i(v)|\le L_i|u-v|,\qquad |g_i(u)-g_i(v)|\le M_i|u-v|,\qquad |h_i(u)-h_i(v)|\le N_i|u-v| \qquad (3)$$
for all $u,v\in\mathbb{R}$ and $i=1,2,\dots,n$.

Assumption 2. There exist nonnegative constants $\mu_{ij}$ and $\nu_{ij}$ such that
$$\big[\sigma_{ij}(x,y)-\sigma_{ij}(x',y')\big]^2\le \mu_{ij}(x-x')^2+\nu_{ij}(y-y')^2 \qquad (4)$$
for all $x,x',y,y'\in\mathbb{R}$ and $i,j=1,2,\dots,n$.
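As an aside on how constants satisfying Assumption 2 are produced in practice: for a noise intensity of the affine form $\sigma(x,y)=c(x+y)$, the elementary bound $(a+b)^2\le 2a^2+2b^2$ gives $\mu=\nu=2c^2$. The following is a quick numerical sanity check of this choice (a hedged sketch; the function and constants below are illustrative and are not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

c = 0.2                        # illustrative noise gain

def sigma(x, y):               # intensity of the form sigma(x, y) = c*(x + y)
    return c * (x + y)

mu = nu = 2 * c**2             # from (a + b)^2 <= 2a^2 + 2b^2

# verify Assumption 2 pointwise on random samples
x, xp, y, yp = rng.normal(size=(4, 100_000)) * 10.0
lhs = (sigma(x, y) - sigma(xp, yp)) ** 2
rhs = mu * (x - xp) ** 2 + nu * (y - yp) ** 2
assert np.all(lhs <= rhs + 1e-9)
```

The same decomposition is what yields the constants used for the noise matrices of the examples in Section 4.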

Assumption 3.
$$f_j(0)=g_j(0)=h_j(0)=u_j(0)=0,\qquad \sigma_{ij}(0,0)=0,\qquad i,j=1,2,\dots,n. \qquad (5)$$

Obviously, under Assumptions 1–3 there exists a unique solution of system (1)-(2). Let $x(t,\varphi)$ denote the solution with initial data $x(s)=\varphi(s)$ for $s\in[p,1]$ in $L^2_{\mathcal{F}_0}([p,1];\mathbb{R}^n)$. Clearly, system (1)-(2) has a trivial (zero) solution $x(t;0)\equiv 0$ corresponding to the initial data $\varphi\equiv 0$. Applying the variable transformations $y_i(t)=x_i(e^t)$, $v_i(t)=u_i(e^t)$, $\tilde w_j(t)=w_j(e^t)$, and $\phi_i(t)=\varphi_i(e^t)$, system (1)-(2) is equivalently transformed into the following stochastic recurrent neural network with constant delays and time-varying coefficients:

$$
\begin{aligned}
dy_i(t)={}&e^t\Big[-d_iy_i(t)+\bigwedge_{j=1}^n a_{ij}f_j(y_j(t))+\bigwedge_{j=1}^n b_{ij}g_j(y_j(t-\tau_j))+\bigwedge_{j=1}^n c_{ij}\int_{t-\tau(t)}^t h_j(y_j(s))\,ds\\
&+\bigvee_{j=1}^n d_{ij}f_j(y_j(t))+\bigvee_{j=1}^n e_{ij}g_j(y_j(t-\tau_j))+\bigvee_{j=1}^n f_{ij}\int_{t-\tau(t)}^t h_j(y_j(s))\,ds+v_i(t)\Big]dt\\
&+\sum_{j=1}^n \sigma_{ij}(y_j(t),y_j(t-\tau_j))\,d\tilde w_j(t),\qquad t\ge 0,
\end{aligned}
\qquad (6)
$$

$$y_i(t)=\phi_i(t),\qquad -\tau\le t\le 0, \qquad (7)$$

where
$$\tau_j=-\log p_j,\ j=1,2,\dots,n,\qquad \tau=\max\Big\{\tilde\tau,\ \max_{1\le j\le n}\tau_j\Big\}, \qquad (8)$$
and $\phi_i(t)\in C([-\tau,0];\mathbb{R})$, $\phi(t)=(\phi_1(t),\phi_2(t),\dots,\phi_n(t))^T\in C([-\tau,0];\mathbb{R}^n)$.

3. Main Results

In this section, we discuss the mean-square exponential input-to-state stability of the trivial solution of system (1)-(2) under Assumptions 1–3.

Theorem 4. Let Assumptions 1–3 hold. The trivial solution of system (1)-(2) is mean-square exponentially input-to-state stable if there exist positive scalars $\eta_i$, $\alpha_i$, $\beta_i$ $(i=1,2,\dots,n)$ and $\lambda>1$ such that

$$
\begin{aligned}
2\eta_id_i\ge{}&(1+\lambda)\eta_i+\alpha_i+\tau\beta_i+\sum_{j=1}^n\eta_j\mu_{ji}+\sum_{j=1}^n\eta_j|a_{ji}|L_i+\eta_i\sum_{j=1}^n|a_{ij}|L_j+\eta_i\sum_{j=1}^n|b_{ij}|M_j\\
&+\eta_i\sum_{j=1}^n|c_{ij}|N_j+\sum_{j=1}^n\eta_j|d_{ji}|L_i+\eta_i\sum_{j=1}^n|d_{ij}|L_j+\eta_i\sum_{j=1}^n|e_{ij}|M_j+\eta_i\sum_{j=1}^n|f_{ij}|N_j,
\end{aligned}
\qquad (9)
$$

$$\alpha_i\ge\sum_{j=1}^n e^{\lambda\tau}\eta_j\nu_{ji}+\sum_{j=1}^n e^{\lambda\tau}\eta_j|b_{ji}|M_i+\sum_{j=1}^n e^{\lambda\tau}\eta_j|e_{ji}|M_i, \qquad (10)$$

$$\beta_i\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^n\eta_j|c_{ji}|N_i+\sum_{j=1}^n\eta_j|f_{ji}|N_i\Big). \qquad (11)$$

Proof. Since system (6)-(7) is equivalent to system (1)-(2), we only need to prove that the trivial solution of system (6)-(7) is mean-square exponentially input-to-state stable. To this end, for the sake of simplicity, let $\sigma(t)=(\sigma_{ij}(t))_{n\times n}$ with $\sigma_{ij}(t)=\sigma_{ij}(y_j(t),y_j(t-\tau_j))$, and consider the Lyapunov-Krasovskii functional

$$V(t,y(t))=e^{(\lambda-1)t}\sum_{i=1}^n\eta_iy_i^2(t)+\sum_{i=1}^n\int_{t-\tau_i}^t e^{\lambda s}\alpha_iy_i^2(s)\,ds+\int_{-\tau(t)}^0\int_{t+\theta}^t e^{\lambda s}\sum_{i=1}^n\beta_iy_i^2(s)\,ds\,d\theta. \qquad (12)$$

Then by Itô's formula, we have the stochastic differential equation
$$dV(t,y(t))=\mathcal{L}V(t,y(t))\,dt+V_y(t,y(t))\,\sigma(t)\,d\tilde w(t), \qquad (13)$$
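The substitution $y_i(t)=x_i(e^t)$ used above is exactly what converts the proportional delay into the constant delay $\tau_j=-\log p_j$ in (6): $x(p\,e^t)=x(e^{t+\log p})=y(t-\tau)$. This identity is exact and easy to check numerically (the test function below is arbitrary and purely illustrative, not from the paper):

```python
import math

p = 0.5
tau = -math.log(p)           # constant delay after the transformation

def x(s):                    # an arbitrary smooth test state, s >= p
    return math.sin(s) + 0.3 * s

def y(t):                    # transformed state y(t) = x(e^t)
    return x(math.exp(t))

# x evaluated at the proportional delay p*s equals y at the constant lag tau
for t in [0.0, 0.7, 1.5, 3.0]:
    s = math.exp(t)
    assert abs(x(p * s) - y(t - tau)) < 1e-12
```

This is the sense in which system (6)-(7) carries constant delays at the price of the time-varying coefficient $e^t$ in the drift.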

where $V_y(t,y(t))=(\partial V(t,y(t))/\partial y_1,\dots,\partial V(t,y(t))/\partial y_n)$ and $\mathcal{L}$ is the weak infinitesimal operator such that

$$
\begin{aligned}
\mathcal{L}V(t,y(t))={}&(\lambda-1)e^{(\lambda-1)t}\sum_{i=1}^n\eta_iy_i^2(t)+2e^{(\lambda-1)t}\sum_{i=1}^n\eta_iy_i(t)\,e^t\Big[-d_iy_i(t)+\bigwedge_{j=1}^n a_{ij}f_j(y_j(t))\\
&+\bigwedge_{j=1}^n b_{ij}g_j(y_j(t-\tau_j))+\bigwedge_{j=1}^n c_{ij}\int_{t-\tau(t)}^t h_j(y_j(s))\,ds+\bigvee_{j=1}^n d_{ij}f_j(y_j(t))+\bigvee_{j=1}^n e_{ij}g_j(y_j(t-\tau_j))\\
&+\bigvee_{j=1}^n f_{ij}\int_{t-\tau(t)}^t h_j(y_j(s))\,ds+v_i(t)\Big]+e^{(\lambda-1)t}\sum_{i=1}^n\eta_i\sum_{j=1}^n\sigma_{ij}^2\big(y_j(t),y_j(t-\tau_j)\big)\\
&+e^{\lambda t}\sum_{i=1}^n\alpha_iy_i^2(t)-\sum_{i=1}^n e^{\lambda(t-\tau_i)}\alpha_iy_i^2(t-\tau_i)+\tau(t)e^{\lambda t}\sum_{i=1}^n\beta_iy_i^2(t)\\
&-(1-\dot\tau(t))e^{\lambda t}\int_{t-\tau(t)}^t\sum_{i=1}^n\beta_iy_i^2(s)\,ds.
\end{aligned}
$$

Using Assumptions 1–3, the estimate $|\bigwedge_{j=1}^n a_{ij}f_j(u_j)-\bigwedge_{j=1}^n a_{ij}f_j(v_j)|\le\sum_{j=1}^n|a_{ij}|\,|f_j(u_j)-f_j(v_j)|$ for the fuzzy operators (and similarly for $\bigvee$), together with $\tau_i\le\tau$ and $\dot\tau(t)\le\delta$, we obtain

$$
\begin{aligned}
\mathcal{L}V(t,y(t))\le{}&\lambda e^{\lambda t}\sum_{i=1}^n\eta_iy_i^2(t)-2e^{\lambda t}\sum_{i=1}^n\eta_id_iy_i^2(t)+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|a_{ij}|L_j|y_j(t)|\\
&+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|b_{ij}|M_j|y_j(t-\tau_j)|+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|c_{ij}|N_j\Big|\int_{t-\tau(t)}^t y_j(s)\,ds\Big|\\
&+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|d_{ij}|L_j|y_j(t)|+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|e_{ij}|M_j|y_j(t-\tau_j)|\\
&+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)|\sum_{j=1}^n|f_{ij}|N_j\Big|\int_{t-\tau(t)}^t y_j(s)\,ds\Big|+2e^{\lambda t}\sum_{i=1}^n\eta_i|y_i(t)||v_i(t)|\\
&+e^{\lambda t}\sum_{i=1}^n\eta_i\sum_{j=1}^n\big(\mu_{ij}y_j^2(t)+\nu_{ij}y_j^2(t-\tau_j)\big)+e^{\lambda t}\sum_{i=1}^n\alpha_iy_i^2(t)-e^{\lambda t}\sum_{i=1}^n e^{-\lambda\tau}\alpha_iy_i^2(t-\tau_i)\\
&+\tau e^{\lambda t}\sum_{i=1}^n\beta_iy_i^2(t)+(\delta-1)e^{\lambda t}\int_{t-\tau(t)}^t\sum_{i=1}^n\beta_iy_i^2(s)\,ds.
\end{aligned}
\qquad (14)
$$

Applying the elementary inequality $2|a||b|\le a^2+b^2$ to each product, the Cauchy–Schwarz inequality $\big(\int_{t-\tau(t)}^t y_j(s)\,ds\big)^2\le\tau\int_{t-\tau(t)}^t y_j^2(s)\,ds$ to the distributed-delay terms, and letting $\beta=\min_{1\le k\le n}\beta_k$, we then obtain

$$
\begin{aligned}
\mathcal{L}V(t,y(t))\le{}&-e^{\lambda t}\min_{1\le j\le n}\Big[2\eta_jd_j-\lambda\eta_j-\sum_{k=1}^n\eta_k\mu_{kj}-\alpha_j-\tau\beta_j-\eta_j\sum_{k=1}^n|a_{jk}|L_k-\sum_{k=1}^n\eta_k|a_{kj}|L_j\\
&\quad-\eta_j\sum_{k=1}^n|b_{jk}|M_k-\eta_j\sum_{k=1}^n|c_{jk}|N_k-\eta_j\sum_{k=1}^n|d_{jk}|L_k-\sum_{k=1}^n\eta_k|d_{kj}|L_j-\eta_j\sum_{k=1}^n|e_{jk}|M_k\\
&\quad-\eta_j\sum_{k=1}^n|f_{jk}|N_k-\eta_j\Big]\sum_{i=1}^n y_i^2(t)\\
&-e^{\lambda t}\min_{1\le j\le n}\Big[e^{-\lambda\tau}\alpha_j-\sum_{k=1}^n\eta_k\nu_{kj}-\sum_{k=1}^n\eta_k|b_{kj}|M_j-\sum_{k=1}^n\eta_k|e_{kj}|M_j\Big]\sum_{i=1}^n y_i^2(t-\tau_i)\\
&-e^{\lambda t}\min_{1\le j\le n}\Big[(1-\delta)\beta-\Big(\sum_{k=1}^n\eta_k|c_{kj}|N_j+\sum_{k=1}^n\eta_k|f_{kj}|N_j\Big)\tau\Big]\int_{t-\tau(t)}^t\sum_{k=1}^n y_k^2(s)\,ds\\
&+e^{\lambda t}\max_{1\le j\le n}\eta_j\|v\|_\infty^2\\
\le{}&\ e^{\lambda t}\max_{1\le j\le n}\eta_j\|v\|_\infty^2,
\end{aligned}
\qquad (15)
$$

where the last inequality follows from conditions (9)–(11).
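The distributed-delay terms in the estimate above are absorbed by the Cauchy–Schwarz bound $\big(\int_{t-\tau(t)}^t y(s)\,ds\big)^2\le\tau\int_{t-\tau(t)}^t y^2(s)\,ds$, which is where the factor $\tau$ multiplying the $|c_{kj}|$ and $|f_{kj}|$ sums comes from. A discrete sanity check of this inequality via Riemann sums (a sketch with an arbitrary test function, not part of the paper):

```python
import numpy as np

n, tau = 10_000, 1.0
dt = tau / n
s = dt * np.arange(n)
y = np.sin(3.0 * s) + 0.5 * s**2    # arbitrary test function on a window of length tau

lhs = (y.sum() * dt) ** 2           # (integral of y)^2, left Riemann sum
rhs = tau * (y**2).sum() * dt       # tau * integral of y^2
assert lhs <= rhs + 1e-12           # discrete Cauchy-Schwarz
```

The discrete form holds exactly, since $(\sum_k y_k\,dt)^2\le (n\,dt)\,(\sum_k y_k^2\,dt)$ by Cauchy–Schwarz against the all-ones vector.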

Now we define the family of stopping times
$$\rho_k:=\inf\{s\ge 0:\ |y(s)|\ge k\}. \qquad (16)$$

By Dynkin's formula, we have
$$\mathbb{E}V\big(t\wedge\rho_k,y(t\wedge\rho_k)\big)-\mathbb{E}V(0,y(0))=\mathbb{E}\Big[\int_0^{t\wedge\rho_k}\mathcal{L}V(s,y(s))\,ds\Big], \qquad (17)$$
which implies
$$\mathbb{E}V\big(t\wedge\rho_k,y(t\wedge\rho_k)\big)=\mathbb{E}V(0,y(0))+\mathbb{E}\Big[\int_0^{t\wedge\rho_k}\mathcal{L}V(s,y(s))\,ds\Big]. \qquad (18)$$

Letting $k\to\infty$ on both sides of (18), it follows from the monotone convergence theorem and (15) that
$$\mathbb{E}V(t,y(t))\le\mathbb{E}V(0,y(0))+\|v\|_\infty^2\max_{1\le j\le n}\eta_j\int_0^t e^{\lambda s}\,ds\le\mathbb{E}V(0,y(0))+\frac1\lambda\|v\|_\infty^2\max_{1\le j\le n}\eta_j\big(e^{\lambda t}-1\big), \qquad (19)$$
where, by (12),
$$\mathbb{E}V(0,y(0))=\sum_{i=1}^n\eta_i\mathbb{E}y_i^2(0)+\sum_{i=1}^n\int_{-\tau_i}^0\mathbb{E}\,e^{\lambda s}\alpha_iy_i^2(s)\,ds+\int_{-\tau(0)}^0\int_\theta^0\mathbb{E}\,e^{\lambda s}\sum_{i=1}^n\beta_iy_i^2(s)\,ds\,d\theta\le\Big(\max_{1\le j\le n}\eta_j+\tau\max_{1\le j\le n}\alpha_j+\tau\max_{1\le j\le n}\beta_j\Big)\mathbb{E}\|\phi\|^2.$$

On the other hand, from the definition of $V(t,y(t))$, we have
$$\mathbb{E}V(t,y(t))\ge\mathbb{E}\,e^{(\lambda-1)t}\sum_{i=1}^n\eta_iy_i^2(t)\ge e^{(\lambda-1)t}\min_{1\le i\le n}\eta_i\,\mathbb{E}|y(t)|^2. \qquad (20)$$

Combining (19) and (20), the following inequality holds:
$$\mathbb{E}|y(t)|^2\le\frac{\max_{1\le i\le n}\eta_i+\tau\max_{1\le i\le n}\alpha_i+\tau\max_{1\le i\le n}\beta_i}{\min_{1\le i\le n}\eta_i}\,e^{-(\lambda-1)t}\,\mathbb{E}\|\phi\|^2+\frac{\max_{1\le i\le n}\eta_i}{\lambda\min_{1\le i\le n}\eta_i}\,\|v\|_\infty^2. \qquad (21)$$

From (21) we see that the trivial solution of system (6)-(7) is mean-square exponentially input-to-state stable. The proof of Theorem 4 is completed.

Corollary 5. Assume that all the conditions of Theorem 4 hold. Then the trivial solution of system (1)-(2) with $u(t)\equiv 0$ is mean-square exponentially stable.

Remark 6. If we ignore the effects of delays, then system (1)-(2) becomes a stochastic recurrent neural network without delays. The results obtained in this paper remain applicable to that case.

Remark 7. Compared with the result in [44], our model is more general. In fact, multiproportional delays and distributed delays are considered in this paper, and they cause considerable difficulty in the proof of our result, whereas only a simple constant delay was discussed in [44].

Remark 8. Compared with the result in [46], our model is also more general, since the distributed delays and the fuzzy factor considered in this paper were ignored in [46].

4. Illustrative Examples

In this section, we use two examples to show the effectiveness of the obtained result.

Example 1 (2-dimensional case). Consider the following 2-dimensional stochastic fuzzy recurrent neural network with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t)={}&\Big[-d_ix_i(t)+\bigwedge_{j=1}^2 a_{ij}f_j(x_j(t))+\bigwedge_{j=1}^2 b_{ij}g_j(x_j(p_jt))+\bigwedge_{j=1}^2 c_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds\\
&+\bigvee_{j=1}^2 d_{ij}f_j(x_j(t))+\bigvee_{j=1}^2 e_{ij}g_j(x_j(p_jt))+\bigvee_{j=1}^2 f_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds+u_i(t)\Big]dt\\
&+\sum_{j=1}^2 \sigma_{ij}(x_j(t),x_j(p_jt))\,dw_j(t),
\end{aligned}
\qquad (22)
$$

$$x_i(t)=\varphi_i(t),\qquad p\le t\le 1,\ i=1,2, \qquad (23)$$

where
$$f(x)=g(x)=h(x)=\begin{cases}0.1x, & \text{if } x\le 0,\\ 0.1\tanh(x), & \text{if } x>0,\end{cases}\qquad u(t)=0.08\sin(t), \qquad (24)$$
and
$$\big(\sigma_{ij}(x_j(t),x_j(p_jt))\big)_{2\times 2}=\begin{pmatrix}0.4x_1(t) & 0.2\big(x_2(t)+x_2(p_2t)\big)\\ 0.2x_1(p_1t) & 0.1\big(x_2(t)+x_2(p_2t)\big)\end{pmatrix}. \qquad (25)$$

The other parameters of system (22)-(23) are given as follows:
$$(a_{ij})_{2\times2}=\begin{pmatrix}0.5&0.3\\0.4&0.8\end{pmatrix},\quad (b_{ij})_{2\times2}=\begin{pmatrix}0.6&0.3\\0.8&0.2\end{pmatrix},\quad (c_{ij})_{2\times2}=\begin{pmatrix}0.4&0.3\\0.2&0.5\end{pmatrix},$$
$$(d_{ij})_{2\times2}=\begin{pmatrix}0.5&0.7\\0.3&0.8\end{pmatrix},\quad (e_{ij})_{2\times2}=\begin{pmatrix}0.4&0.6\\0.5&0.3\end{pmatrix},\quad (f_{ij})_{2\times2}=\begin{pmatrix}0.3&0.2\\0.5&0.4\end{pmatrix}, \qquad (26)$$
$$d_1=9, \qquad (27)$$
$$d_2=8. \qquad (28)$$

Take $p_1=p_2=0.5$, $\lambda=1.1$, $\eta_1=\eta_2=0.5$, $\alpha_1=2.6$, $\alpha_2=1.6$, $\beta_1=1.3$, $\beta_2=1.7$; from the definition of $\tau$ in (8) we obtain $\tau=0.6931$. It is easy to check that Assumptions 1–3 are satisfied. Moreover, a simple computation yields

$$2\eta_1d_1=9>(1+\lambda)\eta_1+\alpha_1+\tau\beta_1+\sum_{j=1}^2\eta_j\mu_{j1}+\sum_{j=1}^2\eta_j|a_{j1}|L_1+\eta_1\sum_{j=1}^2|a_{1j}|L_j+\eta_1\sum_{j=1}^2|b_{1j}|M_j+\eta_1\sum_{j=1}^2|c_{1j}|N_j+\sum_{j=1}^2\eta_j|d_{j1}|L_1+\eta_1\sum_{j=1}^2|d_{1j}|L_j+\eta_1\sum_{j=1}^2|e_{1j}|M_j+\eta_1\sum_{j=1}^2|f_{1j}|N_j=5.14103, \qquad (29)$$

$$2\eta_2d_2=8>(1+\lambda)\eta_2+\alpha_2+\tau\beta_2+\sum_{j=1}^2\eta_j\mu_{j2}+\sum_{j=1}^2\eta_j|a_{j2}|L_2+\eta_2\sum_{j=1}^2|a_{2j}|L_j+\eta_2\sum_{j=1}^2|b_{2j}|M_j+\eta_2\sum_{j=1}^2|c_{2j}|N_j+\sum_{j=1}^2\eta_j|d_{j2}|L_2+\eta_2\sum_{j=1}^2|d_{2j}|L_j+\eta_2\sum_{j=1}^2|e_{2j}|M_j+\eta_2\sum_{j=1}^2|f_{2j}|N_j=4.39327, \qquad (30)$$

$$\alpha_1=2.6\ge\sum_{j=1}^2 e^{\lambda\tau}\eta_j\nu_{j1}+\sum_{j=1}^2 e^{\lambda\tau}\eta_j|b_{j1}|M_1+\sum_{j=1}^2 e^{\lambda\tau}\eta_j|e_{j1}|M_1=0.889511, \qquad (31)$$

$$\alpha_2=1.6\ge\sum_{j=1}^2 e^{\lambda\tau}\eta_j\nu_{j2}+\sum_{j=1}^2 e^{\lambda\tau}\eta_j|b_{j2}|M_2+\sum_{j=1}^2 e^{\lambda\tau}\eta_j|e_{j2}|M_2=0.471548, \qquad (32)$$

$$\beta_1=1.3\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^2\eta_j|c_{j1}|N_1+\sum_{j=1}^2\eta_j|f_{j1}|N_1\Big)=0.1581, \qquad (33)$$

$$\beta_2=1.7\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^2\eta_j|c_{j2}|N_2+\sum_{j=1}^2\eta_j|f_{j2}|N_2\Big)=0.1581. \qquad (34)$$
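The verification in (29)–(34) is easy to script. The sketch below recomputes the right-hand sides of conditions (9)–(11) for the parameters of Example 1 and checks them against $2\eta_id_i$, $\alpha_i$, $\beta_i$. The Lipschitz constants $L_i=M_i=N_i=0.1$, the noise constants $\mu_{ij},\nu_{ij}$ (read off from (25) via $(a+b)^2\le 2a^2+2b^2$), and the choice $\delta=0$ are our reading; they are not listed explicitly in the paper:

```python
import numpy as np

# Example 1 data; mu, nu, delta and the Lipschitz constants are our assumptions.
L = M = N = np.array([0.1, 0.1])
eta = np.array([0.5, 0.5])
alpha = np.array([2.6, 1.6])
beta = np.array([1.3, 1.7])
d = np.array([9.0, 8.0])
lam, tau, delta = 1.1, 0.6931, 0.0

a = np.array([[0.5, 0.3], [0.4, 0.8]]);  b = np.array([[0.6, 0.3], [0.8, 0.2]])
c = np.array([[0.4, 0.3], [0.2, 0.5]]);  dd = np.array([[0.5, 0.7], [0.3, 0.8]])
e = np.array([[0.4, 0.6], [0.5, 0.3]]);  f = np.array([[0.3, 0.2], [0.5, 0.4]])
mu = np.array([[0.16, 0.08], [0.0, 0.02]])   # from (25) via (a+b)^2 <= 2a^2+2b^2
nu = np.array([[0.0, 0.08], [0.04, 0.02]])

for i in range(2):
    rhs9 = ((1 + lam) * eta[i] + alpha[i] + tau * beta[i]
            + eta @ mu[:, i]
            + (eta * np.abs(a[:, i])).sum() * L[i] + eta[i] * (np.abs(a[i]) @ L)
            + eta[i] * (np.abs(b[i]) @ M) + eta[i] * (np.abs(c[i]) @ N)
            + (eta * np.abs(dd[:, i])).sum() * L[i] + eta[i] * (np.abs(dd[i]) @ L)
            + eta[i] * (np.abs(e[i]) @ M) + eta[i] * (np.abs(f[i]) @ N))
    rhs10 = np.exp(lam * tau) * (eta @ nu[:, i]
                                 + (eta * np.abs(b[:, i])).sum() * M[i]
                                 + (eta * np.abs(e[:, i])).sum() * M[i])
    rhs11 = tau / (1 - delta) * ((eta * np.abs(c[:, i])).sum() * N[i]
                                 + (eta * np.abs(f[:, i])).sum() * N[i])
    assert 2 * eta[i] * d[i] > rhs9      # condition (9)
    assert alpha[i] >= rhs10             # condition (10)
    assert beta[i] >= rhs11              # condition (11)
```

All three conditions hold with a comfortable margin for both neurons, consistent with the conclusion drawn from (29)–(34).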

Hence, all the conditions of Theorem 4 are satisfied, and system (22)-(23) is mean-square exponentially input-to-state stable. Obviously, system (22)-(23) is mean-square exponentially stable when $u(t)\equiv 0$.

Example 2 (3-dimensional case). Consider the following 3-dimensional stochastic fuzzy recurrent neural network with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t)={}&\Big[-d_ix_i(t)+\bigwedge_{j=1}^3 a_{ij}f_j(x_j(t))+\bigwedge_{j=1}^3 b_{ij}g_j(x_j(p_jt))+\bigwedge_{j=1}^3 c_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds\\
&+\bigvee_{j=1}^3 d_{ij}f_j(x_j(t))+\bigvee_{j=1}^3 e_{ij}g_j(x_j(p_jt))+\bigvee_{j=1}^3 f_{ij}\int_{t-\tau(t)}^t h_j(x_j(s))\,ds+u_i(t)\Big]dt\\
&+\sum_{j=1}^3 \sigma_{ij}(x_j(t),x_j(p_jt))\,dw_j(t),
\end{aligned}
\qquad (35)
$$

$$x_i(t)=\varphi_i(t),\qquad t\in[p,1],\ i=1,2,3, \qquad (36)$$

where
$$f(x)=g(x)=h(x)=0.2\tanh(x),\qquad u(t)=0.1\sin(t), \qquad (37)$$
$$(\sigma_{ij})_{3\times3}=\begin{pmatrix}0.2x_1(t) & 0.2x_2(p_2t) & 0.3x_3(t)\\ 0.4x_1(p_1t) & 0.4\big(x_2(t)+x_2(p_2t)\big) & 0.2x_3(p_3t)\\ 0.2x_1(t) & 0.1x_2(t) & 0.4\big(x_3(t)+x_3(p_3t)\big)\end{pmatrix}. \qquad (38)$$

The other parameters of system (35)-(36) are given as follows:
$$(a_{ij})_{3\times3}=\begin{pmatrix}0.6&-0.4&0.5\\0.9&0.3&0.5\\0.2&0.6&-0.3\end{pmatrix},\quad (b_{ij})_{3\times3}=\begin{pmatrix}0.4&0.3&-0.2\\0.5&0.8&0.4\\0.7&0.1&0.3\end{pmatrix},\quad (c_{ij})_{3\times3}=\begin{pmatrix}0.3&0.2&0.3\\0.9&-0.3&0.6\\0.7&0.4&0.5\end{pmatrix},$$
$$(d_{ij})_{3\times3}=\begin{pmatrix}0.6&0.2&0.3\\0.3&0.2&0.4\\-0.6&0.2&0.5\end{pmatrix},\quad (e_{ij})_{3\times3}=\begin{pmatrix}0.5&0.2&0.2\\0.4&0.3&0.5\\0.5&0.3&0.2\end{pmatrix},\quad (f_{ij})_{3\times3}=\begin{pmatrix}0.4&0.1&0.2\\0.3&0.4&0.3\\-0.8&0.5&0.5\end{pmatrix}, \qquad (39)$$
$$d_1=8,\qquad d_2=9,\qquad d_3=10.$$

Take $p_1=p_2=p_3=0.5$, $\lambda=1.3$, $\eta_1=\eta_2=\eta_3=0.4$, $\alpha_1=1.4$, $\alpha_2=1.3$, $\alpha_3=1.4$, $\beta_1=1.3$, $\beta_2=1.2$, $\beta_3=1.1$; it follows from the definition of $\tau$ that $\tau=0.6931$. It is easy to check that Assumptions 1–3 are satisfied. A direct computation gives

$$2\eta_1d_1=6.4>(1+\lambda)\eta_1+\alpha_1+\tau\beta_1+\sum_{j=1}^3\eta_j\mu_{j1}+\sum_{j=1}^3\eta_j|a_{j1}|L_1+\eta_1\sum_{j=1}^3|a_{1j}|L_j+\eta_1\sum_{j=1}^3|b_{1j}|M_j+\eta_1\sum_{j=1}^3|c_{1j}|N_j+\sum_{j=1}^3\eta_j|d_{j1}|L_1+\eta_1\sum_{j=1}^3|d_{1j}|L_j+\eta_1\sum_{j=1}^3|e_{1j}|M_j+\eta_1\sum_{j=1}^3|f_{1j}|N_j=5.60903, \qquad (40)$$

$$2\eta_2d_2=7.2>(1+\lambda)\eta_2+\alpha_2+\tau\beta_2+\sum_{j=1}^3\eta_j\mu_{j2}+\sum_{j=1}^3\eta_j|a_{j2}|L_2+\eta_2\sum_{j=1}^3|a_{2j}|L_j+\eta_2\sum_{j=1}^3|b_{2j}|M_j+\eta_2\sum_{j=1}^3|c_{2j}|N_j+\sum_{j=1}^3\eta_j|d_{j2}|L_2+\eta_2\sum_{j=1}^3|d_{2j}|L_j+\eta_2\sum_{j=1}^3|e_{2j}|M_j+\eta_2\sum_{j=1}^3|f_{2j}|N_j=5.46372, \qquad (41)$$

$$2\eta_3d_3=8>(1+\lambda)\eta_3+\alpha_3+\tau\beta_3+\sum_{j=1}^3\eta_j\mu_{j3}+\sum_{j=1}^3\eta_j|a_{j3}|L_3+\eta_3\sum_{j=1}^3|a_{3j}|L_j+\eta_3\sum_{j=1}^3|b_{3j}|M_j+\eta_3\sum_{j=1}^3|c_{3j}|N_j+\sum_{j=1}^3\eta_j|d_{j3}|L_3+\eta_3\sum_{j=1}^3|d_{3j}|L_j+\eta_3\sum_{j=1}^3|e_{3j}|M_j+\eta_3\sum_{j=1}^3|f_{3j}|N_j=5.59841, \qquad (42)$$

$$\alpha_1=1.4\ge\sum_{j=1}^3 e^{\lambda\tau}\eta_j\nu_{j1}+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|b_{j1}|M_1+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|e_{j1}|M_1=1.29802, \qquad (43)$$

$$\alpha_2=1.3\ge\sum_{j=1}^3 e^{\lambda\tau}\eta_j\nu_{j2}+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|b_{j2}|M_2+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|e_{j2}|M_2=1.08332, \qquad (44)$$

$$\alpha_3=1.4\ge\sum_{j=1}^3 e^{\lambda\tau}\eta_j\nu_{j3}+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|b_{j3}|M_3+\sum_{j=1}^3 e^{\lambda\tau}\eta_j|e_{j3}|M_3=1.31969, \qquad (45)$$

$$\beta_1=1.3\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^3\eta_j|c_{j1}|N_1+\sum_{j=1}^3\eta_j|f_{j1}|N_1\Big)=0.48781, \qquad (46)$$

$$\beta_2=1.2\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^3\eta_j|c_{j2}|N_2+\sum_{j=1}^3\eta_j|f_{j2}|N_2\Big)=0.28908, \qquad (47)$$

$$\beta_3=1.1\ge\frac{\tau}{1-\delta}\Big(\sum_{j=1}^3\eta_j|c_{j3}|N_3+\sum_{j=1}^3\eta_j|f_{j3}|N_3\Big)=0.36134. \qquad (48)$$

Hence, all the conditions of Theorem 4 are satisfied, and system (35)-(36) is mean-square exponentially input-to-state stable. Obviously, system (35)-(36) is mean-square exponentially stable when $u(t)\equiv 0$.

Remark 9. Obviously, the results obtained in [44, 46] do not apply to Examples 1 and 2, since factors such as fuzzy logic, multiproportional delays, and distributed delays are considered in Examples 1 and 2.

5. Concluding Remarks

In this paper, we have studied the mean-square exponential input-to-state stability of a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. A key feature of this paper is that the nonlinear transformation $y(t)=x(e^t)$ is employed to transform the considered system into a stochastic recurrent neural network with constant delays and time-varying coefficients, which overcomes the difficulty caused by the multiproportional delays. Moreover, we have also taken the effects of distributed delays and fuzzy logic into account. In future work, we will apply the method developed in this paper to other important problems, such as the stability of multiagent systems.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China (61773217 and 61374080), the Natural Science Foundation of Jiangsu Province (BK20161552), and the Qing Lan Project of Jiangsu Province.
References

[1] G. Wang, "Existence-stability theorems for strong vector set-valued equilibrium problems in reflexive Banach spaces," Journal of Inequalities and Applications, vol. 239, pp. 1–14, 2015.
[2] Y. Bai and X. Mu, "Global asymptotic stability of a generalized SIRS epidemic model with transfer from infectious to susceptible," Journal of Applied Analysis and Computation, vol. 8, no. 2, pp. 402–412, 2018.
[3] L. Gao, D. Wang, and G. Wang, "Further results on exponential stability for impulsive switched nonlinear time-delay systems with delayed impulse effects," Applied Mathematics and Computation, vol. 268, pp. 186–200, 2015.
[4] Y. Li, Y. Sun, and F. Meng, "New criteria for exponential stability of switched time-varying systems with delays and nonlinear disturbances," Nonlinear Analysis: Hybrid Systems, vol. 26, pp. 284–291, 2017.
[5] L. G. Wang, K. P. Xu, and Q. W. Liu, "On the stability of a mixed functional equation deriving from additive, quadratic and cubic mappings," Acta Mathematica Sinica, vol. 30, no. 6, pp. 1033–1049, 2014.
[6] L. G. Wang and B. Liu, "Fuzzy stability of a functional equation deriving from additive, quadratic, cubic and quartic functions," Acta Mathematica Sinica, vol. 55, no. 5, pp. 841–854, 2012.
[7] F. Li and Y. Bao, "Uniform stability of the solution for a memory-type elasticity system with nonhomogeneous boundary control condition," Journal of Dynamical and Control Systems, vol. 23, no. 2, pp. 301–315, 2017.
[8] Y.-H. Feng and C.-M. Liu, "Stability of steady-state solutions to Navier-Stokes-Poisson systems," Journal of Mathematical Analysis and Applications, vol. 462, no. 2, pp. 1679–1694, 2018.
[9] J. Hu and A. Xu, "On stability of F-Gorenstein flat categories," Algebra Colloquium, vol. 23, no. 2, pp. 251–262, 2016.
[10] Q. X. Zhu and Q. Y. Zhang, "pth moment exponential stabilisation of hybrid stochastic differential equations by feedback controls based on discrete-time state observations with a time delay," IET Control Theory & Applications, vol. 11, no. 12, pp. 1992–2003, 2017.
[11] W. W. Sun, "Stabilization analysis of time-delay Hamiltonian systems in the presence of saturation," Applied Mathematics and Computation, vol. 217, no. 23, pp. 9625–9634, 2011.
[12] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of solitary waves of the coupled Klein-Gordon-Zakharov equations," Mathematical Methods in the Applied Sciences, vol. 40, no. 7, pp. 2623–2633, 2017.
[13] C. Liu and Y.-J. Peng, "Stability of periodic steady-state solutions to a non-isentropic Euler-Maxwell system," Zeitschrift für Angewandte Mathematik und Physik, vol. 68, no. 5, Article 105, 17 pages, 2017.
[14] Q. Zhu and H. Wang, "Output feedback stabilization of stochastic feedforward systems with unknown control coefficients and unknown output function," Automatica, vol. 87, pp. 166–175, 2018.
[15] Q. Zhu, "Razumikhin-type theorem for stochastic functional differential equations with Lévy noise and Markov switching," International Journal of Control, vol. 90, no. 8, pp. 1703–1712, 2017.
[16] Y. Yu, M. Meng, J.-e. Feng, and P. Wang, "Stabilizability analysis and switching signals design of switched Boolean networks," Nonlinear Analysis: Hybrid Systems, vol. 30, pp. 31–44, 2018.
[17] M. Li and J. Wang, "Representation of solution of a Riemann-Liouville fractional differential equation with pure delay," Applied Mathematics Letters, vol. 85, pp. 118–124, 2018.
[18] S. Jiao, H. Shen, Y. Wei, X. Huang, and Z. Wang, "Further results on dissipativity and stability analysis of Markov jump generalized neural networks with time-varying interval delays," Applied Mathematics and Computation, vol. 336, pp. 338–350, 2018.
[19] J. Zhang and J. Wang, "Numerical analysis for Navier-Stokes equations with time fractional derivatives," Applied Mathematics and Computation, vol. 336, pp. 481–489, 2018.
[20] X. Cao and J. Wang, "Finite-time stability of a class of oscillating systems with two delays," Mathematical Methods in the Applied Sciences, vol. 41, no. 13, pp. 4943–4954, 2018.
[21] L. Li, W. Qi, X. Chen, Y. Kao, X. Gao, and Y. Wei, "Stability analysis and control synthesis for positive semi-Markov jump systems with time-varying delay," Applied Mathematics and Computation, vol. 332, pp. 363–375, 2018.
[22] J. Wang, B. Zhang, Z. Sun, W. Hao, and Q. Sun, "A novel conjugate gradient method with generalized Armijo search for efficient training of feedforward neural networks," Neurocomputing, vol. 275, pp. 308–316, 2018.
[23] J. Wang, Y. Wen, Z. Ye, L. Jian, and H. Chen, "Convergence analysis of BP neural networks via sparse response regularization," Applied Soft Computing, vol. 61, pp. 354–363, 2017.
[24] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of periodic traveling wave solutions to the generalized Zakharov equations," Acta Mathematica Scientia, vol. 37, no. 4, pp. 998–1018, 2017.
[25] Y. Guo, "Mean square global asymptotic stability of stochastic recurrent neural networks with distributed delays," Applied Mathematics and Computation, vol. 215, no. 2, pp. 791–795, 2009.
[26] Q. Zhu, S. Song, and T. Tang, "Mean square exponential stability of stochastic nonlinear delay systems," International Journal of Control, vol. 90, no. 11, pp. 2384–2393, 2017.
[27] W. Sun and L. Peng, "Observer-based robust adaptive control for uncertain stochastic Hamiltonian systems with state and input delays," Nonlinear Analysis: Modelling and Control, vol. 19, no. 4, pp. 626–645, 2014.
[28] L. G. Wang and B. Liu, "The Hyers-Ulam stability of a functional equation deriving from quadratic and cubic functions in quasi-β-normed spaces," Acta Mathematica Sinica, vol. 26, no. 12, pp. 2335–2348, 2010.
[29] Y. Guo, "Nontrivial periodic solutions of nonlinear functional differential systems with feedback control," Turkish Journal of Mathematics, vol. 34, no. 1, pp. 35–44, 2010.
[30] L. Li, F. Meng, and P. Ju, "Some new integral inequalities and their applications in studying the stability of nonlinear integrodifferential equations with time delay," Journal of Mathematical Analysis and Applications, vol. 377, no. 2, pp. 853–862, 2011.
[31] M. Li and J. Wang, "Exploring delayed Mittag-Leffler type matrix functions to study finite time stability of fractional delay differential equations," Applied Mathematics and Computation, vol. 324, pp. 254–265, 2018.
[32] G. Liu, S. Xu, Y. Wei, Z. Qi, and Z. Zhang, "New insight into reachable set estimation for uncertain singular time-delay systems," Applied Mathematics and Computation, vol. 320, pp. 769–780, 2018.
[33] W. Sun, Y. Wang, and R. Yang, "L2 disturbance attenuation for a class of time delay Hamiltonian systems," Journal of Systems Science & Complexity, vol. 24, no. 4, pp. 672–682, 2011.
[34] Y. Guo, "Globally robust stability analysis for stochastic Cohen-Grossberg neural networks with impulse and time-varying delays," Ukrainian Mathematical Journal, vol. 69, no. 8, pp. 1049–1060, 2017.
[35] W. Qi, Y. Kao, X. Gao, and Y. Wei, "Controller design for time-delay system with stochastic disturbance and actuator saturation via a new criterion," Applied Mathematics and Computation, vol. 320, pp. 535–546, 2018.
[36] Y. Guo, "Global stability analysis for a class of Cohen-Grossberg neural network models," Bulletin of the Korean Mathematical Society, vol. 49, no. 6, pp. 1193–1198, 2012.
[37] D. Li and Q. Zhu, "Comparison principle and stability of stochastic delayed neural networks with Markovian switching," Neurocomputing, vol. 123, pp. 436–442, 2014.
[38] Y. Guo, "Global asymptotic stability analysis for integro-differential systems modeling neural networks with delays," Zeitschrift für Angewandte Mathematik und Physik, vol. 61, no. 6, pp. 971–978, 2010.
[39] Q. Zhu, J. Cao, T. Hayat, and F. Alsaadi, "Robust stability of Markovian jump stochastic neural networks with time delays in the leakage terms," Neural Processing Letters, vol. 41, no. 1, pp. 1–27, 2013.
[40] Y. Guo, "Mean square exponential stability of stochastic delay cellular neural networks," Electronic Journal of Qualitative Theory of Differential Equations, vol. 34, pp. 1–10, 2013.
[41] Q. Zhu and J. Cao, "Exponential stability of stochastic neural networks with both Markovian jump parameters and mixed time delays," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 2, pp. 341–353, 2011.
[42] Q. Zhu and J. Cao, "Stability analysis of Markovian jump stochastic BAM neural networks with impulse control and mixed time delays," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 3, pp. 467–479, 2012.
[43] Y.
[34] Y. Guo, “Globally robust stability analysis for stochastic Cohen-Grossberg neural networks with impulse and time-varying delays,” Ukrainian Mathematical Journal, vol. 69, no. 8, pp. 1049–1060, 2017.
[35] W. Qi, Y. Kao, X. Gao, and Y. Wei, “Controller design for time-delay system with stochastic disturbance and actuator saturation via a new criterion,” Applied Mathematics and Computation, vol. 320, pp. 535–546, 2018.
[36] Y. Guo, “Global stability analysis for a class of Cohen-Grossberg neural network models,” Bulletin of the Korean Mathematical Society, vol. 49, no. 6, pp. 1193–1198, 2012.
[37] D. Li and Q. Zhu, “Comparison principle and stability of stochastic delayed neural networks with Markovian switching,” Neurocomputing, vol. 123, pp. 436–442, 2014.
[38] Y. Guo, “Global asymptotic stability analysis for integro-differential systems modeling neural networks with delays,” Zeitschrift für Angewandte Mathematik und Physik, vol. 61, no. 6, pp. 971–978, 2010.
[39] Q. Zhu, J. Cao, T. Hayat, and F. Alsaadi, “Robust stability of Markovian jump stochastic neural networks with time delays in the leakage terms,” Neural Processing Letters, vol. 41, no. 1, pp. 1–27, 2013.
[40] Y. Guo, “Mean square exponential stability of stochastic delay cellular neural networks,” Electronic Journal of Qualitative Theory of Differential Equations, vol. 34, pp. 1–10, 2013.
[41] Q. Zhu and J. Cao, “Exponential stability of stochastic neural networks with both Markovian jump parameters and mixed time delays,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 2, pp. 341–353, 2011.
[42] Q. Zhu and J. Cao, “Stability analysis of Markovian jump stochastic BAM neural networks with impulse control and mixed time delays,” IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 3, pp. 467–479, 2012.
[43] Y. Guo, “Exponential stability analysis of travelling waves solutions for nonlinear delayed cellular neural networks,” Dynamical Systems: An International Journal, vol. 32, no. 4, pp. 490–503, 2017.
[44] Q. Zhu and X. Li, “Exponential and almost sure exponential stability of stochastic fuzzy delayed Cohen–Grossberg neural networks,” Fuzzy Sets and Systems, vol. 203, pp. 74–94, 2012.
[45] P. Balasubramaniam and M. S. Ali, “Robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays,” Fuzzy Sets and Systems, vol. 161, no. 4, pp. 608–618, 2010.
[46] L. Zhou and X. Liu, “Mean-square exponential input-to-state stability of stochastic recurrent neural networks with multi-proportional delays,” Neurocomputing, vol. 219, pp. 396–403, 2017.
