Hindawi Mathematical Problems in Engineering, Volume 2018, Article ID 6289019, 11 pages. https://doi.org/10.1155/2018/6289019

Research Article

Mean-Square Exponential Input-to-State Stability of Stochastic Fuzzy Recurrent Neural Networks with Multiproportional Delays and Distributed Delays

Tianyu Wang,1,2 Quanxin Zhu,1 and Jingwei Cai3

1 Key Laboratory of HPC-SIP (MOE), College of Mathematics and Statistics, Hunan Normal University, Changsha 410081, Hunan, China
2 School of Mathematical Sciences and Institute of Finance and Statistics, Nanjing Normal University, Nanjing 210023, Jiangsu, China
3 Basis Department, Jiangsu Polytechnic College of Agriculture and Forestry, Zhenjiang 212400, Jiangsu, China

Correspondence should be addressed to Quanxin Zhu; [email protected]

Received 15 May 2018; Revised 21 July 2018; Accepted 11 September 2018; Published 4 October 2018

Academic Editor: Sabri Arik

Copyright © 2018 Tianyu Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We are interested in a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. By constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula, we derive novel sufficient conditions for mean-square exponential input-to-state stability of the suggested system. Some remarks and discussions are given to show that our results extend and improve some previous results in the literature. Finally, two examples and their simulations are provided to illustrate the effectiveness of the theoretical results.
1. Introduction

Since the theory and application of cellular neural networks were proposed by L. O. Chua and L. Yang in 1988, neural networks have become a hot topic. They can be applied to the analysis of static images, signal processing, optimization, pattern recognition, and image processing. Usually, a neural network is an information processing system whose characteristic features are local connections between cells and piecewise linear output functions. Consequently, it can realize large-scale nonlinear analog signal processing in real time and in parallel, which improves the running speed. As is well known, stability is an important theoretical problem in the field of dynamical systems (e.g., see [1–24]). Thus, it is interesting to investigate the stability of nonlinear neural networks.

On one hand, the switching speed of amplifiers is limited and errors occur in electronic components. As a consequence, delays arise in dynamical systems, and they often destroy stability and may even cause heavy oscillation (e.g., see [25–35]). So it is significant to study the stability of delayed neural networks. For example, [36] discussed the global stability analysis for a class of Cohen-Grossberg neural network models. A new comparison principle was first introduced to study the stability of stochastic delayed neural networks in [37]. Global asymptotic stability analysis for integrodifferential systems modeling neural networks with delays was investigated in [38]. In [39], Zhu et al. considered the robust stability of Markovian jump stochastic neural networks with delays in the leakage terms. For more related results we refer the reader to [25, 40–43] and the references therein. It is worth pointing out that all of the works mentioned above focused on traditional types of delays such as constant delays, time-varying bounded delays, and bounded distributed delays. However, delays in real systems may be unbounded. In this case, a class of so-called proportional delays can be used to describe models of the human brain, where delays carry historical information and the entire history affects the present. Thus, it is interesting to study the stability of neural networks with proportional delays.

On the other hand, all of the works mentioned above focused on traditional neural network models,
which did not consider fuzzy logic. In practical operations, however, we always encounter inconveniences such as complexity and uncertainty or vagueness. In fact, vagueness is the opposite of exactness, and it cannot be avoided in the human way of regarding the world. Fuzzy theory is therefore regarded as the most suitable setting for taking vagueness into account. Many results on the stability analysis of fuzzy neural networks have appeared in the literature. For example, Zhu and Li introduced a new way to study the stability of stochastic fuzzy delayed Cohen-Grossberg neural networks [44]. They used a Lyapunov functional, stochastic analysis techniques, and the nonnegative semimartingale convergence theorem to solve the problem. In [45], Balasubramaniam and Ali studied the robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays. However, to the best of our knowledge, there have so far been no works on the stability of fuzzy neural networks with proportional delays.

Motivated by the above discussion, in this paper we investigate the input-to-state stability of a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. Some novel sufficient conditions are derived to ensure the mean-square exponential input-to-state stability of the suggested system by constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula. Several remarks and discussions are presented to show that our results extend and improve some previous results in the literature. Finally, two examples and their simulations are given to show the effectiveness of the obtained results.

The rest of the paper is organized as follows. In Section 2, we introduce the model, some necessary assumptions, and preliminaries. In Section 3, we investigate the mean-square exponential input-to-state stability of the considered model. In Section 4, we provide two examples to illustrate the effectiveness of the obtained results. Finally, we conclude the paper in Section 5.
2. Model Formulation and Preliminaries

Let C([ρ, 1]; R^n) denote the family of continuous functions φ from [ρ, 1] to R^n with the uniform norm ‖φ‖ = sup_{ρ≤θ≤1} |φ(θ)|. Denote by L²_{F_0}([ρ, 1]; R^n) the family of all F_0-measurable, C([ρ, 1]; R^n)-valued random variables φ = {φ(s) : ρ ≤ s ≤ 1} satisfying ∫_ρ^1 E|φ(s)|² ds < ∞, and by L²_{F_0}([−τ, 0]; R^n) the family of C([−τ, 0]; R^n)-valued random variables ξ = {ξ(s) : −τ ≤ s ≤ 0} satisfying ∫_{−τ}^{0} E|ξ(s)|² ds < ∞, where E stands for the expectation operator with respect to the given probability measure P. L^∞ denotes the class of essentially bounded functions u from [1, ∞) to R with ‖u‖_∞ = ess sup_{t≥1} |u(t)| < ∞, R denotes the set of real numbers, and R^n denotes the n-dimensional Euclidean space.

In this section, we consider the following class of stochastic fuzzy delayed recurrent neural networks with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t) ={}& \Big[ -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(x_j(q_j t)) + \sum_{j=1}^{n} d_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds \\
& + \bigwedge_{j=1}^{n} T_{ij} f_j(x_j(t)) + \bigwedge_{j=1}^{n} H_{ij} g_j(x_j(q_j t)) + \bigvee_{j=1}^{n} S_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds + u_i(t) \Big] dt \\
& + \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t), x_j(q_j t)\big)\, dw_j(t), \qquad t \ge 1, \qquad\qquad (1)
\end{aligned}
$$

$$
x_i(t) = \varphi_i(t), \quad \rho \le t \le 1, \qquad\qquad (2)
$$

for i = 1, 2, …, n, where x_i(t) represents the state variable of the i-th neuron at time t and c_i > 0 is the self-feedback connection weight strength. The constants a_{ij}, b_{ij}, d_{ij}, T_{ij}, H_{ij}, and S_{ij} are the connection weights of the j-th neuron on the i-th neuron at time t or q_j t. f_j(·), g_j(·), and h_j(·) are the activation functions of the j-th neuron. u_i(t) is the control input of the i-th neuron at time t, and u = (u_1(t), u_2(t), …, u_n(t)) with each u_i ∈ L^∞. ⋀ and ⋁ denote the fuzzy AND and fuzzy OR operations, respectively. The noise perturbation σ_{ij} : R × R → R is a Borel measurable function, and w_j(t), t ≥ 0, j = 1, 2, …, n, are scalar standard Brownian motions defined on a complete probability space (Ω, F, P) with a natural filtration {F_t}_{t≥0}. The constants q_j, j = 1, 2, …, n, are proportional delay factors satisfying 0 < q_j < 1 and q_j t = t − (1 − q_j)t, where (1 − q_j)t is a time-varying continuous delay function with (1 − q_j)t → +∞ as t → +∞. τ(t) is the time-varying delay, which satisfies 0 ≤ τ(t) ≤ τ̄ and τ̇(t) ≤ δ < 1.
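Because system (1) combines proportional delays, a distributed delay, and fuzzy terms, a direct time discretization may help fix ideas about how such a model evolves. The following Euler-Maruyama sketch is purely illustrative and is not the simulation code used for the examples later in the paper: all numerical values, the constant initial function on [ρ, 1], the choice τ(t) ≡ τ̄, the diagonal noise, and the realization of the fuzzy AND/OR as componentwise min/max are assumptions made only for this sketch.

```python
import numpy as np

# Minimal Euler-Maruyama sketch for a system of the form (1); all parameters below
# are made-up illustration values, not those of Examples 1 or 2.
rng = np.random.default_rng(0)
n, dt, T_end = 2, 2e-3, 10.0
c = np.array([6.0, 5.0])
a = 0.1 * rng.standard_normal((n, n)); b = 0.1 * rng.standard_normal((n, n))
d = 0.1 * rng.standard_normal((n, n))
Tand = 0.1 * np.abs(rng.standard_normal((n, n)))   # fuzzy AND weights T_ij (assumed values)
Hand = 0.1 * np.abs(rng.standard_normal((n, n)))   # fuzzy AND weights H_ij
Sor = 0.1 * np.abs(rng.standard_normal((n, n)))    # fuzzy OR weights S_ij
q = np.array([0.5, 0.5])                           # proportional delay factors q_j
tau_bar = 0.5                                      # here tau(t) is taken constant = tau_bar
f = g = h = np.tanh
u = lambda t: 0.05 * np.sin(t)                     # bounded input

ts = np.arange(1.0, T_end + dt, dt)                # system runs on t >= 1
xs = np.zeros((len(ts), n))
xs[0] = [0.3, -0.2]                                # phi taken constant on [rho, 1]

def state_at(k, t_query):
    """Past state at t_query <= ts[k], by linear interpolation; constant history before t = 1."""
    return np.array([np.interp(t_query, ts[:k + 1], xs[:k + 1, j], left=xs[0, j])
                     for j in range(n)])

for k, t in enumerate(ts[:-1]):
    x = xs[k]
    xq = np.array([state_at(k, q[j] * t)[j] for j in range(n)])      # x_j(q_j t)
    idx = ts[:k + 1] >= max(1.0, t - tau_bar)
    dist = (h(xs[:k + 1][idx]) * dt).sum(axis=0)                     # ~ integral of h(x_j(s)) ds
    drift = (-c * x + a @ f(x) + b @ g(xq) + d @ dist
             + np.min(Tand * f(x), axis=1)                           # fuzzy AND: min over j
             + np.min(Hand * g(xq), axis=1)
             + np.max(Sor * dist, axis=1)                            # fuzzy OR: max over j
             + u(t))
    diff = 0.1 * x + 0.1 * xq                                        # diagonal sigma_ii only (simplification)
    xs[k + 1] = x + drift * dt + diff * np.sqrt(dt) * rng.standard_normal(n)

print("final state:", xs[-1])
```

For a bounded input such as the one above, the sample paths of this sketch remain bounded in mean square, which is the qualitative behavior that the stability result below makes precise.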
Throughout this paper, we assume that the following conditions are satisfied.

Assumption 1. There exist positive constants L_j, M_j, and N_j such that

$$
|f_j(u) - f_j(v)| \le L_j |u - v|, \qquad |g_j(u) - g_j(v)| \le M_j |u - v|, \qquad |h_j(u) - h_j(v)| \le N_j |u - v| \qquad\qquad (3)
$$

for all u, v ∈ R and j = 1, 2, …, n.

Assumption 2. There exist nonnegative constants μ_{ij} and ν_{ij} such that

$$
\big[\sigma_{ij}(x, y) - \sigma_{ij}(x', y')\big]^2 \le \mu_{ij} (x - x')^2 + \nu_{ij} (y - y')^2 \qquad\qquad (4)
$$

for all x, x', y, y' ∈ R and i, j = 1, 2, …, n.
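As an illustration of how the constants in Assumption 2 can be computed in practice, consider a hypothetical linear noise intensity (not one of the examples treated below):

$$
\sigma_{ij}(x, y) = 0.2x + 0.1y \quad\Longrightarrow\quad
\big[\sigma_{ij}(x, y) - \sigma_{ij}(x', y')\big]^2 \le 2(0.2)^2 (x - x')^2 + 2(0.1)^2 (y - y')^2,
$$

using (a + b)² ≤ 2a² + 2b², so one may take μ_{ij} = 0.08 and ν_{ij} = 0.02 in this illustrative case.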
Assumption 3.

$$
f_j(0) = g_j(0) = h_j(0) = u_j(0) = 0, \qquad \sigma_{ij}(0, 0) = 0, \qquad i, j = 1, 2, \ldots, n. \qquad\qquad (5)
$$

Obviously, under Assumptions 1–3 there exists a unique solution of system (1)-(2). Let x(t; φ) denote the solution with initial data x(s) = φ(s), s ∈ [ρ, 1], in L²_{F_0}([ρ, 1]; R^n). Clearly, system (1)-(2) has a trivial solution (zero solution) x(t; 0) ≡ 0 corresponding to the initial data φ ≡ 0. By applying the variable transformations y_i(t) = x_i(e^t), v_i(t) = u_i(e^t), w̃_i(t) = w_i(e^t), ψ_i(t) = φ_i(e^t), system (1)-(2) is equivalently transformed into the following stochastic recurrent neural network with constant delays and time-varying coefficients:

$$
\begin{aligned}
dy_i(t) ={}& e^{t}\Big[ -c_i y_i(t) + \sum_{j=1}^{n} a_{ij} f_j(y_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(y_j(t-\tau_j)) + \sum_{j=1}^{n} d_{ij} \int_{t-\tau(t)}^{t} h_j(y_j(s))\,ds \\
& + \bigwedge_{j=1}^{n} T_{ij} f_j(y_j(t)) + \bigwedge_{j=1}^{n} H_{ij} g_j(y_j(t-\tau_j)) + \bigvee_{j=1}^{n} S_{ij} \int_{t-\tau(t)}^{t} h_j(y_j(s))\,ds + v_i(t) \Big] dt \\
& + \sum_{j=1}^{n} \sigma_{ij}\big(y_j(t), y_j(t-\tau_j)\big)\, d\tilde{w}_j(t), \qquad t \ge 0, \qquad\qquad (6)
\end{aligned}
$$

$$
y_i(t) = \psi_i(t), \quad -\tau \le t \le 0, \qquad\qquad (7)
$$

where

$$
\tau_i = -\log q_i, \ \ i = 1, 2, \ldots, n, \qquad \tau = \max\Big\{\bar{\tau},\ \max_{1\le i\le n} \tau_i\Big\}, \qquad \psi_i(t) \in C([-\tau, 0]; \mathbb{R}), \qquad \psi(t) = (\psi_1(t), \psi_2(t), \ldots, \psi_n(t)) \in C([-\tau, 0]; \mathbb{R}^n). \qquad\qquad (8)
$$
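A one-line check of how the proportional delays become constant under this change of variables may be helpful. With τ_j = −log q_j as in (8),

$$
x_j(q_j e^{t}) = x_j\big(e^{t + \log q_j}\big) = x_j\big(e^{t - \tau_j}\big) = y_j(t - \tau_j),
$$

so the proportional delay q_j t in (1) corresponds exactly to the constant delay τ_j in (6), while the factor e^t in the drift of (6) accounts for the time change.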
3. Main Results

In this section, we discuss the mean-square exponential input-to-state stability of the trivial solution of system (1)-(2) under Assumptions 1–3.

Theorem 4. Let Assumptions 1–3 hold. The trivial solution of system (1)-(2) is mean-square exponentially input-to-state stable if there exist positive scalars p_i, α_i, β_i (i = 1, 2, …, n) and λ > 1 such that, for every i = 1, 2, …, n,

$$
\begin{aligned}
2 p_i c_i \ge{}& (1+\lambda) p_i + \alpha_i + \bar{\tau}\beta_i + \sum_{j=1}^{n} p_j \mu_{ji}
 + \sum_{j=1}^{n} p_j |a_{ji}| L_i + p_i \sum_{j=1}^{n} |a_{ij}| L_j
 + p_i \sum_{j=1}^{n} |b_{ij}| M_j + p_i \sum_{j=1}^{n} |d_{ij}| N_j \\
& + \sum_{j=1}^{n} p_j |T_{ji}| L_i + p_i \sum_{j=1}^{n} |T_{ij}| L_j
 + p_i \sum_{j=1}^{n} |H_{ij}| M_j + p_i \sum_{j=1}^{n} |S_{ij}| N_j, \qquad\qquad (9)
\end{aligned}
$$

$$
\alpha_i \ge e^{\lambda\tau_i} \Big( \sum_{j=1}^{n} p_j \nu_{ji} + \sum_{j=1}^{n} p_j |b_{ji}| M_i + \sum_{j=1}^{n} p_j |H_{ji}| M_i \Big), \qquad\qquad (10)
$$

$$
\beta_i \ge \frac{n}{1 - \delta} \Big( \sum_{j=1}^{n} p_j |d_{ji}| N_i + \sum_{j=1}^{n} p_j |S_{ji}| N_i \Big). \qquad\qquad (11)
$$

Proof. Since system (6)-(7) is equivalent to system (1)-(2), it suffices to prove that the trivial solution of system (6)-(7) is mean-square exponentially input-to-state stable. For simplicity, write σ(t) = (σ_{ij}(t))_{n×n} with σ_{ij}(t) = σ_{ij}(y_j(t), y_j(t − τ_j)), and consider the Lyapunov-Krasovskii functional

$$
V(t, y(t)) = e^{(\lambda-1)t} \sum_{i=1}^{n} p_i y_i^2(t)
 + \sum_{i=1}^{n} \int_{t-\tau_i}^{t} e^{\lambda s} \alpha_i y_i^2(s)\, ds
 + \int_{-\tau(t)}^{0} \int_{t+\theta}^{t} e^{\lambda s} \sum_{i=1}^{n} \beta_i y_i^2(s)\, ds\, d\theta. \qquad\qquad (12)
$$

Then, by Itô's formula, we have the stochastic differential

$$
dV(t, y(t)) = LV(t, y(t))\, dt + V_y(t, y(t))\, \sigma(t)\, d\tilde{w}(t), \qquad\qquad (13)
$$

where V_y(t, y(t)) = (∂V(t, y(t))/∂y_1, …, ∂V(t, y(t))/∂y_n) and L is the weak infinitesimal operator associated with system (6).
Computing LV(t, y(t)) along (6) and using e^{(λ−1)s} ≤ e^{λs} for s ≥ 0, we obtain

$$
\begin{aligned}
LV(t, y(t)) \le{}& \lambda e^{\lambda t} \sum_{i=1}^{n} p_i y_i^2(t)
 + 2 e^{\lambda t} \sum_{i=1}^{n} p_i y_i(t) \Big[ -c_i y_i(t) + \sum_{j=1}^{n} a_{ij} f_j(y_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(y_j(t-\tau_j)) \\
& + \sum_{j=1}^{n} d_{ij} \int_{t-\tau(t)}^{t} h_j(y_j(s))\,ds
 + \bigwedge_{j=1}^{n} T_{ij} f_j(y_j(t)) + \bigwedge_{j=1}^{n} H_{ij} g_j(y_j(t-\tau_j))
 + \bigvee_{j=1}^{n} S_{ij} \int_{t-\tau(t)}^{t} h_j(y_j(s))\,ds + v_i(t) \Big] \\
& + e^{\lambda t} \sum_{i=1}^{n} p_i \sum_{j=1}^{n} \sigma_{ij}^2\big(y_j(t), y_j(t-\tau_j)\big)
 + e^{\lambda t} \sum_{i=1}^{n} \alpha_i y_i^2(t) - \sum_{i=1}^{n} e^{\lambda(t-\tau_i)} \alpha_i y_i^2(t-\tau_i) \\
& + \bar{\tau} e^{\lambda t} \sum_{i=1}^{n} \beta_i y_i^2(t)
 - (1-\delta) \int_{t-\tau(t)}^{t} e^{\lambda s} \sum_{i=1}^{n} \beta_i y_i^2(s)\,ds. \qquad\qquad (14)
\end{aligned}
$$

By Assumptions 1 and 3 (so that |f_j(y_j)| ≤ L_j|y_j|, |g_j(y_j)| ≤ M_j|y_j|, |h_j(y_j)| ≤ N_j|y_j|), Assumption 2, the elementary inequality 2|a||b| ≤ a² + b², the Cauchy-Schwarz inequality applied to the distributed-delay integrals, and the fuzzy-operator estimate

$$
\Big| \bigwedge_{j=1}^{n} T_{ij}\, f_j(u_j) - \bigwedge_{j=1}^{n} T_{ij}\, f_j(v_j) \Big| \le \sum_{j=1}^{n} |T_{ij}|\, |f_j(u_j) - f_j(v_j)|
$$

(together with its analogues for H_{ij}, S_{ij} and the operator ⋁), every cross term on the right-hand side of (14) is dominated by quadratic terms in |y_i(t)|, |y_i(t − τ_i)|, ∫_{t−τ(t)}^{t} y_j²(s) ds, and |v_i(t)|; in particular, 2e^{λt} Σ_{i} p_i |y_i(t)||v_i(t)| ≤ e^{λt} Σ_{i} p_i y_i²(t) + e^{λt} max_{1≤i≤n} p_i ‖v‖_∞². Collecting the coefficients of y_i²(t) and y_i²(t − τ_i), we arrive at

$$
\begin{aligned}
LV(t, y(t)) \le{}& -e^{\lambda t} \sum_{i=1}^{n} \Big[ 2 p_i c_i - (1+\lambda) p_i - \alpha_i - \bar{\tau}\beta_i - \sum_{j=1}^{n} p_j \mu_{ji}
 - \sum_{j=1}^{n} p_j |a_{ji}| L_i - p_i \sum_{j=1}^{n} |a_{ij}| L_j - p_i \sum_{j=1}^{n} |b_{ij}| M_j \\
& \qquad - p_i \sum_{j=1}^{n} |d_{ij}| N_j - \sum_{j=1}^{n} p_j |T_{ji}| L_i - p_i \sum_{j=1}^{n} |T_{ij}| L_j
 - p_i \sum_{j=1}^{n} |H_{ij}| M_j - p_i \sum_{j=1}^{n} |S_{ij}| N_j \Big] y_i^2(t) \\
& - e^{\lambda t} \sum_{i=1}^{n} \Big[ e^{-\lambda\tau_i}\alpha_i - \sum_{j=1}^{n} p_j \nu_{ji} - \sum_{j=1}^{n} p_j |b_{ji}| M_i - \sum_{j=1}^{n} p_j |H_{ji}| M_i \Big] y_i^2(t-\tau_i) \\
& + (\text{distributed-delay terms}) + e^{\lambda t} \max_{1\le i\le n} p_i \|v\|_\infty^2,
\end{aligned}
$$

where the first bracket is nonnegative by (9) and the second bracket is nonnegative by (10), while condition (11), together with τ̇(t) ≤ δ < 1 and the Cauchy-Schwarz inequality applied to (∫_{t−τ(t)}^{t} |y_j(s)| ds)², shows that the total coefficient of the remaining distributed-delay terms is nonpositive as well. Consequently, under (9)–(11),

$$
LV(t, y(t)) \le e^{\lambda t} \max_{1\le i\le n} p_i \|v\|_\infty^2. \qquad\qquad (15)
$$
Now we define a Markov time (stopping time) as follows:

$$
\zeta_k := \inf\{ s \ge 0 : |y(s)| \ge k \}. \qquad\qquad (16)
$$

By Dynkin's formula, we have

$$
E V\big(t \wedge \zeta_k, y(t \wedge \zeta_k)\big) - E V(0, y(0)) = E\Big[ \int_{0}^{t \wedge \zeta_k} LV(s, y(s))\, ds \Big], \qquad\qquad (17)
$$

which implies

$$
E V\big(t \wedge \zeta_k, y(t \wedge \zeta_k)\big) = E V(0, y(0)) + E\Big[ \int_{0}^{t \wedge \zeta_k} LV(s, y(s))\, ds \Big]. \qquad\qquad (18)
$$

Letting k → ∞ on both sides of (18), it follows from the monotone convergence theorem and (15) that

$$
E V(t, y(t)) \le E V(0, y(0)) + \|v\|_\infty^2 \max_{1\le i\le n} p_i \int_{0}^{t} e^{\lambda s}\, ds
 \le E V(0, y(0)) + \frac{1}{\lambda} \|v\|_\infty^2 \max_{1\le i\le n} p_i \big(e^{\lambda t} - 1\big). \qquad\qquad (19)
$$

Moreover, from (12),

$$
E V(0, y(0)) = \sum_{i=1}^{n} p_i E y_i^2(0) + \sum_{i=1}^{n} \int_{-\tau_i}^{0} e^{\lambda s} \alpha_i E y_i^2(s)\, ds + \int_{-\tau(0)}^{0} \int_{\theta}^{0} e^{\lambda s} \sum_{i=1}^{n} \beta_i E y_i^2(s)\, ds\, d\theta
 \le \Big( \max_{1\le i\le n} p_i + \tau \max_{1\le i\le n} \alpha_i + \tau \max_{1\le i\le n} \beta_i \Big) E \|\psi\|^2.
$$

On the other hand, from the definition of V(t, y(t)) we have

$$
E V(t, y(t)) \ge e^{(\lambda-1)t} \sum_{i=1}^{n} p_i E y_i^2(t) \ge e^{(\lambda-1)t} \min_{1\le i\le n} p_i\, E|y(t)|^2. \qquad\qquad (20)
$$

Combining (19) and (20), the following inequality holds:

$$
E|y(t)|^2 \le \frac{\max_{1\le i\le n} p_i + \tau \max_{1\le i\le n} \alpha_i + \tau \max_{1\le i\le n} \beta_i}{\min_{1\le i\le n} p_i}\, e^{-(\lambda-1)t} E\|\psi\|^2
 + \frac{\max_{1\le i\le n} p_i}{\lambda \min_{1\le i\le n} p_i} \|v\|_\infty^2. \qquad\qquad (21)
$$

From (21) we see that the trivial solution of system (6)-(7), and hence of system (1)-(2), is mean-square exponentially input-to-state stable. The proof of Theorem 4 is completed.

Corollary 5. Assume that all the conditions of Theorem 4 hold. Then the trivial solution of system (1)-(2) with u(t) ≡ 0 is mean-square exponentially stable.

Remark 6. If we ignore the effects of delays, then system (1)-(2) becomes a stochastic recurrent neural network without delays. The results obtained in this paper are also applicable to the case of stochastic recurrent neural networks without delays.

Remark 7. Compared with the result in [44], our model is more general. In fact, multiproportional delays and distributed delays are considered in this paper, and they cause considerable difficulty in the proof of our result, whereas only a simple constant delay was discussed in [44].

Remark 8. Compared with the result in [46], our model is also more general, since the distributed delays and the fuzzy terms considered in this paper were ignored in [46].

4. Illustrative Examples

In this section, we use two examples to show the effectiveness of the obtained result.

Example 1 (two-dimensional case). Consider the following two-dimensional stochastic fuzzy recurrent neural network with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t) ={}& \Big[ -c_i x_i(t) + \sum_{j=1}^{2} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{2} b_{ij} g_j(x_j(q_j t)) + \sum_{j=1}^{2} d_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds \\
& + \bigwedge_{j=1}^{2} T_{ij} f_j(x_j(t)) + \bigwedge_{j=1}^{2} H_{ij} g_j(x_j(q_j t)) + \bigvee_{j=1}^{2} S_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds + u_i(t) \Big] dt \\
& + \sum_{j=1}^{2} \sigma_{ij}\big(x_j(t), x_j(q_j t)\big)\, dw_j(t), \qquad\qquad (22)
\end{aligned}
$$

$$
x_i(t) = \varphi_i(t), \quad \rho \le t \le 1, \ i = 1, 2, \qquad\qquad (23)
$$
where

$$
f(x) = g(x) = h(x) = \begin{cases} 0.1x, & \text{if } x \le 0, \\ 0.1\tanh(x), & \text{if } x > 0, \end{cases} \qquad\qquad (24)
$$

u(t) = 0.08 sin(t), and

$$
\Big( \sigma_{ij}\big(x_j(t), x_j(q_j t)\big) \Big)_{2\times 2}
 = \begin{pmatrix} 0.4 x_1(t) & 0.2\big(x_2(t) + x_2(q_2 t)\big) \\ 0.2 x_1(q_1 t) & 0.1\big(x_2(t) + x_2(q_2 t)\big) \end{pmatrix}. \qquad\qquad (25)
$$

The other parameters of system (22)-(23) are given as follows:

$$
(a_{ij})_{2\times 2} = \begin{pmatrix} 0.5 & 0.3 \\ 0.4 & 0.8 \end{pmatrix}, \quad
(b_{ij})_{2\times 2} = \begin{pmatrix} 0.6 & 0.3 \\ 0.8 & 0.2 \end{pmatrix}, \quad
(d_{ij})_{2\times 2} = \begin{pmatrix} 0.4 & 0.3 \\ 0.2 & 0.5 \end{pmatrix}, \qquad\qquad (26)
$$

$$
(T_{ij})_{2\times 2} = \begin{pmatrix} 0.5 & 0.7 \\ 0.3 & 0.8 \end{pmatrix}, \quad
(H_{ij})_{2\times 2} = \begin{pmatrix} 0.4 & 0.6 \\ 0.5 & 0.3 \end{pmatrix}, \quad
(S_{ij})_{2\times 2} = \begin{pmatrix} 0.3 & 0.2 \\ 0.5 & 0.4 \end{pmatrix}, \qquad\qquad (27)
$$

$$
c_1 = 9, \qquad c_2 = 8. \qquad\qquad (28)
$$

Take q_1 = q_2 = 0.5, λ = 1.1, p_1 = p_2 = 0.5, α_1 = 2.6, α_2 = 1.6, β_1 = 1.3, β_2 = 1.7; from the definition of τ we obtain τ = 0.6931. It is easy to check that Assumptions 1–3 are satisfied. Moreover, a simple computation yields

$$
\begin{aligned}
2 p_1 c_1 = 9 >{}& (1+\lambda) p_1 + \alpha_1 + \bar{\tau}\beta_1 + \sum_{j=1}^{2} p_j \mu_{j1} + \sum_{j=1}^{2} p_j |a_{j1}| L_1 + p_1 \sum_{j=1}^{2} |a_{1j}| L_j + p_1 \sum_{j=1}^{2} |b_{1j}| M_j + p_1 \sum_{j=1}^{2} |d_{1j}| N_j \\
& + \sum_{j=1}^{2} p_j |T_{j1}| L_1 + p_1 \sum_{j=1}^{2} |T_{1j}| L_j + p_1 \sum_{j=1}^{2} |H_{1j}| M_j + p_1 \sum_{j=1}^{2} |S_{1j}| N_j = 5.14103, \qquad\qquad (29)
\end{aligned}
$$

$$
\begin{aligned}
2 p_2 c_2 = 8 >{}& (1+\lambda) p_2 + \alpha_2 + \bar{\tau}\beta_2 + \sum_{j=1}^{2} p_j \mu_{j2} + \sum_{j=1}^{2} p_j |a_{j2}| L_2 + p_2 \sum_{j=1}^{2} |a_{2j}| L_j + p_2 \sum_{j=1}^{2} |b_{2j}| M_j + p_2 \sum_{j=1}^{2} |d_{2j}| N_j \\
& + \sum_{j=1}^{2} p_j |T_{j2}| L_2 + p_2 \sum_{j=1}^{2} |T_{2j}| L_j + p_2 \sum_{j=1}^{2} |H_{2j}| M_j + p_2 \sum_{j=1}^{2} |S_{2j}| N_j = 4.39327, \qquad\qquad (30)
\end{aligned}
$$

$$
\alpha_1 = 2.6 \ge e^{\lambda\tau_1}\Big( \sum_{j=1}^{2} p_j \nu_{j1} + \sum_{j=1}^{2} p_j |b_{j1}| M_1 + \sum_{j=1}^{2} p_j |H_{j1}| M_1 \Big) = 0.889511, \qquad\qquad (31)
$$

$$
\alpha_2 = 1.6 \ge e^{\lambda\tau_2}\Big( \sum_{j=1}^{2} p_j \nu_{j2} + \sum_{j=1}^{2} p_j |b_{j2}| M_2 + \sum_{j=1}^{2} p_j |H_{j2}| M_2 \Big) = 0.471548, \qquad\qquad (32)
$$

$$
\beta_1 = 1.3 \ge \frac{2}{1-\delta}\Big( \sum_{j=1}^{2} p_j |d_{j1}| N_1 + \sum_{j=1}^{2} p_j |S_{j1}| N_1 \Big) = 0.1581, \qquad\qquad (33)
$$

$$
\beta_2 = 1.7 \ge \frac{2}{1-\delta}\Big( \sum_{j=1}^{2} p_j |d_{j2}| N_2 + \sum_{j=1}^{2} p_j |S_{j2}| N_2 \Big) = 0.1581. \qquad\qquad (34)
$$

Hence, all the conditions of Theorem 4 are satisfied, and system (22)-(23) is mean-square exponentially input-to-state stable. Obviously, system (22)-(23) is mean-square exponentially stable when u(t) ≡ 0.
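To make the verification of conditions such as (29)–(34) easy to reproduce, the following is a small numerical sketch of the checks in Theorem 4. It assumes the matrix labels (a, b, d, T, H, S) and the constants exactly as listed above (which is our reading of the printed data), takes placeholder values for τ̄, δ and for the constants μ_{ij}, ν_{ij} (derived here from (25) via (u+v)² ≤ 2u² + 2v², since the paper does not list them explicitly), and uses a helper name, check_theorem4, of our own; the totals it prints therefore need not coincide exactly with the values reported above.

```python
import numpy as np

# Connection weights as read from (26)-(27); the assignment of matrices to symbols is our assumption.
a = np.array([[0.5, 0.3], [0.4, 0.8]]); b = np.array([[0.6, 0.3], [0.8, 0.2]])
d = np.array([[0.4, 0.3], [0.2, 0.5]]); T = np.array([[0.5, 0.7], [0.3, 0.8]])
H = np.array([[0.4, 0.6], [0.5, 0.3]]); S = np.array([[0.3, 0.2], [0.5, 0.4]])
c = np.array([9.0, 8.0])
p = np.array([0.5, 0.5]); alpha = np.array([2.6, 1.6]); beta = np.array([1.3, 1.7])
q = np.array([0.5, 0.5]); lam = 1.1
L = M = N = np.array([0.1, 0.1])            # Lipschitz constants of 0.1x / 0.1 tanh(x)
mu = np.array([[0.16, 0.08], [0.0, 0.02]])  # from (25) via (u+v)^2 <= 2u^2 + 2v^2 (assumed)
nu = np.array([[0.0, 0.08], [0.04, 0.02]])
tau_i = -np.log(q)                          # constant delays after the transformation
tau_bar, delta = 0.5, 0.0                   # placeholders: not stated explicitly above
n = len(c)

def check_theorem4():
    """Evaluate the right-hand sides of (9)-(11) for each i and compare with the left-hand sides."""
    ok = True
    for i in range(n):
        rhs9 = ((1 + lam) * p[i] + alpha[i] + tau_bar * beta[i]
                + (p * mu[:, i]).sum()
                + (p * np.abs(a[:, i])).sum() * L[i] + p[i] * (np.abs(a[i]) * L).sum()
                + p[i] * (np.abs(b[i]) * M).sum() + p[i] * (np.abs(d[i]) * N).sum()
                + (p * np.abs(T[:, i])).sum() * L[i] + p[i] * (np.abs(T[i]) * L).sum()
                + p[i] * (np.abs(H[i]) * M).sum() + p[i] * (np.abs(S[i]) * N).sum())
        rhs10 = np.exp(lam * tau_i[i]) * ((p * nu[:, i]).sum()
                + (p * np.abs(b[:, i])).sum() * M[i] + (p * np.abs(H[:, i])).sum() * M[i])
        rhs11 = n / (1 - delta) * ((p * np.abs(d[:, i])).sum() * N[i]
                + (p * np.abs(S[:, i])).sum() * N[i])
        print(f"i={i+1}: 2*p*c={2*p[i]*c[i]:.3f} vs (9) rhs={rhs9:.4f}, "
              f"alpha={alpha[i]} vs (10) rhs={rhs10:.4f}, beta={beta[i]} vs (11) rhs={rhs11:.4f}")
        ok &= (2 * p[i] * c[i] > rhs9) and (alpha[i] >= rhs10) and (beta[i] >= rhs11)
    return bool(ok)

print("all conditions hold:", check_theorem4())
```

The same script, with the 3x3 data of the next example substituted, covers Example 2 as well.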
Example 2 (three-dimensional case). Consider the following three-dimensional stochastic fuzzy recurrent neural network with multiproportional delays and distributed delays:

$$
\begin{aligned}
dx_i(t) ={}& \Big[ -c_i x_i(t) + \sum_{j=1}^{3} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{3} b_{ij} g_j(x_j(q_j t)) + \sum_{j=1}^{3} d_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds \\
& + \bigwedge_{j=1}^{3} T_{ij} f_j(x_j(t)) + \bigwedge_{j=1}^{3} H_{ij} g_j(x_j(q_j t)) + \bigvee_{j=1}^{3} S_{ij} \int_{t-\tau(t)}^{t} h_j(x_j(s))\,ds + u_i(t) \Big] dt \\
& + \sum_{j=1}^{3} \sigma_{ij}\big(x_j(t), x_j(q_j t)\big)\, dw_j(t), \qquad\qquad (35)
\end{aligned}
$$

$$
x_i(t) = \varphi_i(t), \quad t \in [\rho, 1], \ i = 1, 2, 3, \qquad\qquad (36)
$$

where

$$
f(x) = g(x) = h(x) = 0.2\tanh(x), \qquad\qquad (37)
$$

u(t) = 0.1 sin(t), and

$$
\Big( \sigma_{ij}\big(x_j(t), x_j(q_j t)\big) \Big)_{3\times 3}
 = \begin{pmatrix}
 0.2 x_1(t) & 0.2\, x_2(q_2 t) & 0.3\, x_3(t) \\
 0.4\, x_1(q_1 t) & 0.4\big(x_2(t) + x_2(q_2 t)\big) & 0.2\, x_3(q_3 t) \\
 0.2\, x_1(t) & 0.1\, x_2(t) & 0.4\big(x_3(t) + x_3(q_3 t)\big)
 \end{pmatrix}. \qquad\qquad (38)
$$

The other parameters of system (35)-(36) are the self-feedback strengths c_1 = 8, c_2 = 9, c_3 = 10 and six fixed numerical 3×3 connection-weight matrices (a_{ij}), (b_{ij}), (d_{ij}), (T_{ij}), (H_{ij}), and (S_{ij}), given in (39), whose entries range over [−0.8, 0.9].

Take q_1 = q_2 = q_3 = 0.5, λ = 1.3, p_1 = p_2 = p_3 = 0.4, α_1 = 1.4, α_2 = 1.3, α_3 = 1.4, β_1 = 1.3, β_2 = 1.2, β_3 = 1.1; it follows from the definition of τ that τ = 0.6931. It is easy to check that Assumptions 1–3 are satisfied. A direct computation gives the following estimates (40)–(48): the right-hand side of (9) equals 5.60903, 5.46372, and 5.59841 for i = 1, 2, 3, respectively, while 2p_1 c_1 = 6.4, 2p_2 c_2 = 7.2, and 2p_3 c_3 = 8, so (9) holds; the right-hand side of (10) equals 1.29802, 1.08332, and 1.31969 for i = 1, 2, 3, respectively, so (10) holds with α_1 = 1.4, α_2 = 1.3, α_3 = 1.4; and the right-hand side of (11) equals 0.48781, 0.28908, and 0.36134 for i = 1, 2, 3, respectively, so (11) holds with β_1 = 1.3, β_2 = 1.2, β_3 = 1.1.

Hence, all the conditions of Theorem 4 are satisfied, and system (35)-(36) is mean-square exponentially input-to-state stable. Obviously, system (35)-(36) is mean-square exponentially stable when u(t) ≡ 0.

Remark 9. Obviously, the results obtained in [44, 46] do not apply to Examples 1 and 2, since fuzzy logic, multiproportional delays, and distributed delays are all present in Examples 1 and 2.

5. Concluding Remarks

In this paper, we have studied the mean-square exponential input-to-state stability of a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. A key feature of this paper is that the nonlinear transformation y(t) = x(e^t) is employed to transform the considered system into a stochastic recurrent neural network with constant delays and variable coefficients, which overcomes the difficulty caused by the multiproportional delays. Moreover, we also consider the effects of distributed delays and fuzzy operations. In future work, we will apply the method developed in this paper to other important problems such as the stability of multiagent systems.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was jointly supported by the National Natural Science Foundation of China (61773217 and 61374080), the Natural Science Foundation of Jiangsu Province (BK20161552), and the Qing Lan Project of Jiangsu Province.
References

[1] G. Wang, "Existence-stability theorems for strong vector set-valued equilibrium problems in reflexive Banach spaces," Journal of Inequalities and Applications, vol. 239, pp. 1–14, 2015.
[2] Y. Bai and X. Mu, "Global asymptotic stability of a generalized SIRS epidemic model with transfer from infectious to susceptible," Journal of Applied Analysis and Computation, vol. 8, no. 2, pp. 402–412, 2018.
[3] L. Gao, D. Wang, and G. Wang, "Further results on exponential stability for impulsive switched nonlinear time-delay systems with delayed impulse effects," Applied Mathematics and Computation, vol. 268, pp. 186–200, 2015.
[4] Y. Li, Y. Sun, and F. Meng, "New criteria for exponential stability of switched time-varying systems with delays and nonlinear disturbances," Nonlinear Analysis: Hybrid Systems, vol. 26, pp. 284–291, 2017.
[5] L. G. Wang, K. P. Xu, and Q. W. Liu, "On the stability of a mixed functional equation deriving from additive, quadratic and cubic mappings," Acta Mathematica Sinica, vol. 30, no. 6, pp. 1033–1049, 2014.
[6] L. G. Wang and B. Liu, "Fuzzy stability of a functional equation deriving from additive, quadratic, cubic and quartic functions," Acta Mathematica Sinica, vol. 55, no. 5, pp. 841–854, 2012.
[7] F. Li and Y. Bao, "Uniform stability of the solution for a memory-type elasticity system with nonhomogeneous boundary control condition," Journal of Dynamical and Control Systems, vol. 23, no. 2, pp. 301–315, 2017.
[8] Y.-H. Feng and C.-M. Liu, "Stability of steady-state solutions to Navier-Stokes-Poisson systems," Journal of Mathematical Analysis and Applications, vol. 462, no. 2, pp. 1679–1694, 2018.
[9] J. Hu and A. Xu, "On stability of F-Gorenstein flat categories," Algebra Colloquium, vol. 23, no. 2, pp. 251–262, 2016.
[10] Q. X. Zhu and Q. Y. Zhang, "Pth moment exponential stabilisation of hybrid stochastic differential equations by feedback controls based on discrete-time state observations with a time delay," IET Control Theory & Applications, vol. 11, no. 12, pp. 1992–2003, 2017.
[11] W. W. Sun, "Stabilization analysis of time-delay Hamiltonian systems in the presence of saturation," Applied Mathematics and Computation, vol. 217, no. 23, pp. 9625–9634, 2011.
[12] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of solitary waves of the coupled Klein-Gordon-Zakharov equations," Mathematical Methods in the Applied Sciences, vol. 40, no. 7, pp. 2623–2633, 2017.
[13] C. Liu and Y.-J. Peng, "Stability of periodic steady-state solutions to a non-isentropic Euler-Maxwell system," Zeitschrift für Angewandte Mathematik und Physik, vol. 68, no. 5, Art. 105, 17 pages, 2017.
[14] Q. Zhu and H. Wang, "Output feedback stabilization of stochastic feedforward systems with unknown control coefficients and unknown output function," Automatica, vol. 87, pp. 166–175, 2018.
[15] Q. Zhu, "Razumikhin-type theorem for stochastic functional differential equations with Lévy noise and Markov switching," International Journal of Control, vol. 90, no. 8, pp. 1703–1712, 2017.
[16] Y. Yu, M. Meng, J.-e. Feng, and P. Wang, "Stabilizability analysis and switching signals design of switched Boolean networks," Nonlinear Analysis: Hybrid Systems, vol. 30, pp. 31–44, 2018.
[17] M. Li and J. Wang, "Representation of solution of a Riemann-Liouville fractional differential equation with pure delay," Applied Mathematics Letters, vol. 85, pp. 118–124, 2018.
[18] S. Jiao, H. Shen, Y. Wei, X. Huang, and Z. Wang, "Further results on dissipativity and stability analysis of Markov jump generalized neural networks with time-varying interval delays," Applied Mathematics and Computation, vol. 336, pp. 338–350, 2018.
[19] J. Zhang and J. Wang, "Numerical analysis for Navier-Stokes equations with time fractional derivatives," Applied Mathematics and Computation, vol. 336, pp. 481–489, 2018.
[20] X. Cao and J. Wang, "Finite-time stability of a class of oscillating systems with two delays," Mathematical Methods in the Applied Sciences, vol. 41, no. 13, pp. 4943–4954, 2018.
[21] L. Li, W. Qi, X. Chen, Y. Kao, X. Gao, and Y. Wei, "Stability analysis and control synthesis for positive semi-Markov jump systems with time-varying delay," Applied Mathematics and Computation, vol. 332, pp. 363–375, 2018.
[22] J. Wang, B. Zhang, Z. Sun, W. Hao, and Q. Sun, "A novel conjugate gradient method with generalized Armijo search for efficient training of feedforward neural networks," Neurocomputing, vol. 275, pp. 308–316, 2018.
[23] J. Wang, Y. Wen, Z. Ye, L. Jian, and H. Chen, "Convergence analysis of BP neural networks via sparse response regularization," Applied Soft Computing, vol. 61, pp. 354–363, 2017.
[24] X. Zheng, Y. Shang, and X. Peng, "Orbital stability of periodic traveling wave solutions to the generalized Zakharov equations," Acta Mathematica Scientia, vol. 37, no. 4, pp. 998–1018, 2017.
[25] Y. Guo, "Mean square global asymptotic stability of stochastic recurrent neural networks with distributed delays," Applied Mathematics and Computation, vol. 215, no. 2, pp. 791–795, 2009.
[26] Q. Zhu, S. Song, and T. Tang, "Mean square exponential stability of stochastic nonlinear delay systems," International Journal of Control, vol. 90, no. 11, pp. 2384–2393, 2017.
[27] W. Sun and L. Peng, "Observer-based robust adaptive control for uncertain stochastic Hamiltonian systems with state and input delays," Nonlinear Analysis: Modelling and Control, vol. 19, no. 4, pp. 626–645, 2014.
[28] L. G. Wang and B. Liu, "The Hyers-Ulam stability of a functional equation deriving from quadratic and cubic functions in quasi-β-normed spaces," Acta Mathematica Sinica, vol. 26, no. 12, pp. 2335–2348, 2010.
[29] Y. Guo, "Nontrivial periodic solutions of nonlinear functional differential systems with feedback control," Turkish Journal of Mathematics, vol. 34, no. 1, pp. 35–44, 2010.
[30] L. Li, F. Meng, and P. Ju, "Some new integral inequalities and their applications in studying the stability of nonlinear integrodifferential equations with time delay," Journal of Mathematical Analysis and Applications, vol. 377, no. 2, pp. 853–862, 2011.
[31] M. Li and J. Wang, "Exploring delayed Mittag-Leffler type matrix functions to study finite time stability of fractional delay differential equations," Applied Mathematics and Computation, vol. 324, pp. 254–265, 2018.
[32] G. Liu, S. Xu, Y. Wei, Z. Qi, and Z. Zhang, "New insight into reachable set estimation for uncertain singular time-delay systems," Applied Mathematics and Computation, vol. 320, pp. 769–780, 2018.
[33] W. Sun, Y. Wang, and R. Yang, "L2 disturbance attenuation for a class of time delay Hamiltonian systems," Journal of Systems Science & Complexity, vol. 24, no. 4, pp. 672–682, 2011.
[34] Y. Guo, "Globally robust stability analysis for stochastic Cohen-Grossberg neural networks with impulse and time-varying delays," Ukrainian Mathematical Journal, vol. 69, no. 8, pp. 1049–1060, 2017.
[35] W. Qi, Y. Kao, X. Gao, and Y. Wei, "Controller design for time-delay system with stochastic disturbance and actuator saturation via a new criterion," Applied Mathematics and Computation, vol. 320, pp. 535–546, 2018.
[36] Y. Guo, "Global stability analysis for a class of Cohen-Grossberg neural network models," Bulletin of the Korean Mathematical Society, vol. 49, no. 6, pp. 1193–1198, 2012.
[37] D. Li and Q. Zhu, "Comparison principle and stability of stochastic delayed neural networks with Markovian switching," Neurocomputing, vol. 123, pp. 436–442, 2014.
[38] Y. Guo, "Global asymptotic stability analysis for integro-differential systems modeling neural networks with delays," Zeitschrift für Angewandte Mathematik und Physik, vol. 61, no. 6, pp. 971–978, 2010.
[39] Q. Zhu, J. Cao, T. Hayat, and F. Alsaadi, "Robust stability of Markovian jump stochastic neural networks with time delays in the leakage terms," Neural Processing Letters, vol. 41, no. 1, pp. 1–27, 2013.
[40] Y. Guo, "Mean square exponential stability of stochastic delay cellular neural networks," Electronic Journal of Qualitative Theory of Differential Equations, vol. 34, pp. 1–10, 2013.
[41] Q. Zhu and J. Cao, "Exponential stability of stochastic neural networks with both Markovian jump parameters and mixed time delays," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 41, no. 2, pp. 341–353, 2011.
[42] Q. Zhu and J. Cao, "Stability analysis of Markovian jump stochastic BAM neural networks with impulse control and mixed time delays," IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 3, pp. 467–479, 2012.
[43] Y. Guo, "Exponential stability analysis of travelling waves solutions for nonlinear delayed cellular neural networks," Dynamical Systems: An International Journal, vol. 32, no. 4, pp. 490–503, 2017.
[44] Q. Zhu and X. Li, "Exponential and almost sure exponential stability of stochastic fuzzy delayed Cohen-Grossberg neural networks," Fuzzy Sets and Systems, vol. 203, pp. 74–94, 2012.
[45] P. Balasubramaniam and M. S. Ali, "Robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays," Fuzzy Sets and Systems, vol. 161, no. 4, pp. 608–618, 2010.
[46] L. Zhou and X. Liu, "Mean-square exponential input-to-state stability of stochastic recurrent neural networks with multi-proportional delays," Neurocomputing, vol. 219, pp. 396–403, 2017.