Hindawi Publishing Corporation, Advances in Artificial Neural Systems, Volume 2014, Article ID 750532, 10 pages, http://dx.doi.org/10.1155/2014/750532

Research Article

Exponential Stability of Periodic Solution to Wilson-Cowan Networks with Time-Varying Delays on Time Scales

Jinxiang Cai, Zhenkun Huang, and Honghua Bin
School of Science, Jimei University, Xiamen 361021, China
Correspondence should be addressed to Zhenkun Huang; [email protected]

Received 31 December 2013; Accepted 12 February 2014; Published 2 April 2014
Academic Editor: Songcan Chen

Copyright © 2014 Jinxiang Cai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We present a stability analysis of delayed Wilson-Cowan networks on time scales. By applying the theory of calculus on time scales, the contraction mapping principle, and a Lyapunov functional, new sufficient conditions are obtained that ensure the existence and exponential stability of a periodic solution of the considered system. The obtained results are general and can be applied to discrete-time or continuous-time Wilson-Cowan networks.

1. Introduction

The activity of a cortical column can be described mathematically by the model developed by Wilson and Cowan [1, 2]. The model consists of two nonlinear ordinary differential equations representing the interactions between two populations of neurons, distinguished by whether their synapses are excitatory or inhibitory [2]. A comprehensive review by Destexhe and Sejnowski [3] summarizes the important developments and theoretical results for Wilson-Cowan networks. Applications include pattern analysis and image processing [4]. Theoretical results on the existence of asymptotically stable limit cycles and chaos have been reported in [5, 6], and the exponential stability of a unique almost periodic solution for a delayed Wilson-Cowan type model has been established in [7]. However, few investigations have addressed the periodicity of the Wilson-Cowan model [8], and studying stability and periodicity separately for continuous and for discrete systems with oscillatory coefficients is cumbersome. It is therefore worthwhile to study Wilson-Cowan networks on time scales [9, 10], which unify the continuous and discrete settings.

Motivated by recent results [11–13], we consider the following dynamic Wilson-Cowan network on a time scale $\mathbb{T}$:
\[
\begin{aligned}
X_{P}^{\Delta}(t)&=-a_{P}(t)X_{P}(t)+\left[k_{P}(t)-r_{P}(t)X_{P}(t)\right]G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right],\\
X_{N}^{\Delta}(t)&=-a_{N}(t)X_{N}(t)+\left[k_{N}(t)-r_{N}(t)X_{N}(t)\right]G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right],
\end{aligned}
\qquad t\in\mathbb{T}, \tag{1}
\]
where $X_{P}(t)$ and $X_{N}(t)$ represent the proportions of excitatory and inhibitory neurons firing per unit time at the instant $t$, respectively; $a_{P}(t)>0$ and $a_{N}(t)>0$ describe the natural decay of the excitatory and inhibitory activities over time; $r_{P}(t)$ and $r_{N}(t)$ are related to the duration of the refractory period; $k_{P}(t)$ and $k_{N}(t)$ are positive scaling coefficients; $w_{P}^{1}(t)$, $w_{N}^{1}(t)$, $w_{P}^{2}(t)$, and $w_{N}^{2}(t)$ are the strengths of the connections between the populations; $I_{P}(t)$ and $I_{N}(t)$ are the external inputs to the excitatory and the inhibitory populations; $G(\cdot)$ is the response function of neuronal activity; and $\tau_{P}(t)$, $\tau_{N}(t)$ are the time-varying transmission delays.

The main aim of this paper is to unify discrete and continuous Wilson-Cowan networks with periodic coefficients and time-varying delays under one common framework and to obtain generalized conditions ensuring the existence and exponential stability of a periodic solution on time scales. The main techniques are the theory of time scales, the contraction mapping principle, and the Lyapunov functional method.
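For readers who want to see what trajectories of (1) look like in the classical case $\mathbb{T}=\mathbb{R}$, the following sketch integrates the system with a forward Euler scheme. It is illustrative only and is not part of the paper: the step size, the constant initial history, and the simulation horizon are our own assumptions, while the coefficients, the constant delays, and the response function $G$ are the ones used later in Case 1 of Section 4.

```python
# Illustrative sketch (not from the paper): forward-Euler simulation of system (1)
# on T = R, with the Case 1 parameter values of Section 4.
import numpy as np

def G(x):
    # piecewise-linear response used in the examples: G(x) = (|x+1| - |x-1|)/2
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def simulate(T_end=40.0, h=0.001):
    n = int(T_end / h)
    t = np.linspace(0.0, T_end, n + 1)
    XP, XN = np.zeros(n + 1), np.zeros(n + 1)
    XP[0], XN[0] = 0.02, -0.01              # arbitrary initial state (assumed constant history)
    tauP, tauN = 2.0, 1.0                   # constant delays as in system (43)
    dP, dN = int(tauP / h), int(tauN / h)   # delays measured in steps

    for k in range(n):
        tk = t[k]
        aP, aN = 2.0 + np.sin(tk), 2.0 + np.cos(tk)   # a_P(t), a_N(t)
        kP = kN = 0.01                                 # k_P, k_N
        rP = rN = 0.01                                 # r_P, r_N
        w = 0.1                                        # all connection strengths
        IP, IN = -1.0 + np.sin(tk), np.cos(tk)         # external inputs
        XPd = XP[k - dP] if k >= dP else XP[0]         # X_P(t - tau_P)
        XNd = XN[k - dN] if k >= dN else XN[0]         # X_N(t - tau_N)
        XP[k + 1] = XP[k] + h * (-aP * XP[k] + (kP - rP * XP[k]) * G(w * XPd - w * XNd + IP))
        XN[k + 1] = XN[k] + h * (-aN * XN[k] + (kN - rN * XN[k]) * G(w * XPd - w * XNd + IN))
    return t, XP, XN

if __name__ == "__main__":
    t, XP, XN = simulate()
    print(XP[-5:], XN[-5:])   # trajectories settle onto a small 2*pi-periodic orbit
```

For small step sizes the two components settle onto a small-amplitude periodic orbit, which is the qualitative behavior asserted for this parameter set by Theorems 12 and 13 below.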

2. Preliminaries

In this section, we give some definitions and lemmas on time scales, which can be found in the books [14, 15].

Definition 1. A time scale $\mathbb{T}$ is an arbitrary nonempty closed subset of the real set $\mathbb{R}$. The forward and backward jump operators $\sigma,\rho:\mathbb{T}\to\mathbb{T}$ and the graininess $\mu:\mathbb{T}\to\mathbb{R}^{+}$ are defined, respectively, by
\[
\sigma(t):=\inf\{s\in\mathbb{T}:s>t\},\qquad \rho(t):=\sup\{s\in\mathbb{T}:s<t\},\qquad \mu(t):=\sigma(t)-t. \tag{2}
\]
These jump operators classify a point $t$ of a time scale as right-dense, right-scattered, left-dense, or left-scattered according to whether
\[
\sigma(t)=t,\qquad \sigma(t)>t,\qquad \rho(t)=t,\qquad \rho(t)<t, \tag{3}
\]
respectively, for any $t\in\mathbb{T}$. The notation $[a,b]_{\mathbb{T}}$ means $[a,b]_{\mathbb{T}}:=\{t\in\mathbb{T}:a\le t\le b\}$. Denote $\mathbb{T}^{+}:=\{t\in\mathbb{T}:t\ge 0\}$.

Definition 2. One says that a time scale $\mathbb{T}$ is periodic if there exists $p>0$ such that $t\in\mathbb{T}$ implies $t\pm p\in\mathbb{T}$; the smallest such positive number $p$ is called the period of the time scale. Clearly, if $\mathbb{T}$ is a $p$-periodic time scale, then $\sigma(t+np)=\sigma(t)+np$ and $\mu(t+np)=\mu(t)$, so $\mu(t)$ is a $p$-periodic function.

Definition 3. Let $\mathbb{T}\,(\ne\mathbb{R})$ be a periodic time scale with period $p$. One says that the function $f:\mathbb{T}\to\mathbb{R}$ is periodic with period $\omega>0$ if there exists a natural number $n$ such that $\omega=np$, $f(t+\omega)=f(t)$ for all $t\in\mathbb{T}$, and $\omega$ is the smallest number with this property. If $\mathbb{T}=\mathbb{R}$, one says that $f$ is periodic with period $\omega>0$ if $\omega$ is the smallest positive number such that $f(t+\omega)=f(t)$ for all $t\in\mathbb{R}$.

Definition 4 (Lakshmikantham and Vatsala [16]). For each $t\in\mathbb{T}$, let $N$ be a neighborhood of $t$. One defines the generalized derivative (or Dini derivative) $D^{+}u^{\Delta}(t)$ to mean that, given $\varepsilon>0$, there exists a right neighborhood $N(\varepsilon)\subset N$ of $t$ such that
\[
\frac{u(\sigma(t))-u(s)}{\mu(t,s)}<D^{+}u^{\Delta}(t)+\varepsilon \tag{4}
\]
for each $s\in N(\varepsilon)$, $s>t$, where $\mu(t,s)=\sigma(t)-s$. In case $t$ is right-scattered and $u(t)$ is continuous at $t$, one gets
\[
D^{+}u^{\Delta}(t)=\frac{u(\sigma(t))-u(t)}{\sigma(t)-t}. \tag{5}
\]

Definition 5. A function $f:\mathbb{T}\to\mathbb{R}$ is called right-dense continuous provided that it is continuous at right-dense points of $\mathbb{T}$ and its left-sided limits exist (finite) at left-dense points of $\mathbb{T}$. The set of all right-dense continuous functions on $\mathbb{T}$ is denoted by $C_{\mathrm{rd}}=C_{\mathrm{rd}}(\mathbb{T},\mathbb{R})$.

Definition 6. A function $p:\mathbb{T}\to\mathbb{R}$ is called regressive if and only if $1+p(t)\mu(t)\ne 0$.

The set of all regressive and right-dense continuous functions is denoted by $\mathcal{R}$. Let $\mathcal{R}^{+}:=\{p\in C_{\mathrm{rd}}:1+p(t)\mu(t)>0\ \text{for all}\ t\in\mathbb{T}\}$. Next, we give the definition of the exponential function and list its useful properties.

Definition 7 (Bohner and Peterson [14]). If $p\in C_{\mathrm{rd}}$ is a regressive function, then the generalized exponential function $e_{p}(t,s)$ is defined by
\[
e_{p}(t,s)=\exp\left\{\int_{s}^{t}\xi_{\mu(\tau)}(p(\tau))\,\Delta\tau\right\},\qquad s,t\in\mathbb{T}, \tag{6}
\]
with the cylinder transformation
\[
\xi_{h}(z)=\begin{cases}\dfrac{\operatorname{Log}(1+hz)}{h}, & h\ne 0,\\[1ex] z, & h=0.\end{cases} \tag{7}
\]

Definition 8. The periodic solution
\[
Z^{*}(t)=\left(X_{P}^{*}(t),X_{N}^{*}(t)\right)^{\top} \tag{8}
\]
of (1) is said to be globally exponentially stable if there exist a positive constant $\epsilon$ and $N=N(\epsilon)>0$ such that all solutions
\[
Z(t)=\left(X_{P}(t),X_{N}(t)\right)^{\top} \tag{9}
\]
of (1) satisfy
\[
\left|X_{P}(t)-X_{P}^{*}(t)\right|+\left|X_{N}(t)-X_{N}^{*}(t)\right|\le N(\epsilon)\,e_{\ominus\epsilon}(t,\alpha)\left(\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{P}(s)-X_{P}^{*}(s)\right|+\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{N}(s)-X_{N}^{*}(s)\right|\right),\qquad t\in\mathbb{T}. \tag{10}
\]

Lemma 9 (Bohner and Peterson [15]). If $p,q\in\mathcal{R}$, then
(i) $e_{0}(t,s)\equiv 1$ and $e_{p}(t,t)\equiv 1$;
(ii) $e_{p}(\sigma(t),s)=(1+\mu(t)p(t))\,e_{p}(t,s)$;
(iii) $1/e_{p}(t,s)=e_{\ominus p}(t,s)$, where $\ominus p(t)=-p(t)/(1+\mu(t)p(t))$;
(iv) $e_{p}(t,s)=1/e_{p}(s,t)=e_{\ominus p}(s,t)$;
(v) $e_{p}(t,s)\,e_{p}(s,r)=e_{p}(t,r)$;
(vi) $e_{p}(t,s)\,e_{q}(t,s)=e_{p\oplus q}(t,s)$;
(vii) $e_{p}(t,s)/e_{q}(t,s)=e_{p\ominus q}(t,s)$;
(viii) $\left(1/e_{p}(\cdot,s)\right)^{\Delta}=-p(t)/e_{p}^{\sigma}(\cdot,s)$.

Lemma 10 (contraction mapping principle [17]). If $\Omega$ is a closed subset of a Banach space $X$ and $\mathcal{F}:\Omega\to\Omega$ is a contraction, then $\mathcal{F}$ has a unique fixed point in $\Omega$.

For any $\omega$-periodic function $v$ defined on $\mathbb{T}$, denote $\overline{v}=\max_{t\in[0,\omega]}v(t)$, $\underline{v}=\min_{t\in[0,\omega]}v(t)$, $\overline{|v|}=\max_{t\in[0,\omega]}|v(t)|$, and $\underline{|v|}=\min_{t\in[0,\omega]}|v(t)|$. Throughout this paper, we make the following assumptions:

$(A_{1})$ $k_{P}(t)$, $k_{N}(t)$, $r_{P}(t)$, $r_{N}(t)$, $w_{P}^{1}(t)$, $w_{P}^{2}(t)$, $w_{N}^{1}(t)$, $w_{N}^{2}(t)$, $a_{P}(t)$, $a_{N}(t)$, $\tau_{P}(t)$, $\tau_{N}(t)$, $I_{P}(t)$, and $I_{N}(t)$ are $\omega$-periodic functions defined on $\mathbb{T}$, and $-a_{P}(t),-a_{N}(t)\in\mathcal{R}^{+}$;

$(A_{2})$ $G(\cdot):\mathbb{R}\to\mathbb{R}$ is Lipschitz continuous, that is, $|G(u)-G(v)|\le L|u-v|$ for all $u,v\in\mathbb{R}$, with $G(0)=0$ and $\sup_{v\in\mathbb{R}}|G(v)|\le M$.

For simplicity, we use the following notation:
\[
R=\max\{\overline{r}_{P},\overline{r}_{N}\},\quad K=\max\{\overline{k}_{P},\overline{k}_{N}\},\quad W=\max\{L\overline{w}_{P}^{1},L\overline{w}_{N}^{1},L\overline{w}_{P}^{2},L\overline{w}_{N}^{2}\},\quad I=\max\{L\overline{|I_{P}|},L\overline{|I_{N}|}\},\quad \tau_{0}=\min\{\overline{|\tau_{P}|},\overline{|\tau_{N}|}\}. \tag{11}
\]
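As a quick illustration of Definition 7 (ours, not taken from the paper), the sketch below evaluates the generalized exponential function on the two time scales that appear in Section 4: on $\mathbb{T}=\mathbb{R}$ the graininess is $\mu\equiv 0$ and $e_{p}(t,s)=\exp(\int_{s}^{t}p(\tau)\,d\tau)$, while on $\mathbb{T}=\mathbb{Z}$ the graininess is $\mu\equiv 1$ and $e_{p}(t,s)=\prod_{\tau=s}^{t-1}(1+p(\tau))$. The particular function $p$ is a hypothetical choice used only to check property (iii) of Lemma 9 numerically.

```python
# Illustrative sketch (not part of the paper): the generalized exponential function
# e_p(t, s) of Definition 7 on T = R and T = Z.
import numpy as np

def exp_fn_R(p, s, t, n=10000):
    # e_p(t, s) = exp( int_s^t p(tau) dtau )  when T = R (mu = 0, xi_0(z) = z)
    tau = np.linspace(s, t, n + 1)
    vals = p(tau)
    integral = np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(tau))  # trapezoid rule
    return np.exp(integral)

def exp_fn_Z(p, s, t):
    # e_p(t, s) = prod_{tau=s}^{t-1} (1 + p(tau))  when T = Z (mu = 1)
    return np.prod([1.0 + p(tau) for tau in range(s, t)])

if __name__ == "__main__":
    # Lemma 9(iii): e_p(t, s) * e_{ominus p}(t, s) = 1; check numerically on T = Z.
    p = lambda n: 0.5 * np.cos(n)            # a hypothetical regressive function (1 + p > 0)
    om = lambda n: -p(n) / (1.0 + p(n))      # ominus p, since mu = 1 on Z
    print(exp_fn_Z(p, 0, 12) * exp_fn_Z(om, 0, 12))   # ~ 1.0
```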

Lemma 11. Suppose $(A_{1})$ holds; then $Z(t)$ is an $\omega$-periodic solution of (1) if and only if $Z(t)$ is a solution of the following system:
\[
\begin{aligned}
X_{P}(t)&=\frac{1}{e_{\ominus(-a_{P})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}\left[k_{P}(s)-r_{P}(s)X_{P}(s)\right]G\!\left[w_{P}^{1}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)X_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\Delta s,\\
X_{N}(t)&=\frac{1}{e_{\ominus(-a_{N})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{N})}(s,t)}{1-\mu(s)a_{N}(s)}\left[k_{N}(s)-r_{N}(s)X_{N}(s)\right]G\!\left[w_{P}^{2}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{2}(s)X_{N}(s-\tau_{N}(s))+I_{N}(s)\right]\Delta s.
\end{aligned} \tag{12}
\]

Proof. Let $Z(t)=(X_{P}(t),X_{N}(t))^{\top}$ be a solution of (1); we can rewrite (1) as follows:
\[
\begin{aligned}
X_{P}^{\Delta}(t)+a_{P}(t)\left(X_{P}^{\sigma}(t)-\mu(t)X_{P}^{\Delta}(t)\right)&=\left[k_{P}(t)-r_{P}(t)X_{P}(t)\right]G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right],\\
X_{N}^{\Delta}(t)+a_{N}(t)\left(X_{N}^{\sigma}(t)-\mu(t)X_{N}^{\Delta}(t)\right)&=\left[k_{N}(t)-r_{N}(t)X_{N}(t)\right]G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right],
\end{aligned} \tag{13}
\]
which leads to
\[
\begin{aligned}
X_{P}^{\Delta}(t)+\ominus(-a_{P})(t)\,X_{P}^{\sigma}(t)&=\left[k_{P}(t)-r_{P}(t)X_{P}(t)\right]G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right]\frac{1}{1-\mu(t)a_{P}(t)},\\
X_{N}^{\Delta}(t)+\ominus(-a_{N})(t)\,X_{N}^{\sigma}(t)&=\left[k_{N}(t)-r_{N}(t)X_{N}(t)\right]G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right]\frac{1}{1-\mu(t)a_{N}(t)}.
\end{aligned} \tag{14}
\]
Multiplying both sides of the above equalities by $e_{\ominus(-a_{P})}(t,0)$ and $e_{\ominus(-a_{N})}(t,0)$, respectively, we have
\[
\begin{aligned}
\left[e_{\ominus(-a_{P})}(t,0)X_{P}(t)\right]^{\Delta}&=\left[k_{P}(t)-r_{P}(t)X_{P}(t)\right]G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right]e_{\ominus(-a_{P})}(\sigma(t),0),\\
\left[e_{\ominus(-a_{N})}(t,0)X_{N}(t)\right]^{\Delta}&=\left[k_{N}(t)-r_{N}(t)X_{N}(t)\right]G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right]e_{\ominus(-a_{N})}(\sigma(t),0).
\end{aligned} \tag{15}
\]
Integrating both sides of the above equalities from $t$ to $t+\omega$ and using $X_{P}(t+\omega)=X_{P}(t)$ and $X_{N}(t+\omega)=X_{N}(t)$, we have
\[
X_{P}(t)=\int_{t}^{t+\omega}\left[k_{P}(s)-r_{P}(s)X_{P}(s)\right]G\!\left[w_{P}^{1}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)X_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\frac{e_{\ominus(-a_{P})}(\sigma(s),0)}{e_{\ominus(-a_{P})}(t+\omega,0)-e_{\ominus(-a_{P})}(t,0)}\,\Delta s
=\int_{t}^{t+\omega}\left[k_{P}(s)-r_{P}(s)X_{P}(s)\right]G\!\left[w_{P}^{1}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)X_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\frac{e_{\ominus(-a_{P})}(\sigma(s),t)}{e_{\ominus(-a_{P})}(t+\omega,t)-1}\,\Delta s, \tag{16}
\]
\[
X_{N}(t)=\int_{t}^{t+\omega}\left[k_{N}(s)-r_{N}(s)X_{N}(s)\right]G\!\left[w_{P}^{2}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{2}(s)X_{N}(s-\tau_{N}(s))+I_{N}(s)\right]\frac{e_{\ominus(-a_{N})}(\sigma(s),0)}{e_{\ominus(-a_{N})}(t+\omega,0)-e_{\ominus(-a_{N})}(t,0)}\,\Delta s
=\int_{t}^{t+\omega}\left[k_{N}(s)-r_{N}(s)X_{N}(s)\right]G\!\left[w_{P}^{2}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{2}(s)X_{N}(s-\tau_{N}(s))+I_{N}(s)\right]\frac{e_{\ominus(-a_{N})}(\sigma(s),t)}{e_{\ominus(-a_{N})}(t+\omega,t)-1}\,\Delta s. \tag{17}
\]
Since
\[
\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}=e_{\ominus(-a_{P})}(\sigma(s),t),\qquad \frac{e_{\ominus(-a_{N})}(s,t)}{1-\mu(s)a_{N}(s)}=e_{\ominus(-a_{N})}(\sigma(s),t),
\]
and $a_{P}(t+\omega)=a_{P}(t)$, $a_{N}(t+\omega)=a_{N}(t)$, we obtain that
\[
\begin{aligned}
X_{P}(t)&=\frac{1}{e_{\ominus(-a_{P})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}\left[k_{P}(s)-r_{P}(s)X_{P}(s)\right]G\!\left[w_{P}^{1}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)X_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\Delta s,\\
X_{N}(t)&=\frac{1}{e_{\ominus(-a_{N})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{N})}(s,t)}{1-\mu(s)a_{N}(s)}\left[k_{N}(s)-r_{N}(s)X_{N}(s)\right]G\!\left[w_{P}^{2}(s)X_{P}(s-\tau_{P}(s))-w_{N}^{2}(s)X_{N}(s-\tau_{N}(s))+I_{N}(s)\right]\Delta s.
\end{aligned} \tag{18}
\]
The proof is completed.

3. Main Results

In this section, we prove the existence and uniqueness of the periodic solution to (1).

Theorem 12. Suppose $(A_{1})$-$(A_{2})$ hold and $\max\{\alpha,W\}<1$. Then (1) has a unique $\omega$-periodic solution, where
\[
\alpha_{1}:=\frac{\omega\exp\left(\int_{0}^{\omega}\left|\xi_{\mu(\tau)}\left(\ominus(-a_{P})(\tau)\right)\right|\Delta\tau\right)\left(K+R\beta+RM/W\right)}{\left|e_{\ominus(-a_{P})}(\omega,0)-1\right|\left(1-\overline{a}_{P}\,\overline{\mu}\right)},\qquad
\alpha_{2}:=\frac{\omega\exp\left(\int_{0}^{\omega}\left|\xi_{\mu(\tau)}\left(\ominus(-a_{N})(\tau)\right)\right|\Delta\tau\right)\left(K+R\beta+RM/W\right)}{\left|e_{\ominus(-a_{N})}(\omega,0)-1\right|\left(1-\overline{a}_{N}\,\overline{\mu}\right)}, \tag{19}
\]
$\alpha:=\max\{\alpha_{1},\alpha_{2}\}$, and $\beta:=I/(1-W)$.

Proof. Let $\mathbb{X}=\{Z(t)=(z_{P}(t),z_{N}(t))\mid Z\in C_{\mathrm{rd}}(\mathbb{T},\mathbb{R}^{2}),\ Z(t+\omega)=Z(t)\}$ with the norm $\|Z\|=\sup_{t\in\mathbb{T}}\{|z_{P}(t)|+|z_{N}(t)|\}$; then $\mathbb{X}$ is a Banach space [14]. Define
\[
\mathcal{F}:\mathbb{X}\longrightarrow\mathbb{X},\qquad (\mathcal{F}Z)(t)=\left((\mathcal{F}Z)_{P}(t),(\mathcal{F}Z)_{N}(t)\right), \tag{20}
\]
where $Z(t)=(z_{P}(t),z_{N}(t))\in\mathbb{X}$ and
\[
\begin{aligned}
(\mathcal{F}Z)_{P}(t)&=\frac{1}{e_{\ominus(-a_{P})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}\left[k_{P}(s)-r_{P}(s)z_{P}(s)\right]G\!\left[w_{P}^{1}(s)z_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)z_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\Delta s,\\
(\mathcal{F}Z)_{N}(t)&=\frac{1}{e_{\ominus(-a_{N})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{N})}(s,t)}{1-\mu(s)a_{N}(s)}\left[k_{N}(s)-r_{N}(s)z_{N}(s)\right]G\!\left[w_{P}^{2}(s)z_{P}(s-\tau_{P}(s))-w_{N}^{2}(s)z_{N}(s-\tau_{N}(s))+I_{N}(s)\right]\Delta s
\end{aligned} \tag{21}
\]


for $t\in\mathbb{T}$. Note that
\[
e_{\ominus(-a_{P})}(s,t)=e^{\int_{t}^{s}\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\,\Delta\tau}\le e^{\int_{t}^{t+\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\right|\Delta\tau}=e^{\int_{0}^{\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\right|\Delta\tau}. \tag{22}
\]
Let $\Omega=\{Z(t)\mid Z\in\mathbb{X},\ \|Z\|\le I/(1-W)\}$ and $\beta:=I/(1-W)$. Obviously, $\Omega$ is a closed nonempty subset of $\mathbb{X}$. Firstly, we prove that the mapping $\mathcal{F}$ maps $\Omega$ into itself. In fact, for any $Z(t)\in\Omega$, we have
\[
\begin{aligned}
\left|(\mathcal{F}Z)_{P}(t)\right|&=\left|\frac{1}{e_{\ominus(-a_{P})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}\left[k_{P}(s)-r_{P}(s)z_{P}(s)\right]G\!\left[w_{P}^{1}(s)z_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)z_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\Delta s\right|\\
&\le\frac{\exp\left(\int_{0}^{\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\right|\Delta\tau\right)(K+R\beta)}{\left|e_{\ominus(-a_{P})}(\omega,0)-1\right|\left(1-\overline{a}_{P}\,\overline{\mu}\right)}\int_{t}^{t+\omega}\left|W z_{P}(s-\tau_{P}(s))+W z_{N}(s-\tau_{N}(s))+I\right|\Delta s\\
&\le\alpha_{1}\left(I+W\sup_{t\in\mathbb{T}}\left(|z_{P}(t)|+|z_{N}(t)|\right)\right). 
\end{aligned} \tag{23}
\]
Similarly, we have
\[
\left|(\mathcal{F}Z)_{N}(t)\right|\le\alpha_{2}\left(I+W\sup_{t\in\mathbb{T}}\left(|z_{P}(t)|+|z_{N}(t)|\right)\right). \tag{24}
\]
It follows from (23) and (24) that
\[
\|\mathcal{F}Z\|\le\alpha I+\alpha W\|Z\|\le\frac{I}{1-W}. \tag{25}
\]
Hence, $\mathcal{F}Z\in\Omega$.

Next, we prove that $\mathcal{F}$ is a contraction mapping. For any $Z(t)=(z_{P}(t),z_{N}(t))\in\Omega$ and $Z'(t)=(z_{P}'(t),z_{N}'(t))\in\Omega$, we have
\[
\begin{aligned}
\left|(\mathcal{F}Z)_{P}(t)-(\mathcal{F}Z')_{P}(t)\right|&=\left|\frac{1}{e_{\ominus(-a_{P})}(\omega,0)-1}\int_{t}^{t+\omega}\frac{e_{\ominus(-a_{P})}(s,t)}{1-\mu(s)a_{P}(s)}\Big(\left[k_{P}(s)-r_{P}(s)z_{P}(s)\right]G\!\left[w_{P}^{1}(s)z_{P}(s-\tau_{P}(s))-w_{N}^{1}(s)z_{N}(s-\tau_{N}(s))+I_{P}(s)\right]\right.\\
&\qquad\left.-\left[k_{P}(s)-r_{P}(s)z_{P}'(s)\right]G\!\left[w_{P}^{1}(s)z_{P}'(s-\tau_{P}(s))-w_{N}^{1}(s)z_{N}'(s-\tau_{N}(s))+I_{P}(s)\right]\Big)\,\Delta s\right|\\
&\le\frac{\exp\left(\int_{0}^{\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\right|\Delta\tau\right)(KW+RW\beta+RM)}{\left|e_{\ominus(-a_{P})}(\omega,0)-1\right|\left(1-\overline{a}_{P}\,\overline{\mu}\right)}\int_{t}^{t+\omega}\left(\left|z_{P}(s-\tau_{P}(s))-z_{P}'(s-\tau_{P}(s))\right|+\left|z_{N}(s-\tau_{N}(s))-z_{N}'(s-\tau_{N}(s))\right|\right)\Delta s\\
&\le\alpha_{1}W\sup_{t\in\mathbb{T}}\left[\left|z_{P}(t)-z_{P}'(t)\right|+\left|z_{N}(t)-z_{N}'(t)\right|\right]. 
\end{aligned} \tag{26}
\]
Similarly, we have
\[
\left|(\mathcal{F}Z)_{N}(t)-(\mathcal{F}Z')_{N}(t)\right|\le\alpha_{2}W\sup_{t\in\mathbb{T}}\left[\left|z_{P}(t)-z_{P}'(t)\right|+\left|z_{N}(t)-z_{N}'(t)\right|\right]. \tag{27}
\]
From (26) and (27), we can get
\[
\left\|\mathcal{F}Z-\mathcal{F}Z'\right\|\le\alpha W\left\|Z-Z'\right\|. \tag{28}
\]
Note that $\alpha W<1$. Thus, $\mathcal{F}$ is a contraction mapping. By the fixed point theorem in the Banach space, $\mathcal{F}$ possesses a unique fixed point. The proof is completed.

Theorem 13. Under the conditions of Theorem 12, suppose further the following.

$(A_{3})$ There exist constants $\epsilon>0$, $\xi>0$, $\xi'>0$ such that
\[
\left(1+\frac{\xi'}{\xi}\right)\frac{\left(1+\epsilon\mu(t+\tau_{0})\right)(K+R\beta)W\,e_{\epsilon}(t+\tau_{0},t)}{\left(\underline{a}_{P}-RM\right)\left(1+\epsilon\mu(t)\right)-\epsilon}<1,\qquad
\left(1+\frac{\xi}{\xi'}\right)\frac{\left(1+\epsilon\mu(t+\tau_{0})\right)(K+R\beta)W\,e_{\epsilon}(t+\tau_{0},t)}{\left(\underline{a}_{N}-RM\right)\left(1+\epsilon\mu(t)\right)-\epsilon}<1; \tag{29}
\]
then the periodic solution of (1) is globally exponentially stable.

Proof. It follows from Theorem 12 that (1) has an $\omega$-periodic solution $Z^{*}(t)=(X_{P}^{*}(t),X_{N}^{*}(t))^{\top}$. Let $Z(t)=(X_{P}(t),X_{N}(t))^{\top}$ be any solution of (1); then we have
\[
\begin{aligned}
\left(X_{P}(t)-X_{P}^{*}(t)\right)^{\Delta}={}&-a_{P}(t)\left(X_{P}(t)-X_{P}^{*}(t)\right)\\
&+k_{P}(t)\,G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right]-k_{P}(t)\,G\!\left[w_{P}^{1}(t)X_{P}^{*}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}^{*}(t-\tau_{N}(t))+I_{P}(t)\right]\\
&-r_{P}(t)X_{P}(t)\,G\!\left[w_{P}^{1}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}(t-\tau_{N}(t))+I_{P}(t)\right]+r_{P}(t)X_{P}^{*}(t)\,G\!\left[w_{P}^{1}(t)X_{P}^{*}(t-\tau_{P}(t))-w_{N}^{1}(t)X_{N}^{*}(t-\tau_{N}(t))+I_{P}(t)\right],\\
\left(X_{N}(t)-X_{N}^{*}(t)\right)^{\Delta}={}&-a_{N}(t)\left(X_{N}(t)-X_{N}^{*}(t)\right)\\
&+k_{N}(t)\,G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right]-k_{N}(t)\,G\!\left[w_{P}^{2}(t)X_{P}^{*}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}^{*}(t-\tau_{N}(t))+I_{N}(t)\right]\\
&-r_{N}(t)X_{N}(t)\,G\!\left[w_{P}^{2}(t)X_{P}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}(t-\tau_{N}(t))+I_{N}(t)\right]+r_{N}(t)X_{N}^{*}(t)\,G\!\left[w_{P}^{2}(t)X_{P}^{*}(t-\tau_{P}(t))-w_{N}^{2}(t)X_{N}^{*}(t-\tau_{N}(t))+I_{N}(t)\right],
\end{aligned} \tag{30}
\]
which leads to
\[
\begin{aligned}
D^{+}\left|X_{P}(t)-X_{P}^{*}(t)\right|^{\Delta}&\le-\left(\underline{a}_{P}-RM\right)\left|X_{P}(t)-X_{P}^{*}(t)\right|+(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right),\\
D^{+}\left|X_{N}(t)-X_{N}^{*}(t)\right|^{\Delta}&\le-\left(\underline{a}_{N}-RM\right)\left|X_{N}(t)-X_{N}^{*}(t)\right|+(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right).
\end{aligned} \tag{31}
\]
For any $\alpha\in[-\tau_{0},0]_{\mathbb{T}}$, construct the Lyapunov functional $V(t)=V_{1}(t)+V_{2}(t)+V_{3}(t)+V_{4}(t)$, where
\[
\begin{aligned}
V_{1}(t)&=\xi\,e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|,\qquad V_{3}(t)=\xi'\,e_{\epsilon}(t,\alpha)\left|X_{N}(t)-X_{N}^{*}(t)\right|,\\
V_{2}(t)&=\xi\int_{t-\tau_{0}}^{t}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(s)-X_{P}^{*}(s)\right|+\left|X_{N}(s)-X_{N}^{*}(s)\right|\right)\Delta s,\\
V_{4}(t)&=\xi'\int_{t-\tau_{0}}^{t}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(s)-X_{P}^{*}(s)\right|+\left|X_{N}(s)-X_{N}^{*}(s)\right|\right)\Delta s.
\end{aligned} \tag{32}
\]

Calculating $D^{+}V(t)^{\Delta}$ along (1), we can get
\[
\begin{aligned}
D^{+}V_{1}(t)^{\Delta}\big|_{(1)}&\le\xi\left[\epsilon\,e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|+e_{\epsilon}(\sigma(t),\alpha)\,D^{+}\left|X_{P}(t)-X_{P}^{*}(t)\right|^{\Delta}\right]\\
&\le\xi\left\{\epsilon\,e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|+e_{\epsilon}(\sigma(t),\alpha)\left[-\left(\underline{a}_{P}-RM\right)\left|X_{P}(t)-X_{P}^{*}(t)\right|+(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right)\right]\right\}\\
&=\xi\left[\epsilon-\left(\underline{a}_{P}-RM\right)\left(1+\epsilon\mu(t)\right)\right]e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|+\xi\left(1+\epsilon\mu(t)\right)e_{\epsilon}(t,\alpha)(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right),\\
D^{+}V_{2}(t)^{\Delta}\big|_{(1)}&\le\xi\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(t)-X_{P}^{*}(t)\right|+\left|X_{N}(t)-X_{N}^{*}(t)\right|\right)\\
&\quad-\xi\left(1+\epsilon\mu(t)\right)e_{\epsilon}(t,\alpha)(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right),
\end{aligned} \tag{33}
\]
which leads to
\[
D^{+}\left(V_{1}(t)+V_{2}(t)\right)^{\Delta}\big|_{(1)}\le\xi\left[\epsilon-\left(\underline{a}_{P}-RM\right)\left(1+\epsilon\mu(t)\right)\right]e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|+\xi\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(t)-X_{P}^{*}(t)\right|\right)+\xi\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{N}(t)-X_{N}^{*}(t)\right|\right). \tag{34}
\]
Note that
\[
\begin{aligned}
D^{+}V_{3}(t)^{\Delta}\big|_{(1)}&\le\xi'\left[\epsilon-\left(\underline{a}_{N}-RM\right)\left(1+\epsilon\mu(t)\right)\right]e_{\epsilon}(t,\alpha)\left|X_{N}(t)-X_{N}^{*}(t)\right|+\xi'\left(1+\epsilon\mu(t)\right)e_{\epsilon}(t,\alpha)(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right),\\
D^{+}V_{4}(t)^{\Delta}\big|_{(1)}&\le\xi'\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(t)-X_{P}^{*}(t)\right|+\left|X_{N}(t)-X_{N}^{*}(t)\right|\right)\\
&\quad-\xi'\left(1+\epsilon\mu(t)\right)e_{\epsilon}(t,\alpha)(K+R\beta)W\left(\left|X_{P}(t-\tau_{0})-X_{P}^{*}(t-\tau_{0})\right|+\left|X_{N}(t-\tau_{0})-X_{N}^{*}(t-\tau_{0})\right|\right).
\end{aligned} \tag{35}
\]
We have
\[
D^{+}\left(V_{3}(t)+V_{4}(t)\right)^{\Delta}\big|_{(1)}\le\xi'\left[\epsilon-\left(\underline{a}_{N}-RM\right)\left(1+\epsilon\mu(t)\right)\right]e_{\epsilon}(t,\alpha)\left|X_{N}(t)-X_{N}^{*}(t)\right|+\xi'\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{N}(t)-X_{N}^{*}(t)\right|\right)+\xi'\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},\alpha)(K+R\beta)W\left(\left|X_{P}(t)-X_{P}^{*}(t)\right|\right). \tag{36}
\]
From (34) and (36), we can get
\[
\begin{aligned}
D^{+}V(t)^{\Delta}\big|_{(1)}\le{}&\left\{\xi\left[\epsilon-\left(\underline{a}_{P}-RM\right)\left(1+\epsilon\mu(t)\right)\right]+(\xi+\xi')\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},t)(K+\beta R)W\right\}e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|\\
&+\left\{\xi'\left[\epsilon-\left(\underline{a}_{N}-RM\right)\left(1+\epsilon\mu(t)\right)\right]+(\xi+\xi')\left(1+\epsilon\mu(t+\tau_{0})\right)e_{\epsilon}(t+\tau_{0},t)(K+\beta R)W\right\}e_{\epsilon}(t,\alpha)\left|X_{N}(t)-X_{N}^{*}(t)\right|.
\end{aligned} \tag{37}
\]
By assumption $(A_{3})$, it follows that $V(t)\le V(0)$ for $t\in\mathbb{T}^{+}$. On the other hand, we have

\[
\begin{aligned}
V(0)\le{}&\left[\xi\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_{0}}^{0}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+\beta R)W\,\Delta s\right]\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{P}(s)-X_{P}^{*}(s)\right|\\
&+\left[\xi'\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_{0}}^{0}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+\beta R)W\,\Delta s\right]\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{N}(s)-X_{N}^{*}(s)\right|\\
\le{}&\Gamma(\epsilon)\left(\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{P}(s)-X_{P}^{*}(s)\right|+\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{N}(s)-X_{N}^{*}(s)\right|\right),
\end{aligned} \tag{38}
\]
where $\Gamma(\epsilon)=\max\{\Delta_{1},\Delta_{2}\}$ and
\[
\Delta_{1}=\xi\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_{0}}^{0}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+\beta R)W\,\Delta s,\qquad
\Delta_{2}=\xi'\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_{0}}^{0}\left(1+\epsilon\mu(s+\tau_{0})\right)e_{\epsilon}(s+\tau_{0},\alpha)(K+\beta R)W\,\Delta s. \tag{39}
\]
It is obvious that
\[
\xi\,e_{\epsilon}(t,\alpha)\left|X_{P}(t)-X_{P}^{*}(t)\right|+\xi'\,e_{\epsilon}(t,\alpha)\left|X_{N}(t)-X_{N}^{*}(t)\right|\le V(t)\le V(0), \tag{40}
\]
which means that
\[
\min\{\xi,\xi'\}\,e_{\epsilon}(t,\alpha)\left(\left|X_{P}(t)-X_{P}^{*}(t)\right|+\left|X_{N}(t)-X_{N}^{*}(t)\right|\right)\le V(0). \tag{41}
\]
Thus, we finally get
\[
\left|X_{P}(t)-X_{P}^{*}(t)\right|+\left|X_{N}(t)-X_{N}^{*}(t)\right|\le\frac{\Gamma(\epsilon)\,e_{\ominus\epsilon}(t,\alpha)}{\min\{\xi,\xi'\}}\left(\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{P}(s)-X_{P}^{*}(s)\right|+\sup_{s\in[-\tau_{0},0]_{\mathbb{T}}}\left|X_{N}(s)-X_{N}^{*}(s)\right|\right). \tag{42}
\]
Therefore, the unique periodic solution of (1) is globally exponentially stable. The proof is completed.

4. Examples

In this section, two numerical examples are given to verify the effectiveness of the results obtained in the previous section. Consider the following Wilson-Cowan neural network with delays on a time scale $\mathbb{T}$:
\[
\begin{aligned}
X_{P}^{\Delta}(t)&=-a_{P}(t)X_{P}(t)+\left[k_{P}(t)-r_{P}(t)X_{P}(t)\right]G\!\left[w_{P}^{1}(t)X_{P}(t-2)-w_{N}^{1}(t)X_{N}(t-1)+I_{P}(t)\right],\\
X_{N}^{\Delta}(t)&=-a_{N}(t)X_{N}(t)+\left[k_{N}(t)-r_{N}(t)X_{N}(t)\right]G\!\left[w_{P}^{2}(t)X_{P}(t-2)-w_{N}^{2}(t)X_{N}(t-1)+I_{N}(t)\right].
\end{aligned} \tag{43}
\]

Case 1. Consider $\mathbb{T}=\mathbb{R}$. Take $(a_{P}(t),a_{N}(t))^{\top}=(2+\sin(t),\,2+\cos(t))^{\top}$. Obviously, $\underline{a}_{P}=\underline{a}_{N}=1$ and
\[
\frac{\exp\left(\int_{0}^{2\pi}a_{P}(s)\,ds\right)}{\exp\left(\int_{0}^{2\pi}a_{P}(s)\,ds\right)-1}=\frac{e^{4\pi}}{e^{4\pi}-1},\qquad
\frac{\exp\left(\int_{0}^{2\pi}a_{N}(s)\,ds\right)}{\exp\left(\int_{0}^{2\pi}a_{N}(s)\,ds\right)-1}=\frac{e^{4\pi}}{e^{4\pi}-1}. \tag{44}
\]
Take $(I_{P}(t),I_{N}(t))^{\top}=(-1+\sin(t),\,\cos(t))^{\top}$, $k_{P}(t)=k_{N}(t)=r_{P}(t)=r_{N}(t)=0.01$, $w_{P}^{1}(t)=w_{N}^{1}(t)=w_{P}^{2}(t)=w_{N}^{2}(t)=0.1$, and $G(x)=\frac{1}{2}(|x+1|-|x-1|)$. We have $L=1$. Let $\xi=1$, $\xi'=2$. One can easily verify that
\[
\begin{aligned}
\alpha_{1}&=\omega\left(K+R\beta+\frac{RM}{W}\right)\frac{\exp\left(\int_{0}^{2\pi}a_{P}(s)\,ds\right)}{\exp\left(\int_{0}^{2\pi}a_{P}(s)\,ds\right)-1}\approx 0.831<1,\qquad
\alpha_{2}=\omega\left(K+R\beta+\frac{RM}{W}\right)\frac{\exp\left(\int_{0}^{2\pi}a_{N}(s)\,ds\right)}{\exp\left(\int_{0}^{2\pi}a_{N}(s)\,ds\right)-1}\approx 0.831<1,\\
&-\xi\left(\underline{a}_{P}-RM\right)+(\xi+\xi')(K+\beta R)W\approx-0.980<0,\qquad
-\xi'\left(\underline{a}_{N}-RM\right)+(\xi+\xi')(K+\beta R)W\approx-1.970<0.
\end{aligned} \tag{45}
\]
It follows from Theorems 12 and 13 that (43) has a unique $2\pi$-periodic solution which is globally exponentially stable (see Figure 1).


Figure 1: Globally exponentially stable periodic solution of (43).
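The following sketch (ours, not from the paper) redoes the arithmetic behind (45), so that the constants of (11), the bound $\beta=I/(1-W)$, and the checks $\alpha_{1},\alpha_{2}<1$ for Case 1 can be reproduced. The numeric values of $K$, $R$, $W$, $M$, $L$, and $I$ follow directly from the parameters chosen above.

```python
# Numerical check (illustrative, not from the paper) of the Case 1 conditions.
import numpy as np

omega = 2.0 * np.pi
L_lip, M = 1.0, 1.0                 # Lipschitz constant and bound of G(x) = (|x+1|-|x-1|)/2
K, R = 0.01, 0.01                   # K = max{k_P, k_N}, R = max{r_P, r_N}
W = L_lip * 0.1                     # W = max{L*w_P^1, L*w_N^1, L*w_P^2, L*w_N^2}
I = max(L_lip * 2.0, L_lip * 1.0)   # I from sup|I_P| = 2 (I_P = -1+sin t) and sup|I_N| = 1
beta = I / (1.0 - W)                # radius of the invariant set Omega

ratio = np.exp(4.0 * np.pi) / (np.exp(4.0 * np.pi) - 1.0)   # e^{4*pi}/(e^{4*pi}-1), cf. (44)
alpha1 = omega * (K + R * beta + R * M / W) * ratio
alpha2 = alpha1                     # a_N has the same integral over one period as a_P
print(round(alpha1, 3), round(alpha2, 3), max(alpha1, W) < 1.0)   # 0.831 0.831 True

xi, xi_p = 1.0, 2.0                 # xi, xi' from the text
aP_min = aN_min = 1.0               # min of 2+sin t and of 2+cos t
c1 = -xi * (aP_min - R * M) + (xi + xi_p) * (K + beta * R) * W
c2 = -xi_p * (aN_min - R * M) + (xi + xi_p) * (K + beta * R) * W
print(round(c1, 3), round(c2, 3))   # both negative, matching -0.980 and -1.970 in (45)
```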

Case 2. Consider $\mathbb{T}=\mathbb{Z}$. Equation (43) reduces to the following difference equation:
\[
\begin{aligned}
X_{P}(n+1)-X_{P}(n)&=-a_{P}(n)X_{P}(n)+\left[k_{P}(n)-r_{P}(n)X_{P}(n)\right]G\!\left[w_{P}^{1}(n)X_{P}(n-2)-w_{N}^{1}(n)X_{N}(n-1)+I_{P}(n)\right],\\
X_{N}(n+1)-X_{N}(n)&=-a_{N}(n)X_{N}(n)+\left[k_{N}(n)-r_{N}(n)X_{N}(n)\right]G\!\left[w_{P}^{2}(n)X_{P}(n-2)-w_{N}^{2}(n)X_{N}(n-1)+I_{N}(n)\right]
\end{aligned} \tag{46}
\]
for $n\in\mathbb{Z}_{0}^{+}$. Take $(a_{P}(n),a_{N}(n))^{\top}=(1/2,1/2)^{\top}$; obviously, $\overline{a}_{P}=\overline{a}_{N}=\underline{a}_{P}=\underline{a}_{N}=1/2$. Take $(I_{P}(n),I_{N}(n))^{\top}=(1+\sin(n\pi/3),\,\cos(n\pi/3))^{\top}$, $k_{P}(n)=k_{N}(n)=r_{P}(n)=r_{N}(n)=0.01$, $w_{P}^{1}(n)=w_{N}^{1}(n)=w_{P}^{2}(n)=w_{N}^{2}(n)=0.1$, and $G(x)=\frac{1}{2}(|x+1|-|x-1|)$. We have $L=1$. Let $\xi=1$, $\xi'=2$. Since $\mathbb{T}=\mathbb{Z}$ (so $\mu(t)=1$), choosing $\omega=6$ and using the discrete forms $e_{\ominus(-a_{P})}(\omega,0)=\prod_{k=0}^{\omega-1}(1-a_{P}(k))^{-1}$ and $\int_{0}^{\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_{P})(\tau))\right|\Delta\tau=\sum_{k=0}^{\omega-1}\left|\operatorname{Log}(1-a_{P}(k))\right|$ (and likewise for $a_{N}$), a simple calculation gives
\[
\alpha_{1}\approx 0.015<1,\qquad \alpha_{2}\approx 0.015<1,\qquad
-\xi\left(\underline{a}_{P}-RM\right)+(\xi+\xi')(K+\beta R)W\approx-0.480<0,\qquad
-\xi'\left(\underline{a}_{N}-RM\right)+(\xi+\xi')(K+\beta R)W\approx-0.970<0. \tag{47}
\]
It follows from Theorems 12 and 13 that (46) has a unique 6-periodic solution which is globally exponentially stable (see Figure 2).

Figure 2: Globally exponentially stable periodic solution of (46).
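As with Case 1, the discrete system (46) is easy to iterate directly. The sketch below (ours, illustrative only; the constant initial history and the number of steps are assumptions) runs the recursion with the Case 2 parameters; after a transient, the pair $(X_P(n), X_N(n))$ repeats with period 6, consistent with Figure 2.

```python
# Illustrative iteration of the difference system (46) on T = Z (Case 2 parameters).
import numpy as np

def G(x):
    return 0.5 * (abs(x + 1.0) - abs(x - 1.0))

N = 60
XP, XN = np.zeros(N + 1), np.zeros(N + 1)
XP[:3] = 0.02            # arbitrary constant history covering the delayed indices
XN[:3] = -0.01
for n in range(2, N):
    aP = aN = 0.5
    kP = kN = rP = rN = 0.01
    w = 0.1
    IP, IN = 1.0 + np.sin(n * np.pi / 3.0), np.cos(n * np.pi / 3.0)
    XP[n + 1] = XP[n] - aP * XP[n] + (kP - rP * XP[n]) * G(w * XP[n - 2] - w * XN[n - 1] + IP)
    XN[n + 1] = XN[n] - aN * XN[n] + (kN - rN * XN[n]) * G(w * XP[n - 2] - w * XN[n - 1] + IN)

# After the transient the orbit is 6-periodic: consecutive blocks of 6 values agree.
print(np.round(XP[-12:-6], 4), np.round(XP[-6:], 4))
```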

5. Concluding Remarks

In this paper, we studied the stability of delayed Wilson-Cowan networks on periodic time scales and obtained general results ensuring the existence, uniqueness, and global exponential stability of the periodic solution. These results give significant insight into the complex dynamical structure of Wilson-Cowan type models. The conditions are easily checked in practice by simple algebraic methods.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This research was supported by the National Natural Science Foundation of China (11101187 and 11361010), the Foundation for Young Professors of Jimei University, the Excellent Youth Foundation of Fujian Province (2012J06001 and NCETFJ JA11144), and the Foundation of Fujian Higher Education (JA10184 and JA11154).

References

[1] H. R. Wilson and J. D. Cowan, "Excitatory and inhibitory interactions in localized populations of model neurons," Biophysical Journal, vol. 12, no. 1, pp. 1–24, 1972.
[2] H. R. Wilson and J. D. Cowan, "A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue," Kybernetik, vol. 13, no. 2, pp. 55–80, 1973.
[3] A. Destexhe and T. J. Sejnowski, "The Wilson-Cowan model, 36 years later," Biological Cybernetics, vol. 101, no. 1, pp. 1–2, 2009.
[4] K. Mantere, J. Parkkinen, T. Jaaskelainen, and M. M. Gupta, "Wilson-Cowan neural-network model in image processing," Journal of Mathematical Imaging and Vision, vol. 2, no. 2-3, pp. 251–259, 1992.
[5] C. van Vreeswijk and H. Sompolinsky, "Chaos in neuronal networks with balanced excitatory and inhibitory activity," Science, vol. 274, no. 5293, pp. 1724–1726, 1996.
[6] L. H. A. Monteiro, M. A. Bussab, and J. G. Berlinck, "Analytical results on a Wilson-Cowan neuronal network modified model," Journal of Theoretical Biology, vol. 219, no. 1, pp. 83–91, 2002.
[7] S. Xie and Z. Huang, "Almost periodic solution for Wilson-Cowan type model with time-varying delays," Discrete Dynamics in Nature and Society, vol. 2013, Article ID 683091, 7 pages, 2013.
[8] V. W. Noonburg, D. Benardete, and B. Pollina, "A periodically forced Wilson-Cowan system," SIAM Journal on Applied Mathematics, vol. 63, no. 5, pp. 1585–1603, 2003.
[9] S. Hilger, "Analysis on measure chains: a unified approach to continuous and discrete calculus," Results in Mathematics, vol. 18, pp. 18–56, 1990.
[10] S. Hilger, "Differential and difference calculus: unified!," Nonlinear Analysis: Theory, Methods & Applications, vol. 30, no. 5, pp. 2683–2694, 1997.
[11] A. Chen and F. Chen, "Periodic solution to BAM neural network with delays on time scales," Neurocomputing, vol. 73, no. 1–3, pp. 274–282, 2009.
[12] Y. Li, X. Chen, and L. Zhao, "Stability and existence of periodic solutions to delayed Cohen-Grossberg BAM neural networks with impulses on time scales," Neurocomputing, vol. 72, no. 7–9, pp. 1621–1630, 2009.
[13] Z. Huang, Y. N. Raffoul, and C. Cheng, "Scale-limited activating sets and multiperiodicity for threshold-linear networks on time scales," IEEE Transactions on Cybernetics, vol. 44, no. 4, pp. 488–499, 2014.
[14] M. Bohner and A. Peterson, Dynamic Equations on Time Scales: An Introduction with Applications, Birkhäuser, Boston, Mass, USA, 2001.
[15] M. Bohner and A. Peterson, Advances in Dynamic Equations on Time Scales, Birkhäuser, Boston, Mass, USA, 2003.
[16] V. Lakshmikantham and A. S. Vatsala, "Hybrid systems on time scales," Journal of Computational and Applied Mathematics, vol. 141, no. 1-2, pp. 227–235, 2002.
[17] A. Ruffing and M. Simon, "Corresponding Banach spaces on time scales," Journal of Computational and Applied Mathematics, vol. 179, no. 1-2, pp. 313–326, 2005.
