International Conference of Soft Computing and Pattern Recognition
Stability of stochastically perturbed Hopfield-type neural networks with mixed delays

Chiraz Jendoubi
Sfax Faculty of Science, University of Sfax, Sfax, Tunisia
[email protected]

Farouk Chérif
ISSATS, University of Sousse, Sousse, Tunisia
[email protected]
Abstract—This paper gives some new and weaker conditions for the stability of square-mean pseudo almost periodic solutions of stochastic Hopfield-type neural networks with mixed delays
$$dx(t) = \Big[-Dx(t) + Af(x(t)) + Bg(x_{\tau}(t)) + \int_{t-\rho}^{t} K(t-s)\, h(x(s))\, ds\Big]\, dt + \sigma(x(t))\, d\omega(t),$$
$$x(t) = \xi(t), \quad -\tau \le t \le 0.$$

Index Terms—Neural networks; quadratic mean almost periodic; global stability; Itô's isometry; mixed delays.

978-1-4799-5934-1/14/$31.00 ©2014 IEEE

I. INTRODUCTION

In recent years, Hopfield neural networks and their various generalizations have attracted the attention of many scientists (e.g., mathematicians, physicists, and computer scientists), due to their potential for classification, associative memory, and parallel computation, and their ability to solve difficult optimization problems ([17], [5], [12], [9]). On the other hand, it is well known that time delays are inevitable in the interactions between neurons: delays arise in the information processing of neurons for various reasons, and they make the dynamic behavior of neural networks more complex. Many phenomena in nature have an oscillatory character, and their mathematical models have led to the introduction of certain classes of functions to describe them ([19], [2], [14], [13]). One such class is that of pseudo almost periodic functions, a natural generalization of the concept of almost periodicity. In this paper, we use the notion of "quadratic mean almost periodic" to describe the coexistence of periodic orbits. To the best of our knowledge, quadratic mean almost periodicity is seldom considered for stochastic neural networks. The notion is also of interest for applications arising in mathematics, physics, and statistics. Motivated by the above discussion, we consider the existence of quadratic mean almost periodic solutions and the global exponential stability of stochastic Hopfield neural networks with delays. By employing Itô's isometry, we present criteria ensuring global exponential stability of the quadratic mean almost periodic solution. The rest of this paper is organized as follows: Section 2 introduces a class of stochastic neural networks with delays together with the related notation, definitions, and lemmas used later; Section 3 presents new criteria ensuring the existence, uniqueness, and global exponential stability of a quadratic mean almost periodic solution. The main results are Theorem II.15 and Theorem III.1.

II. SYSTEM DESCRIPTION AND PRELIMINARIES
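Itô's isometry, $\mathbb{E}\big[(\int_0^T s(t)\, dW_t)^2\big] = \int_0^T \mathbb{E}[s(t)^2]\, dt$, is the main tool used in the stability estimates below. As an aside, it can be verified by Monte Carlo simulation; the following sketch uses a deterministic integrand $s(t) = \cos t$ and illustrative discretization parameters (none of these values come from the paper):

```python
import numpy as np

# Monte Carlo check of Ito's isometry: E[(int_0^T s(t) dW_t)^2] = int_0^T s(t)^2 dt
# for the deterministic integrand s(t) = cos(t). All parameters are illustrative.
rng = np.random.default_rng(0)
T, n_steps, n_paths = 1.0, 500, 20_000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps, endpoint=False)   # left endpoints of the grid
s = np.cos(t)                                      # integrand sampled on the grid

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))  # Brownian increments
ito_integral = (s * dW).sum(axis=1)                # Euler sum for int s dW, per path

lhs = (ito_integral ** 2).mean()                   # E[(int s dW)^2], Monte Carlo
rhs = (s ** 2).sum() * dt                          # int_0^1 cos^2(t) dt, Riemann sum
print(lhs, rhs)                                    # the two values agree up to sampling error
```

The exact right-hand side is $\int_0^1 \cos^2 t\, dt = \tfrac{1}{2} + \tfrac{\sin 2}{4} \approx 0.727$; the Monte Carlo estimate of the left-hand side matches it to within sampling error.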
In this paper, we are concerned with the following stochastic Hopfield-type neural networks with mixed delays:
$$dx_i(t) = \Big[-d_i x_i(t) + \sum_{j=1}^{n} a_{ij}(t) f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}(t) g_j(x_j(t-\tau_j)) + \sum_{j=1}^{n} c_{ij}(t) \int_{t-\rho}^{t} k_j(t-s)\, h_j(x_j(s))\, ds\Big]\, dt + \sum_{j=1}^{n} \sigma_{ij}(x_j(t))\, d\omega_j(t), \tag{II.1}$$
$$x_i(t) = \xi_i(t), \quad -\tau \le t \le 0, \quad 1 \le i \le n.$$
Here n ≥ 2 is the number of neurons in the network; xᵢ(t) is the state variable of the ith neuron at time t; fⱼ(xⱼ(t)), gⱼ(xⱼ(t)) and hⱼ(xⱼ(t)) denote the output of the jth unit at time t; dᵢ represents the rate with which the ith unit resets its potential to the resting state in isolation when disconnected from the network and the external stochastic perturbation; for all 1 ≤ j ≤ n, the positive constant τⱼ corresponds to the transmission delay along the axon of the jth unit; aᵢⱼ(t) weights the strength of the jth unit on the ith unit at time t; bᵢⱼ(t) weights the strength of the jth unit on the ith unit at time (t − τⱼ); cᵢⱼ(t) denotes the distributively delayed connection weights; and kⱼ is the kernel. Besides, ω(t) = (w₁(t), ..., wₙ(t))ᵀ is an n-dimensional Brownian motion defined on a complete probability space (Ω, F, P) with a natural filtration {Fₜ}_{t≥0} (i.e., Fₜ = σ{ω(s), 0 ≤ s ≤ t}), where Ω is the canonical space generated by ω(t), F is the σ-algebra generated by {ω(s)}, and P is the probability measure. Furthermore, the initial conditions xᵢ(t) = ξᵢ(t), 1 ≤ i ≤ n, are continuous on [−τ, 0]. Note that ξ(·) is a C([−τ, 0]; Rⁿ)-valued function which is F₀-measurable.
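As a concrete illustration of system (II.1), the network can be integrated with an Euler–Maruyama scheme, with the discrete and distributed delays handled through a stored state history. All weights, delays, kernel, and activation choices below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

# Euler-Maruyama sketch of the delayed stochastic Hopfield network (II.1).
# Every numerical choice here (weights, delays, kernel, activations, sigma)
# is an illustrative assumption; the paper works with general coefficients.
rng = np.random.default_rng(1)
n, dt, n_steps = 2, 1e-3, 5000
tau_steps, rho_steps = 100, 50                 # delay tau_j and kernel window rho, in steps

D = np.array([1.0, 1.2])                       # decay rates d_i > 0
A = 0.3 * rng.standard_normal((n, n))          # a_ij: instantaneous weights
B = 0.2 * rng.standard_normal((n, n))          # b_ij: discretely delayed weights
C = 0.1 * rng.standard_normal((n, n))          # c_ij: distributively delayed weights
k = np.exp(-dt * np.arange(rho_steps))         # kernel k_j(u) = e^{-u} on the window
f = g = h = np.tanh                            # Lipschitz activation functions
sigma = lambda x: 0.05 * x                     # linear diffusion with sigma(0) = 0

x = np.zeros((n_steps + tau_steps, n))
x[:tau_steps] = 0.5                            # constant initial segment xi on [-tau, 0]
for m in range(tau_steps, n_steps + tau_steps - 1):
    hist = h(x[m - rho_steps:m])               # history window for the distributed delay
    conv = (k[::-1, None] * hist).sum(axis=0) * dt   # int_{t-rho}^{t} k(t-s) h(x(s)) ds
    drift = -D * x[m] + A @ f(x[m]) + B @ g(x[m - tau_steps]) + C @ conv
    dW = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments d(omega_j)
    x[m + 1] = x[m] + drift * dt + sigma(x[m]) * dW

print(x[-1])                                   # state after the simulated horizon
```

With positive decay rates and bounded activations, the simulated trajectories stay bounded, which is consistent with the stability conditions developed later in the paper.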
Besides, ξ(·) ∈ L²_{F₀}([−τ, 0]; Rⁿ), the family of F₀-measurable, square-integrable C([−τ, 0]; Rⁿ)-valued random variables.

Throughout this paper, (Ω, F, P) is supposed to be a probability space, and L²(P, Rⁿ) stands for the space of all strongly measurable, square-integrable Rⁿ-valued random variables x. Note that L²(P, Rⁿ) is a Banach space when equipped with the norm
$$\|f\|_{L^2} = \big(\mathbb{E}\|f\|^2\big)^{1/2},$$
where the expectation $\mathbb{E}$ is defined by
$$\mathbb{E}[g] = \int_{\Omega} g(\omega)\, dP(\omega).$$

Definition II.1. A stochastic process f(·) : R → L²(P, Rⁿ) is said to be stochastically bounded if there exists M > 0 such that for all t ∈ R,
$$\mathbb{E}\|f(t)\|^2 < M.$$

Definition II.2. A stochastic process f(·) : R → L²(P, Rⁿ) is said to be stochastically continuous whenever
$$\lim_{t \to s} \mathbb{E}\|f(t) - f(s)\|^2 = 0.$$
Denote by SBC(R, L²(P, Rⁿ)) the collection of stochastically bounded and continuous processes.

Remark II.3. It is easy to check that SBC(R, L²(P, Rⁿ)) is a Banach space with the norm
$$\|f\|_{\infty} = \sup_{t \in \mathbb{R}} \big(\mathbb{E}\|f(t)\|^2\big)^{1/2}, \quad \text{where } \mathbb{E}\|f\|^2 = \int_{\Omega} \|f\|^2\, dP.$$

Definition II.4. A continuous stochastic process f(·) : R → L²(P, Rⁿ) is said to be quadratic mean almost periodic when the following property is satisfied:
$$\forall \varepsilon > 0,\ \exists l_{\varepsilon} > 0,\ \forall \alpha \in \mathbb{R},\ \exists \tau \in [\alpha, \alpha + l_{\varepsilon}[\,, \quad \sup_{t \in \mathbb{R}} \mathbb{E}\|f(t+\tau) - f(t)\|^2 < \varepsilon.$$
Denote by AP(R, L²(P, Rⁿ)) the collection of all stochastic processes f(·) : R → L²(P, Rⁿ) which are quadratic mean almost periodic.

Remark II.5. As for classical almost periodic functions, the real τ will be called an ε-translation of f(·).

Proposition II.6. ([7]) For an almost periodic stochastic process f(·) : R → L²(P, Rⁿ), one has:
i) the mapping t → E‖f(t)‖² is uniformly continuous;
ii) AP(R, L²(P, Rⁿ)) ⊂ SBC(R, L²(P, Rⁿ)) is a closed subspace;
iii) (AP(R, L²(P, Rⁿ)), ‖·‖∞) is a Banach space.

Define the class of functions SBC₀(R, L²(P, Rⁿ)) as follows:
$$SBC_0\big(\mathbb{R}, L^2(P, \mathbb{R}^n)\big) = \Big\{ f \in SBC\big(\mathbb{R}, L^2(P, \mathbb{R}^n)\big) : \lim_{T \to +\infty} \frac{1}{2T} \int_{-T}^{T} \mathbb{E}\|f(t)\|^2\, dt = 0 \Big\}.$$

Remark II.7. It is easy to check that SBC₀(R, L²(P, Rⁿ)) is a closed linear subspace of SBC(R, L²(P, Rⁿ)); consequently, SBC₀(R, L²(P, Rⁿ)) is a Banach space with the norm ‖x‖∞ = sup_{t∈R} (E‖x(t)‖²)^{1/2}.

Definition II.8. [5] A continuous stochastic process f(·) : R → L²(P, Rⁿ) is said to be quadratic mean pseudo almost periodic when it can be expressed as
$$f = g + \varphi,$$
where g ∈ AP(R, L²(P, Rⁿ)) and ϕ ∈ SBC₀(R, L²(P, Rⁿ)). The collection of such functions will be denoted by PAP(R, L²(P, Rⁿ)).

Remark II.9. The functions g and ϕ in the above definition are called, respectively, the quadratic mean almost periodic component and the ergodic perturbation of the stochastic process f(·). Denote
$$AP(\mathbb{R} \times L^2(P, \mathbb{R}^n), L^2(P, \mathbb{R}^n)) = \{g(t, x) \in AP(\mathbb{R}, L^2(P, \mathbb{R}^n)),\ x \in L^2(P, \mathbb{R}^n)\}$$
and
$$SBC_0(\mathbb{R} \times L^2(P, \mathbb{R}^n), L^2(P, \mathbb{R}^n)) = \{g(t, x) \in SBC_0(\mathbb{R}, L^2(P, \mathbb{R}^n)),\ x \in L^2(P, \mathbb{R}^n)\}.$$

Definition II.10. A jointly continuous function f(t, x) : R × L²(P, Rⁿ) → L²(P, Rⁿ) is said to be quadratic mean pseudo almost periodic in t for any x ∈ L²(P, Rⁿ) when it can be expressed as
$$f = g + \varphi,$$
where
$$g(t, x) \in AP(\mathbb{R} \times L^2(P, \mathbb{R}^n), L^2(P, \mathbb{R}^n)) \quad \text{and} \quad \varphi(t, x) \in SBC_0(\mathbb{R} \times L^2(P, \mathbb{R}^n), L^2(P, \mathbb{R}^n)).$$
The collection of such functions will be denoted by PAP(R × L²(P, Rⁿ), L²(P, Rⁿ)).

Consider the stochastically perturbed Hopfield-type neural network with constant mixed time delays, which is the compact form of (II.1):
$$dx(t) = \Big[-Dx(t) + Af(x(t)) + Bg(x_{\tau}(t)) + \int_{t-\rho}^{t} K(t-s)\, h(x(s))\, ds\Big]\, dt + \sigma(x(t))\, d\omega(t), \quad x(t) = \xi(t),\ -\tau \le t \le 0, \tag{II.2}$$
where D = diag(dᵢ)_{n×n}; the dᵢ are positive constants denoting the rate with which the ith cell resets its potential to the resting state when isolated from the other cells and inputs; A = (aᵢⱼ)_{n×n} and B = (bᵢⱼ)_{n×n}, where aᵢⱼ and bᵢⱼ are the connection weights of the neural network; and τ is the constant transmission delay of the neural network. The activation function f shows how neurons respond to each other. W(u) = (w₁(u₁), w₂(u₂), ..., wₙ(uₙ))ᵀ is a Brownian motion defined on a complete probability space (Ω, F, P) with a natural filtration {Fₜ}_{t≥0}. Here xᵀ denotes the transpose of x, for x ∈ Rⁿ. For the vector x(t) = (x₁(t), ..., xₙ(t))ᵀ and the matrix A, we define the norms
$$|x(t)| = \Big( \sum_{i=1}^{n} |x_i(t)|^2 \Big)^{1/2}, \qquad \|A\| = \sup\{|Ax| : |x| = 1\} = \sqrt{\lambda_{\max}(A^{T} A)},$$
where λ_max(·) (respectively, λ_min(·)) denotes the largest (respectively, smallest) eigenvalue of a matrix. Throughout this paper, for equation (II.2), we suppose the following assumptions:
(H₁)
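The identity $\|A\| = \sup\{|Ax| : |x| = 1\} = \sqrt{\lambda_{\max}(A^{T}A)}$ used in the norm definitions above can be checked numerically; the following sketch uses an arbitrary random matrix (all values illustrative):

```python
import numpy as np

# Check ||A|| = sup{|Ax| : |x| = 1} = sqrt(lambda_max(A^T A)) on a sample matrix.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))                          # arbitrary test matrix

eig_norm = np.sqrt(np.linalg.eigvalsh(A.T @ A).max())    # sqrt of the largest eigenvalue
svd_norm = np.linalg.norm(A, 2)                          # spectral norm via SVD, for comparison

# Monte Carlo lower bound for the sup over the unit sphere |x| = 1
xs = rng.standard_normal((100_000, 4))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)          # project samples onto the sphere
sup_est = np.linalg.norm(xs @ A.T, axis=1).max()         # max |Ax| over the samples

print(eig_norm, svd_norm, sup_est)
```

The eigenvalue formula and the SVD agree exactly, and the sampled supremum approaches them from below, as the definition predicts.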