
Econometric Theory, 18, 2002, 119–139. Printed in the United States of America. © 2002 Cambridge University Press.

THE INVARIANCE PRINCIPLE FOR LINEAR PROCESSES WITH APPLICATIONS

Qiying Wang, Yan-Xia Lin, and Chandra M. Gulati
University of Wollongong

The authors thank a referee, Professor Tanaka, Professor Phillips, and Professor Rayner for their valuable comments, which have led to this much improved version of the paper. Address correspondence to: Qiying Wang, School of Mathematics and Statistics, University of Wollongong, NSW 2522, Australia; e-mail: qiying@uow.edu.au.

Let $X_t$ be a linear process defined by $X_t=\sum_{k=0}^{\infty}c_ke_{t-k}$, where $\{c_k,k\ge0\}$ is a sequence of real numbers and $\{e_k,k=0,\pm1,\pm2,\ldots\}$ is a sequence of random variables. Two basic results, on the invariance principle for the partial sum process of the $X_t$ converging to a standard Wiener process on $[0,1]$, are presented in this paper. In the first result, we assume that the innovations $e_k$ are independent and identically distributed random variables but do not require $\sum_{k=0}^{\infty}|c_k|<\infty$. We note that, for the partial sum process of the $X_t$ to converge to a standard Wiener process, the condition $\sum_{k=0}^{\infty}|c_k|<\infty$ or stronger conditions are commonly used in previous research. The second result is for the situation where the innovations $e_k$ form a martingale difference sequence. For this result, the commonly used assumption of equal variance of the innovations $e_k$ is weakened. We apply these general results to unit root testing. It turns out that the limit distributions of the Dickey–Fuller test statistic and the Kwiatkowski, Phillips, Schmidt, and Shin (KPSS) test statistic still hold for the more general models under very weak conditions.

1. INTRODUCTION

Let $\{X_t,t\ge1\}$ be a sequence of random variables with $EX_t=0$. Let

$$S_n=\sum_{t=1}^nX_t\qquad\text{and}\qquad\sigma_n^2=\operatorname{Var}(S_n).$$

We denote by $\Rightarrow$ the weak convergence of probability measures in $D[0,1]$, where $D[0,1]$ is the space of all right-continuous real-valued functions having finite left limits on $[0,1]$, endowed with the sup norm. Under appropriate conditions, it is well known that

$$\frac{S_{[nt]}}{\sigma_n}\Rightarrow W(t),\qquad0\le t\le1,\tag{1}$$

where $W(t)$ is a standard Wiener process on $[0,1]$ and $[nt]$ denotes the integer part of $nt$. A result of the form (1) is commonly called the invariance principle or the functional limit theorem.


It is quite useful in characterizing the limit distributions of various statistics arising from inference in economic time series. To elaborate, consider a stochastic process generated according to

$$y_t=\alpha y_{t-1}+X_t,\qquad t=1,2,\ldots,\tag{2}$$

where $y_0$ is a constant with probability one or has a certain specified distribution. Denote the ordinary least squares (OLS) estimator of $\alpha$ by $\hat\alpha_n=\sum_{t=1}^ny_ty_{t-1}\big/\sum_{t=1}^ny_{t-1}^2$. To test $\alpha=1$ against $\alpha<1$, a key step is to derive the limit distribution of the well-known DF (Dickey–Fuller) test statistic (Dickey and Fuller, 1979):

$$n(\hat\alpha_n-1)=\Big\{n^{-1}\sum_{t=1}^ny_{t-1}(y_t-y_{t-1})\Big\}\Big/\Big\{n^{-2}\sum_{t=1}^ny_{t-1}^2\Big\}.\tag{3}$$

As shown by Phillips (1987), under the null hypothesis $\alpha=1$ the asymptotic properties of the DF test statistic rely heavily on an invariance principle of the form (1).

Over the past decades, under different assumptions on $X_t$, many articles have discussed the invariance principle of the form (1). Here, we cite two basic textbooks, Billingsley (1968) and Hall and Heyde (1980), for collections of related results for independent random variables and martingale difference sequences; the review paper on mixing sequences by Peligrad (1986); and Peligrad's more recent work (Peligrad, 1998). For more general mixingale sequences, we refer to McLeish (1975, 1977) and Truong-Van (1995).

In this paper, we restrict our attention to linear processes, an important case in economic time series. In what follows, we always assume that

$$X_t=\sum_{k=0}^\infty c_ke_{t-k},\tag{4}$$

where $\{c_k,k\ge0\}$ is a sequence of real numbers and the innovations $e_k$, $k=0,\pm1,\pm2,\ldots$, are random variables specified later.

On the invariance principle of the form (1) for linear processes, this paper establishes two basic results. In the first result, we assume that the innovations $e_k$ are independent and identically distributed (i.i.d.) random variables, but the condition $\sum_{k=0}^\infty|c_k|<\infty$ (or stronger conditions), commonly used in previous research by Hannan (1979), Stadtmüller and Trautner (1985), and Phillips and Solo (1992) (see also Tanaka, 1996) and also by Yokoyama (1995), is weakened. Only finite second moments for the $e_k$ are required in this paper. This is an essential improvement of the earlier related result of Davydov (1970). The second result is for the situation where the innovations $e_k$ form a martingale difference sequence. In this result, the commonly used assumption that the innovations $e_k$ have the same variance is weakened. This will be of interest to researchers from a practical point of view.
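As a purely illustrative aside (not part of the original analysis), the following minimal Python sketch simulates a truncated linear process of the form (4) with i.i.d. innovations and evaluates the normalized partial-sum process on a grid; the geometric coefficients $c_k=0.7^k$ and the truncation lag `K` are assumptions chosen only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truncated coefficient sequence c_k; the geometric choice c_k = 0.7**k and the
# truncation lag K are assumptions made only for this illustration.
K = 200
c = 0.7 ** np.arange(K + 1)

n = 5000
e = rng.standard_normal(n + K)           # i.i.d. innovations e_{1-K}, ..., e_n
X = np.convolve(e, c, mode="valid")      # X_t = sum_{k=0}^{K} c_k e_{t-k}, t = 1, ..., n

# Normalized partial-sum process S_[nt] / sigma_n with sigma_n^2 = n (sum_k c_k)^2,
# the scaling used below in (8) for absolutely summable coefficients.
S = np.cumsum(X)
sigma_n = np.sqrt(n) * c.sum()
t_grid = np.linspace(0.1, 1.0, 10)
print(np.round(S[(t_grid * n).astype(int) - 1] / sigma_n, 3))
```

Under absolutely summable coefficients, the printed values should behave like a discretized path of a standard Wiener process, which is exactly the statement made precise in the next section.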


We give the statements of the main theorems and detailed remarks on previous results in the next section. In Section 3, the applications to the Dickey–Fuller test statistic and the Kwiatkowski, Phillips, Schmidt, and Shin (KPSS) test statistic are discussed. We find that these important statistics still have similar limit distributions for the more general models under quite weak conditions. In Section 4, some general conclusions are drawn. Finally, in Section 5, we give the proofs of the main theorems.

2. MAIN RESULTS AND REMARKS

For brevity, we write $a_n\sim b_n$ when $\lim_{n\to\infty}a_n/b_n=1$, and $A$, with or without subscript, denotes a positive constant. In Theorem 2.1, which follows, we assume that the innovations $e_k$ are i.i.d. random variables but, to cover some interesting cases, the $c_k$ are rather general. Write, for $j=1,2,3,\ldots$,

$$\psi_j=\sum_{k=0}^{j-1}c_k\qquad\text{and}\qquad s_n^2=\sum_{j=1}^n\psi_j^2.$$

THEOREM 2.1. Let $e_k$, $k=0,\pm1,\pm2,\ldots$, be i.i.d. random variables with $Ee_0=0$ and $Ee_0^2=1$. Assume that $c_0\ne0$,

$$\frac{1}{s_n}\max_{1\le j\le n}|\psi_j|\to0\qquad\text{and}\qquad\sum_{j=0}^n\Big(\sum_{k=j}^\infty c_k^2\Big)^{1/2}=o(s_n).\tag{5}$$

Under these assumptions, we have that

$$\frac{1}{s_n}\sum_{j=1}^{k_n(t)}X_j\Rightarrow W(t),\qquad0\le t\le1,\tag{6}$$

where $k_n(t)=\sup\{m:s_m^2\le ts_n^2\}$.

In particular, if $c_k=k^{-1}\ell(k)$, where $\ell(0)/0\equiv1$ and the positive function $\ell(k)$ is slowly varying at infinity satisfying $\sum_{k=1}^\infty k^{-1}\ell(k)=\infty$, then

$$\frac{1}{\sqrt n\sum_{k=1}^nk^{-1}\ell(k)}\sum_{j=1}^{k_n(t)}X_j\Rightarrow W(t),\qquad0\le t\le1,\tag{7}$$

where $k_n(t)$ is defined as in (6).

If $0<|\sum_{k=0}^\infty c_k|<\infty$ and $\sum_{k=1}^\infty kc_k^2<\infty$, or $\sum_{k=0}^\infty|c_k|<\infty$ and $\sum_{k=0}^\infty c_k\ne0$, then

$$\frac{1}{\sigma_n}\sum_{j=1}^{[nt]}X_j\Rightarrow W(t),\qquad0\le t\le1,\tag{8}$$

where $\sigma_n^2=n\big(\sum_{k=0}^\infty c_k\big)^2$.

Remark 2.1. It follows from Hall (1992, p. 118) that

$$\sigma_n^2=\operatorname{Var}\Big(\sum_{j=1}^nX_j\Big)\sim n\Big(\sum_{k=1}^nk^{-1}\ell(k)\Big)^2,$$

provided $c_k=k^{-1}\ell(k)$ and $\sum_{k=1}^\infty k^{-1}\ell(k)=\infty$. Hence, we can replace $\sqrt n\sum_{k=1}^nk^{-1}\ell(k)$ by $\sigma_n$ in (7). It is unclear whether or not $s_n$ in (6) can be replaced by $\sigma_n$.
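To give a numerical feel for the normalization appearing in (7) and Remark 2.1, the following sketch (an illustration added here, not from the original paper) takes $\ell(k)\equiv1$, so that $c_0=1$ and $c_k=1/k$, truncates the infinite sum at an assumed lag `K`, and compares the Monte Carlo variance of $S_n$ with the squared normalizer.

```python
import numpy as np

rng = np.random.default_rng(1)

# c_0 = 1 and c_k = 1/k for k >= 1: the case l(k) == 1 of c_k = k^{-1} l(k).
# The truncation lag K and the Monte Carlo sizes are assumptions for the example.
K, n, reps = 2000, 1000, 300
c = np.concatenate(([1.0], 1.0 / np.arange(1, K + 1)))
norm = np.sqrt(n) * np.sum(1.0 / np.arange(1, n + 1))   # sqrt(n) * sum_{k<=n} k^{-1} l(k)

S_n = np.empty(reps)
for r in range(reps):
    e = rng.standard_normal(n + K)
    X = np.convolve(e, c, mode="valid")                 # X_t = sum_k c_k e_{t-k}
    S_n[r] = X.sum()

# Remark 2.1: Var(S_n) ~ n (sum_{k<=n} k^{-1} l(k))^2, so this ratio should be of order one.
print(S_n.var() / norm**2)
```

The printed ratio should settle toward one as $n$ grows; at moderate sample sizes it typically lands somewhat above one because of the truncation and the slowly varying correction terms.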

Remark 2.2. Let $c_k=k^{-\alpha}$, where $\tfrac12<\alpha<1$. It is easy to show that the second condition of (5) fails to hold. In this case, we also know that $(1/s_n)\sum_{j=1}^{k_n(t)}X_j$ fails to converge to $W(t)$. In fact, by applying Liu (1998) (see also Marinucci and Robinson, 1998), $(1/\sigma_n)\sum_{j=1}^{[nt]}X_j$ converges to a fractional Brownian motion with $d=1-\alpha$. Therefore, for the partial sum process of the $X_j$ to converge to a standard Wiener process, condition (5) is close to being necessary.

Remark 2.3. The conditions given in this theorem are different from those given by Davydov (1970). Specifically, Theorem 2.1 abolishes the condition $Ee_0^4<\infty$, which is an essential improvement of Davydov's result with respect to the moment condition.

In the next theorem, the i.i.d. assumption on the innovations $e_k$ is weakened to their being a martingale difference sequence. In this case, an excellent result is given by Hannan (1979), where it is required that $Ee_k^2=\sigma^2$ and $\lim_{n\to\infty}E(e_k^2\mid\mathcal F_{k-n})=\sigma^2$ a.s. ($\mathcal F_k$ is defined as in Theorem 2.2, which follows) for all $k$. In Theorem 2.2, these conditions are moderated. Our Corollary 2.1, which follows Theorem 2.2, also improves Theorem 3.15 in Phillips and Solo (1992), where the authors assumed $\sum_{k=0}^\infty k|c_k|<\infty$ and that $\{e_k\}$ is an s.u.i. (strongly uniformly integrable; the definition can be found in Billingsley, 1968, p. 32) martingale difference sequence.

THEOREM 2.2. Let the $c_k$ satisfy

$$b_0\equiv\sum_{k=0}^\infty c_k\ne0,\qquad\sum_{k=0}^\infty|c_k|<\infty,\qquad\text{and}\qquad\sum_{k=1}^\infty kc_k^2<\infty.$$

Let the $e_k$ be random variables such that

$$E(e_k\mid\mathcal F_{k-1})=0,\quad\text{a.s.},\quad k=0,\pm1,\pm2,\ldots,$$

where $\mathcal F_k$ is the $\sigma$-field generated by $\{e_j,j\le k\}$. If

$$\sup_{n\ge1}\frac1n\sum_{k=-n}^nEe_k^2<\infty,\qquad\inf_{n\ge1}\frac1n\sum_{k=1}^nEe_k^2>0;\tag{9}$$


and, as $n\to\infty$,

$$\frac1n\sum_{k=1}^n(e_k^2-Ee_k^2)\to0,\quad\text{in probability};\tag{10}$$

$$\frac1n\sum_{k=-n}^nEe_k^2I(|e_k|\ge\delta\sqrt n)\to0,\quad\text{for any }\delta>0,\tag{11}$$

then

$$\frac{1}{s_n^*}\sum_{j=1}^{k_n^*(t)}X_j\Rightarrow W(t),\qquad0\le t\le1,\tag{12}$$

where $s_n^{*2}=b_0^2\sum_{k=1}^nEe_k^2$ and $k_n^*(t)=\sup\{j:\sum_{k=1}^jEe_k^2\le t\sum_{k=1}^nEe_k^2\}$.

From Theorem 2.2, we obtain the following corollary.

COROLLARY 2.1. If conditions (9)–(11) in Theorem 2.2 are replaced by one of the following conditions (a)–(c), then (12) still holds.

(a) $\{e_k^2\}$ is uniformly integrable and $E(e_k^2\mid\mathcal F_{k-1})=\sigma^2>0$ for all $k\ge1$;
(b) $\{e_k^2\}$ is s.u.i. and $(1/n)\sum_{k=1}^nE(e_k^2\mid\mathcal F_{k-1})\to\sigma^2>0$, in probability;
(c) $E(\sup_ke_k^2)<\infty$ and $(1/n)\sum_{k=1}^nE(e_k^2\mid\mathcal F_{k-1})\to\sigma^2>0$, in probability.

Proof. If condition (a) holds, then $Ee_k^2=E\{E(e_k^2\mid\mathcal F_{k-1})\}=\sigma^2$, and (12) follows immediately from Theorem 2.2 by using Lemma 5.5 (from Section 5). If condition (b) holds, it follows from Lemma 5.5 that

$$\frac1n\sum_{k=-n}^nEe_k^2I(|e_k|\ge\delta\sqrt n)\to0\qquad\text{and}\qquad\frac1n\sum_{k=1}^ne_k^2\to\sigma^2,\quad\text{in probability.}\tag{13}$$

On the other hand, it is known that $\{(1/n)\sum_{k=1}^ne_k^2\}$ is s.u.i. if $\{e_k^2\}$ is s.u.i. (Chow and Teicher, 1988, p. 102). This fact, together with the second relation of (13), implies that (Chow and Teicher, 1988, p. 100)

$$\frac1n\sum_{k=1}^nEe_k^2\to\sigma^2.\tag{14}$$

In terms of (13) and (14), it is easy to check that all conditions in Theorem 2.2 are satisfied and hence (12) holds. Finally, if condition (c) holds, (12) follows immediately because $E(\sup_ke_k^2)<\infty$ implies that $\{e_k^2\}$ is s.u.i. ∎


3. APPLICATIONS

In this section, we discuss applications of the preceding results to time series. First, we assume that the process $\{y_t\}$ is generated by (2) with $\alpha=1$. Phillips (1987) investigated the limit behavior of the DF test statistic $n(\hat\alpha_n-1)$ defined by (3) provided $\{X_t\}$ is a strong mixing sequence with appropriate mixing conditions. Here, we assume that $\{X_t\}$ satisfies (4); that is, $\{X_t\}$ forms a linear process. Under quite general conditions on $c_k$ and $e_k$ in (4), it is shown that the DF test statistic $n(\hat\alpha_n-1)$ has a limit distribution similar to that in Phillips and Xiao (1998), where the authors obtained the limit distribution of $n(\hat\alpha_n-1)$ provided $\sum_{k=0}^\infty k^{1/2}|c_k|<\infty$.

THEOREM 3.1. Let $e_k$, $k=0,\pm1,\pm2,\ldots$, be i.i.d. random variables with $Ee_0=0$ and $Ee_0^2=\sigma^2$. If $0<|\sum_{k=0}^\infty c_k|<\infty$ and $\sum_{k=1}^\infty kc_k^2<\infty$, or $\sum_{k=0}^\infty|c_k|<\infty$ and $\sum_{k=0}^\infty c_k\ne0$, then as $n\to\infty$,

(a) $(1/n^2)\sum_{t=1}^ny_{t-1}^2\Rightarrow\sigma^2b_0^2\int_0^1W(r)^2dr$;
(b) $(1/n)\sum_{t=1}^ny_{t-1}(y_t-y_{t-1})\Rightarrow(\sigma^2b_0^2/2)\big(W(1)^2-\gamma\big)$;
(c) $n(\hat\alpha_n-1)\Rightarrow\tfrac12\big(W(1)^2-\gamma\big)\big/\int_0^1W(r)^2dr$;
(d) $\hat\alpha_n\to1$, in probability;
(e) $t_\alpha\Rightarrow\tfrac12\gamma^{-1/2}\big(W(1)^2-\gamma\big)\big/\big\{\int_0^1W(r)^2dr\big\}^{1/2}$,

where

$$b_0=\sum_{k=0}^\infty c_k,\qquad\gamma=\sum_{k=0}^\infty c_k^2\big/b_0^2,\qquad\hat\alpha_n=\sum_{t=1}^ny_ty_{t-1}\Big/\sum_{t=1}^ny_{t-1}^2,$$
$$\delta_n^2=\frac1n\sum_{t=1}^n(y_t-\hat\alpha_ny_{t-1})^2,\qquad\text{and}\qquad t_\alpha=\Big(\sum_{t=1}^ny_{t-1}^2\Big)^{1/2}(\hat\alpha_n-1)\big/\delta_n.$$

As in Phillips (1987), the proof of Theorem 3.1 may be obtained by applying Theorem 2.1. The details are omitted.

The limit distribution given in Theorem 3.1 depends on the unknown parameter

$$\gamma=\sum_{k=0}^\infty c_k^2\Big/\Big(\sum_{k=0}^\infty c_k\Big)^2.$$

As in Phillips (1987, p. 285), we can construct an estimate of $\gamma$ as follows:

$$\hat\gamma=\hat\sigma_n^{*2}\big/\hat\sigma_n^2,\qquad\text{where }\hat\sigma_n^{*2}=\frac1n\sum_{t=1}^nX_t^2$$

and $\hat\sigma_n^2=(1/n)\sum_{t=1}^nX_t^2+(2/n)\sum_{r=1}^{l_n}\sum_{t=r+1}^nX_tX_{t-r}$. Here and subsequently, $\{l_n,n\ge1\}$ denotes a sequence of positive real numbers. The following theorem shows that $\hat\gamma$ is a consistent estimate of $\gamma$ for any $l_n$ satisfying $l_n=o(n)$ and $l_n\to\infty$.
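As a concrete companion to Theorems 3.1 and 3.2 (added here purely for illustration), the sketch below computes $n(\hat\alpha_n-1)$, $t_\alpha$, and the estimator $\hat\gamma=\hat\sigma_n^{*2}/\hat\sigma_n^2$ from simulated data generated under $\alpha=1$; the coefficient choice $c_k=0.6^k$ and the rule $l_n=\lfloor n^{2/3}\rfloor$ are assumptions of the example, the latter being one admissible $l_n=o(n)$ choice.

```python
import numpy as np

rng = np.random.default_rng(3)

# y_t = y_{t-1} + X_t with X_t a linear process (the null alpha = 1).
# The coefficients c_k = 0.6**k and the rule l_n = floor(n^(2/3)) are assumed choices.
K, n = 100, 2000
c = 0.6 ** np.arange(K + 1)
X = np.convolve(rng.standard_normal(n + K), c, mode="valid")
y = np.cumsum(X)                                     # y_0 = 0

y_lag = np.concatenate(([0.0], y[:-1]))
alpha_hat = np.dot(y, y_lag) / np.dot(y_lag, y_lag)
df_stat = n * (alpha_hat - 1.0)                      # DF statistic n(alpha_hat - 1)

delta2 = np.mean((y - alpha_hat * y_lag) ** 2)       # delta_n^2
t_alpha = np.sqrt(np.dot(y_lag, y_lag)) * (alpha_hat - 1.0) / np.sqrt(delta2)

# gamma_hat = sigma*_n^2 / sigma_n^2 with lag truncation l_n = o(n).
l_n = int(n ** (2.0 / 3.0))
sigma_star2 = np.mean(X ** 2)
sigma2 = sigma_star2 + 2.0 / n * sum(np.dot(X[r:], X[:-r]) for r in range(1, l_n + 1))
gamma_hat = sigma_star2 / sigma2

print(round(df_stat, 3), round(t_alpha, 3), round(gamma_hat, 3))
```

For this choice of $c_k$, $\gamma=\sum_kc_k^2/(\sum_kc_k)^2=0.25$, so $\hat\gamma$ should print near 0.25, in line with Theorem 3.2.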


THEOREM 3.2. Let $e_k$, $k=0,\pm1,\pm2,\ldots$, be i.i.d. random variables with $Ee_0=0$ and $Ee_0^2=\sigma^2$.

(a) If $\sum_{k=0}^\infty c_k^2<\infty$, then $\hat\sigma_n^{*2}/\sigma^2\to\sum_{k=0}^\infty c_k^2$, a.s.
(b) If $\sum_{k=0}^\infty|c_k|<\infty$, then for any $l_n$ satisfying $l_n=o(n)$ and $l_n\to\infty$, $\hat\sigma_n^2/\sigma^2\to\big(\sum_{k=0}^\infty c_k\big)^2$, in probability.

Proof. Noting that $\{X_t,t\ge1\}$ is a stationary ergodic sequence with $EX_1^2=\sigma^2\sum_{k=0}^\infty c_k^2<\infty$, it follows from the stationary ergodic theorem (Stout, 1974, p. 181) that $\hat\sigma_n^{*2}\to\sigma^2\sum_{k=0}^\infty c_k^2$, a.s. This proves part (a).

It is well known (Brockwell and Davis, 1987, p. 212) that

$$\frac1nE\Big(\sum_{t=1}^nX_t\Big)^2=\frac1n\sum_{t=1}^nEX_t^2+\frac2n\sum_{r=1}^{n-1}\sum_{t=r+1}^nEX_tX_{t-r}\to\sigma^2\Big(\sum_{k=0}^\infty c_k\Big)^2.\tag{15}$$

This fact, together with part (a), implies that, to prove $\hat\sigma_n^2/\sigma^2\to\big(\sum_{k=0}^\infty c_k\big)^2$ in probability, it suffices to show that

$$\frac1n\sum_{r=1}^{l_n}\sum_{t=r+1}^n(X_tX_{t-r}-EX_tX_{t-r})\to0,\quad\text{in probability};\tag{16}$$

$$\frac1n\sum_{r=l_n+1}^{n-1}\sum_{t=r+1}^nEX_tX_{t-r}\to0.\tag{17}$$

The proofs of (16) and (17) appear in the Appendix. ∎

If $\sum_{k=0}^\infty c_k=\infty$, the results differ from those in Theorem 3.1. In this case, we find that the limit distribution of the DF test statistic $n(\hat\alpha_n-1)$ is free of unknown parameters, but $t_\alpha$ diverges to $\infty$ in probability. Explicitly, we obtain the following theorem.

THEOREM 3.3. Let $e_k$, $k=0,\pm1,\pm2,\ldots$, be i.i.d. random variables with $Ee_0=0$ and $Ee_0^2=\sigma^2$. If $c_k=k^{-1}\ell(k)$, where $\ell(0)/0\equiv1$ and the positive function $\ell(k)$ is slowly varying at infinity satisfying $\sum_{k=1}^\infty k^{-1}\ell(k)=\infty$, then

(a) $(n\psi_n)^{-2}\sum_{t=1}^ny_{t-1}^2\Rightarrow\sigma^2\int_0^1W(r)^2dr$;
(b) $(\sqrt n\,\psi_n)^{-2}\sum_{t=1}^ny_{t-1}(y_t-y_{t-1})\Rightarrow(\sigma^2/2)W(1)^2$;
(c) $n(\hat\alpha_n-1)\Rightarrow\tfrac12W(1)^2\big/\int_0^1W(r)^2dr$;
(d) $t_\alpha\to\infty$, in probability,

where $\psi_n=\sum_{k=1}^nk^{-1}\ell(k)$, and $\hat\alpha_n$ and $t_\alpha$ are defined as in Theorem 3.1.

Proof. Recall that

$$s_n^2=\sum_{j=1}^n\psi_j^2,\qquad k_n(t)=\sup\{m:s_m^2\le ts_n^2\},$$


and that the process $\{y_t\}$ is defined by (2). If $\alpha=1$, then $y_t=\sum_{j=1}^tX_j$ (without loss of generality, here and subsequently, we assume $y_0=0$), and hence

$$y_{t-1}^2=\frac{s_n^2}{\psi_t^2}\int_{s_{t-1}^2/s_n^2}^{s_t^2/s_n^2}\Big(\sum_{j=1}^{k_n(r)}X_j\Big)^2dr.$$

Therefore, we obtain that

$$\sum_{t=1}^ny_{t-1}^2=\sum_{t=1}^n\frac{\psi_t^2}{\psi_n^2}\,y_{t-1}^2+\sum_{t=1}^n\psi_t^2\Big(\frac{1}{\psi_t^2}-\frac{1}{\psi_n^2}\Big)y_{t-1}^2
=\frac{s_n^2}{\psi_n^2}\int_0^1\Big(\sum_{j=1}^{k_n(r)}X_j\Big)^2dr+R_n
=\frac{s_n^4}{\psi_n^2}\int_0^1\Big(\frac{1}{s_n}\sum_{j=1}^{k_n(r)}X_j\Big)^2dr+R_n,\quad\text{say.}$$

It follows from Theorem 2.1 and the continuous mapping theorem (see Billingsley, 1968, Sect. 5) that

$$\int_0^1\Big(\frac{1}{s_n}\sum_{j=1}^{k_n(r)}X_j\Big)^2dr\Rightarrow\sigma^2\int_0^1W(r)^2dr.$$

This fact, together with Theorem 4.1 of Billingsley (1968, p. 25), implies that part (a) follows if

$$s_n^2=\sum_{j=1}^n\psi_j^2\sim n\psi_n^2\qquad\text{and}\qquad\frac{1}{n^2\psi_n^2}R_n\to0,\ \text{in probability.}\tag{18}$$

Because $\psi_n=\sum_{k=1}^nk^{-1}\ell(k)$ is still a slowly varying function, the first relation of (18) follows from Bingham, Goldie, and Teugels (1987, p. 26). By noting that $Ey_t^2\sim t\psi_t^2$ (recall Remark 2.1), we have that

$$\frac{1}{n^2\psi_n^2}\sum_{t=1}^nEy_{t-1}^2\sim\frac12\qquad\text{and}\qquad\frac{1}{n^2\psi_n^4}\sum_{t=1}^n\psi_t^2Ey_{t-1}^2\sim\frac12,$$

by using the slowly varying property of $\psi_t$. Hence, noting that $\psi_t$ is nondecreasing in $t$, it follows that

$$\frac{1}{n^2\psi_n^2}E|R_n|=\frac{1}{n^2\psi_n^2}\sum_{t=1}^nEy_{t-1}^2-\frac{1}{n^2\psi_n^4}\sum_{t=1}^n\psi_t^2Ey_{t-1}^2\to0.$$

The second relation of (18) now follows from Markov's inequality. The proof of part (a) is complete.

The proof of part (b) follows directly from Theorem 2.1 and part (a) of Theorem 3.2 by noting that $\sum_{t=1}^ny_{t-1}(y_t-y_{t-1})=\tfrac12\big(y_n^2-\sum_{t=1}^nX_t^2\big)$ (see Phillips, 1987, Appendix).


The proof of part (c) is straightforward by applying parts (a) and (b) and the continuous mapping theorem. To prove part (d), we rewrite

$$\delta_n^2=\frac1n\sum_{t=1}^n(y_t-\hat\alpha_ny_{t-1})^2=\frac1n\sum_{t=1}^nX_t^2-\frac{2(\hat\alpha_n-1)}{n}\sum_{t=1}^ny_{t-1}X_t+\frac{(\hat\alpha_n-1)^2}{n}\sum_{t=1}^ny_{t-1}^2.$$

Because $n^{-\delta}\psi_n\to0$ for any $\delta>0$ (see Feller, 1971, p. 277) and $\psi_n\to\infty$, it follows from parts (a)–(c) that for any $\varepsilon>0$, as $n\to\infty$,

$$P\Big(\Big|\frac{\hat\alpha_n-1}{n}\sum_{t=1}^ny_{t-1}X_t\Big|\ge\varepsilon\Big)\le P\big(n|\hat\alpha_n-1|\ge\varepsilon\psi_n\big)+P\Big(\frac{1}{n\psi_n^2}\Big|\sum_{t=1}^ny_{t-1}X_t\Big|\ge n\psi_n^{-3}\Big)\to0,\tag{19}$$

$$P\Big(\frac{(\hat\alpha_n-1)^2}{n}\sum_{t=1}^ny_{t-1}^2\ge\varepsilon\Big)\le P\big(n|\hat\alpha_n-1|\ge\varepsilon\psi_n\big)+P\Big(\frac{1}{n^2\psi_n^2}\sum_{t=1}^ny_{t-1}^2\ge n\psi_n^{-4}\Big)\to0.\tag{20}$$

In view of (19), (20), and part (a) of Theorem 3.2, we have that

$$\delta_n^2\to\sigma^2\sum_{k=0}^\infty c_k^2<\infty,\quad\text{in probability.}$$

Therefore, part (d) follows easily by applying parts (a) and (c). The proof of Theorem 3.3 is complete. ∎

Next we discuss another application of the present results. Let us consider the model

$$y_t=c+r_t+z_t,\qquad t=1,2,\ldots,n.\tag{21}$$

Here $c$ is a constant, $z_t$ is a stationary error, and $r_t$ is a random walk:

$$r_t=r_{t-1}+u_t\quad\text{with }r_0=0,\tag{22}$$

where the $u_t$ are i.i.d. random variables with $Eu_t=0$ and $Eu_t^2=\sigma_u^2$. To test $\sigma_u^2=0$, that is, to test whether the data generating process is stationary, the commonly used statistic (known as the KPSS test statistic) is


$$\hat\eta_u=n^{-2}\sum_{t=1}^nS_t^2\Big/s^2(l_n),\qquad\text{where }S_t=\sum_{j=1}^t\hat e_j,\tag{23}$$

$$s^2(l_n)=\frac1n\sum_{t=1}^n\hat e_t^2+\frac2n\sum_{s=1}^{l_n}\sum_{t=s+1}^n\hat e_t\hat e_{t-s},$$

and $\hat e_t=y_t-(1/n)\sum_{t=1}^ny_t$ is the residual from the regression of $y$ on the intercept $c$. Kwiatkowski, Phillips, Schmidt, and Shin (1992) discussed the asymptotic distribution of $\hat\eta_u$ provided $z_t$ satisfies the (strong mixing) regularity conditions given by Phillips and Perron (1988, p. 336) or the linear process conditions given by Phillips and Solo (1992, Theorems 3.4 and 3.15). One of Phillips and Solo's conditions is $\sum_{k=0}^\infty k^{1/2}|c_k|<\infty$. In this paper, we require only $\sum_{k=0}^\infty|c_k|<\infty$. In particular, we need only $l_n$ satisfying $l_n=o(n)$ and $l_n\to\infty$, which, in practice, provides more choices for $s^2(l_n)$. Therefore, our result is an extension of theirs.
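To make the definition of $\hat\eta_u$ in (23) concrete, here is a minimal sketch (added for illustration and not part of the original paper); the coefficients, the sample size, and the choice $l_n=\lfloor n^{0.6}\rfloor$, one admissible $l_n=o(n)$ rule, are assumptions of the example.

```python
import numpy as np

def kpss_level(y, l_n):
    """KPSS statistic of (23) for the level case, with lag truncation l_n."""
    n = len(y)
    e_hat = y - y.mean()                 # residuals from regressing y on an intercept
    S = np.cumsum(e_hat)
    s2 = np.mean(e_hat ** 2) + 2.0 / n * sum(
        np.dot(e_hat[s:], e_hat[:-s]) for s in range(1, l_n + 1)
    )
    return np.sum(S ** 2) / (n ** 2 * s2)

# Simulated data from model (21) with sigma_u^2 = 0, c = 1, and a linear-process error;
# the coefficients and l_n = floor(n^0.6) are assumed choices for this example.
rng = np.random.default_rng(4)
K, n = 100, 2000
c = 0.5 ** np.arange(K + 1)
X = np.convolve(rng.standard_normal(n + K), c, mode="valid")
y = 1.0 + X
print(round(kpss_level(y, l_n=int(n ** 0.6)), 4))
```

Under the stationary null ($\sigma_u^2=0$) the statistic converges to $\int_0^1V(r)^2dr$ with $V$ a Brownian bridge, whose mean is $1/6$, so repeated runs should scatter around values of that order.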

THEOREM 3.4. Let $e_k$, $k=0,\pm1,\pm2,\ldots$, be i.i.d. random variables with $Ee_0=0$ and $Ee_0^2=\sigma^2$. Assume that the data generating process is given by (21) with

$$z_t=X_t=\sum_{k=0}^\infty c_ke_{t-k}.$$

If $\sum_{k=0}^\infty|c_k|<\infty$ and $\sum_{k=0}^\infty c_k\ne0$, then for any $l_n$ satisfying $l_n=o(n)$ and $l_n\to\infty$,

$$\hat\eta_u\Rightarrow\int_0^1V(r)^2dr,\tag{24}$$

where $V(r)=W(r)-rW(1)$.

Proof. Under the hypothesis $\sigma_u^2=0$, it is well known that $\hat e_t=X_t-(1/n)\sum_{t=1}^nX_t$. By applying Theorem 2.1, we have that, for any $0\le r\le1$,

$$\frac{1}{\sigma_n}S_{[nr]}=\frac{1}{\sigma_n}\sum_{t=1}^{[nr]}X_t-\frac{[nr]}{n\sigma_n}\sum_{t=1}^nX_t\Rightarrow W(r)-rW(1)=V(r),$$

where $\sigma_n^2=n\sigma^2\big(\sum_{k=0}^\infty c_k\big)^2$. Hence, it follows from the continuous mapping theorem that

$$n^{-2}\sum_{t=1}^nS_t^2=\frac1n\int_0^1S_{[nr]}^2\,dr\Rightarrow\sigma^2\Big(\sum_{k=0}^\infty c_k\Big)^2\int_0^1V(r)^2dr.\tag{25}$$

On the other hand, we have that

$$s^2(l_n)=\frac1n\sum_{t=1}^n\hat e_t^2+\frac2n\sum_{s=1}^{l_n}\sum_{t=s+1}^n\hat e_t\hat e_{t-s}=\frac1n\sum_{t=1}^nX_t^2+\frac2n\sum_{s=1}^{l_n}\sum_{t=s+1}^nX_tX_{t-s}+R_{1n},\tag{26}$$


where, after a simple calculation,

$$|R_{1n}|\le\frac{4l_n}{n^2}\Big(\sum_{j=1}^nX_j\Big)^2+\frac{2}{n^2}\sum_{s=1}^{l_n}\Big|\sum_{t=s+1}^n(X_t+X_{t-s})\Big|\,\Big|\sum_{j=1}^nX_j\Big|\le\frac{Cl_n}{n^2}\Big(\sum_{j=1}^nX_j\Big)^2.$$

By noting (15), Markov's inequality implies that $|R_{1n}|\to0$ in probability for any $l_n=o(n)$. Therefore, by using part (b) of Theorem 3.2, we obtain that for any $l_n$ satisfying $l_n=o(n)$ and $l_n\to\infty$,

$$s^2(l_n)\to\sigma^2\Big(\sum_{k=0}^\infty c_k\Big)^2,\quad\text{in probability.}\tag{27}$$

Thus, (24) follows immediately from (25) and (27). The proof of Theorem 3.4 is complete. ∎

Remark 3.1. In view of Theorem 2.2, Theorems 3.1, 3.3, and 3.4 still hold for stationary ergodic martingale difference innovations provided $\sum_{k=0}^\infty|c_k|<\infty$ and $\sum_{k=0}^\infty c_k\ne0$. We omit the details here.

4. CONCLUSION

This paper derives two basic results on the invariance principle for the partial sum process of a linear process. The first result assumes that the innovations are i.i.d. random variables, but the usual requirement of absolutely summable coefficients for the linear process (i.e., $\sum_{k=0}^\infty|c_k|<\infty$) is weakened. This relaxation is interesting because some linear processes do not have absolutely summable coefficients. In particular, a linear process with $c_k=k^{-1}\ell(k)$, where $\sum_{k=1}^\infty k^{-1}\ell(k)=\infty$, is important because it can be expressed neither as a finite-order autoregressive moving average process nor as a fractional process. The second result is for the situation where the innovations form a martingale difference sequence. For this result, the commonly used assumption of equal variance is removed. This is of interest to researchers from a practical point of view.

We apply these general results to unit root testing and stationarity testing. It turns out that the limit distributions of the Dickey–Fuller test statistic and the KPSS test statistic still hold for the more general models under very weak conditions. This paper also shows that the "long-run variance," $\sigma^2$, can be consistently estimated by a nonparametric method with a lag-truncation parameter $l_n$ of order $o(n)$. In previous research, $l_n$ was usually assumed to be of order $o(n^{1/2})$. This provides more choice in the estimation of $\sigma^2$ and is theoretically interesting.


5. PROOFS OF MAIN RESULTS

5.1. Some Preliminary Lemmas

In this section, we provide some lemmas that will be needed in the proofs of the main results. Some of these lemmas are also of interest in their own right.

LEMMA 5.1. Let $\{\eta_k,k\ge0\}$ be a sequence of arbitrary random variables and $\{b_i,i\ge0\}$ a sequence of real numbers. Assume that

$$c_0^2+\sum_{k=1}^\infty kc_k^2<\infty,\qquad\sup_{n\ge1}\frac1n\sum_{i=0}^nb_i^2<\infty,$$

and that there exists a positive constant $A$ such that

$$E\Big(\sum_{k=0}^\infty c_{j+k}\eta_k\Big)^2\le A\sum_{k=0}^\infty c_{j+k}^2b_k^2,\qquad\text{for }j\ge0.\tag{28}$$

Then, as $n\to\infty$,

$$\frac{1}{\sqrt n}\max_{0\le m\le n}\Big|\sum_{j=0}^m\sum_{k=0}^\infty c_{j+k}\eta_k\Big|\to0,\quad\text{in probability.}\tag{29}$$

Proof. By using $E|Y|\le(EY^2)^{1/2}$ for any random variable $Y$, it follows from (28) that

$$E\max_{0\le m\le n}\Big|\sum_{j=0}^m\sum_{k=0}^\infty c_{j+k}\eta_k\Big|\le\sum_{j=0}^nE\Big|\sum_{k=0}^\infty c_{j+k}\eta_k\Big|\le A\sum_{j=0}^n\Big(\sum_{k=0}^\infty c_{j+k}^2b_k^2\Big)^{1/2}.\tag{30}$$

Put $a_j=\sum_{k=0}^\infty c_{j+k}^2b_k^2$. For any $0\le l\le m$, we have that

$$\sum_{j=l}^ma_j=\sum_{j=l}^m\sum_{k=0}^\infty c_{j+k}^2b_k^2=\sum_{k=l}^\infty c_k^2\sum_{j=l}^{\min\{k,m\}}b_{k-j}^2\le\Big(\sup_{k\ge1}\frac1k\sum_{i=0}^kb_i^2\Big)\sum_{k=l}^\infty(k+1)c_k^2.$$

This inequality implies that

$$\Big(\sum_{j=0}^na_j^{1/2}\Big)^2=\Big(\sum_{j=0}^{[\sqrt n]}a_j^{1/2}+\sum_{j=[\sqrt n]+1}^na_j^{1/2}\Big)^2\le2\Big(\sum_{j=0}^{[\sqrt n]}a_j^{1/2}\Big)^2+2\Big(\sum_{j=[\sqrt n]}^na_j^{1/2}\Big)^2$$
$$\le2\big([\sqrt n]+1\big)\sum_{j=0}^{[\sqrt n]}a_j+2n\sum_{j=[\sqrt n]}^na_j\le2\sup_{k\ge1}\Big(\frac1k\sum_{i=0}^kb_i^2\Big)\Big(\big([\sqrt n]+1\big)\sum_{k=0}^\infty(k+1)c_k^2+n\sum_{k=[\sqrt n]}^\infty(k+1)c_k^2\Big)=o(n).$$


Now (29) follows from Markov's inequality, (30), and the bound just established. The proof of Lemma 5.1 is complete. ∎

LEMMA 5.2. Let $\{\eta_n,\mathcal F_n,s\le n\le t\}$ be a martingale difference sequence. Then there exists a constant $K$ such that for any constant sequence $a_k$,

$$E\max_{s\le n\le t}\Big(\sum_{k=s}^na_k\eta_k\Big)^2\le K\sum_{k=s}^ta_k^2E\eta_k^2.\tag{31}$$

Proof. Apply the Doob and Burkholder inequalities (see, e.g., Hall and Heyde, 1980, pp. 15 and 23, respectively). ∎

LEMMA 5.3. Let $\{\eta_n,\mathcal F_n,-\infty<n<\infty\}$ be a martingale difference sequence satisfying $\sup_{k\ge1}(1/k)\sum_{i=-k}^kE\eta_i^2<\infty$. Assume that

$$\sum_{k=0}^\infty|c_k|<\infty\qquad\text{and}\qquad\sum_{k=1}^\infty kc_k^2<\infty.$$

Then, for any $\delta>0$,

$$\lim_{l\to\infty}\limsup_{n\to\infty}P\Big\{\max_{1\le m\le n}\Big|\sum_{j=1}^mu_j^{(l)}\Big|\ge\delta\sqrt n\Big\}=0,\tag{32}$$

where $u_j^{(l)}=\sum_{k=l+1}^\infty c_k\eta_{j-k}$ and $l\ge-1$.

Proof. We first note that $\sum_{k=0}^\infty c_k\eta_{j-k}<\infty$ a.s. for every fixed $j\ge1$; that is, $u_j^{(l)}$ is well defined. In fact, by applying Lemma 5.2, there exists a constant $K$ such that for any $j\le m\le m'$,

$$E\max_{m\le n\le m'}\Big(\sum_{k=m}^nc_k\eta_{j-k}\Big)^2\le K\sum_{k=m}^{m'}c_k^2E\eta_{j-k}^2\le K\Big(\sup_{k\ge1}\frac1k\sum_{i=-k}^kE\eta_i^2\Big)\sum_{k=m}^\infty(k+1)c_k^2.\tag{33}$$

From (33) and Markov's inequality, it follows that for any $\delta>0$, as $n\to\infty$,

$$P\Big(\sup_{i\ge1}\Big|\sum_{k=n}^{n+i}c_k\eta_{j-k}\Big|\ge\delta\Big)\le2K\delta^{-2}\Big(\sup_{k\ge1}\frac1k\sum_{i=-k}^kE\eta_i^2\Big)\sum_{k=n}^\infty kc_k^2\to0.$$

So we conclude by the Cauchy criterion that $\sum_{k=0}^nc_k\eta_{j-k}$ converges almost surely; that is, for every fixed $j\ge1$, $\sum_{k=0}^\infty c_k\eta_{j-k}<\infty$, a.s.

Since $\sum_{k=0}^\infty c_k\eta_{j-k}<\infty$ a.s., it is easy to show (with the convention $\sum_{j=1}^l=0$ for $l\le0$) that for $m\ge l+1$,

$$\sum_{j=1}^mu_j^{(l)}=\sum_{j=1}^m\sum_{k=-\infty}^{j-(l+1)}c_{j-k}\eta_k=\sum_{k=1}^{m-(l+1)}\eta_k\sum_{j=l+1}^{m-k}c_j+\sum_{j=1}^m\sum_{k=0}^\infty c_{j+k}\eta_{-k}-\sum_{j=1}^l\sum_{k=0}^{l-j}c_{j+k}\eta_{-k}\equiv D_{m1}^{(l)}+D_{m2}^{(l)}+D_{m3}^{(l)},\quad\text{say},\tag{34}$$

where $D_{m3}^{(l)}\equiv0$ for $l=0$ and $l=-1$. Now (32) follows if, for any $\delta>0$,

$$\lim_{l\to\infty}\limsup_{n\to\infty}P\Big\{\max_{1\le m\le n}\big|D_{mi}^{(l)}\big|\ge\delta\sqrt n\Big\}=0,\qquad i=1,2,3.\tag{35}$$

For every fixed $j\ge0$, it follows from Fatou's lemma and Lemma 5.2 that

$$E\Big(\sum_{k=0}^\infty c_{j+k}\eta_{-k}\Big)^2=E\lim_{n\to\infty}\Big(\sum_{k=j}^nc_k\eta_{j-k}\Big)^2\le\liminf_{n\to\infty}E\Big(\sum_{k=j}^nc_k\eta_{j-k}\Big)^2\le K\sum_{k=j}^\infty c_k^2E\eta_{j-k}^2=K\sum_{k=0}^\infty c_{j+k}^2E\eta_{-k}^2.$$

By applying Lemma 5.1 (choosing $b_k^2=E\eta_{-k}^2$), (35) holds for $i=2$.

To prove (35) for $i=1$, put $S_k=\sum_{i=1}^k\eta_i$ and $S_0=0$. We obtain that

$$D_{m1}^{(l)}=\sum_{k=1}^{m-(l+1)}(S_k-S_{k-1})\sum_{j=l+1}^{m-k}c_j=\sum_{k=1}^{m-(l+1)}c_{m-k}S_k,$$

and hence

$$\max_{1\le m\le n}\big|D_{m1}^{(l)}\big|\le\Big(\sum_{k=l+1}^n|c_k|\Big)\max_{1\le m\le n}|S_m|.$$

Again, it follows from Markov's inequality and Lemma 5.2 that

$$P\Big\{\max_{1\le m\le n}\big|D_{m1}^{(l)}\big|\ge\delta\sqrt n\Big\}\le\delta^{-2}n^{-1}\Big(\sum_{k=l+1}^n|c_k|\Big)^2E\max_{1\le m\le n}S_m^2\le K\delta^{-2}\Big(\sum_{k=l+1}^\infty|c_k|\Big)^2\sup_{k\ge1}\frac1k\sum_{i=-k}^kE\eta_i^2.$$

Because $\sum_{k=0}^\infty|c_k|<\infty$, we conclude that (35) holds for $i=1$. That (35) holds for $i=3$ is obvious, and the details are omitted. The proof of Lemma 5.3 is complete. ∎


LEMMA 5.4. Let $\{\eta_k,k=0,\pm1,\pm2,\ldots\}$ be a sequence of arbitrary random variables. Assume that, as $n\to\infty$, the positive constants $d_n\to\infty$ and

$$\frac{1}{d_n^2}\sum_{k=-n}^nE\eta_k^2I(|\eta_k|\ge\delta d_n)\to0,\qquad\text{for any }\delta>0.$$

Then

$$\frac{1}{d_n}\max_{-n\le k\le n}|\eta_k|\to0,\quad\text{in probability.}$$

Proof. By applying (36) in Phillips and Solo (1992), we obtain that

$$P\Big(\frac{1}{d_n}\max_{-n\le k\le n}|\eta_k|\ge\delta\Big)=P\Big(\frac{1}{d_n^2}\sum_{k=-n}^n\eta_k^2I(|\eta_k|\ge\delta d_n)\ge\delta^2\Big).$$

Using Markov's inequality, the result follows. ∎

LEMMA 5.5. Let $\{\eta_k,k=0,\pm1,\pm2,\ldots\}$ be a sequence of arbitrary random variables. If $\{\eta_k^2\}$ is uniformly integrable, then

(i) for any $\delta>0$, $(1/n)\sum_{k=-n}^nE\eta_k^2I(|\eta_k|\ge\delta\sqrt n)\to0$;
(ii) $(1/\sqrt n)\max_{-n\le k\le n}|\eta_k|\to0$, in probability;
(iii) $(1/n)\sum_{k=1}^n\big(\eta_k^2-E(\eta_k^2\mid\mathcal F^*_{k-1})\big)\to0$, in probability, where $\mathcal F^*_k$ is the $\sigma$-field generated by $\{\eta_j,j\le k\}$.

By the definition of uniform integrability, it follows that $\sup_kE\eta_k^2I(|\eta_k|\ge\delta_n)\to0$ for any $\delta_n\to\infty$. Therefore, the proof of Lemma 5.5 is straightforward, and the details are omitted.

5.2. Proofs of Results

In this section, we provide the proofs of the main results.

Proof of Theorem 2.1. According to (34) (with $l=-1$), for any $0\le t\le1$,

$$\sum_{j=1}^{k_n(t)}X_j=\sum_{k=1}^{k_n(t)}e_k\sum_{j=0}^{k_n(t)-k}c_j+\sum_{j=1}^{k_n(t)}\sum_{k=0}^\infty c_{j+k}e_{-k}.$$

Similar to (30), we have that

$$E\sup_{0\le t\le1}\Big|\sum_{j=1}^{k_n(t)}\sum_{k=0}^\infty c_{j+k}e_{-k}\Big|\le A\sum_{j=1}^n\Big(\sum_{k=j}^\infty c_k^2\Big)^{1/2}=o(s_n).$$

This, together with Markov's inequality, implies that

$$\frac{1}{s_n}\sup_{0\le t\le1}\Big|\sum_{j=1}^{k_n(t)}\sum_{k=0}^\infty c_{j+k}e_{-k}\Big|\to0,\quad\text{in probability.}$$


On the other hand, it is well known (noting that the $e_k$ are i.i.d. random variables) that for any $0\le t\le1$,

$$\sum_{k=1}^{k_n(t)}e_k\sum_{j=0}^{k_n(t)-k}c_j\stackrel{d}{=}\sum_{k=1}^{k_n(t)}e_k\sum_{j=0}^{k-1}c_j,\tag{36}$$

where $\stackrel{d}{=}$ denotes equality in distribution. Therefore, by applying Theorem 4.1 of Billingsley (1968, p. 25), (6) follows if

$$\frac{1}{s_n}\sum_{k=1}^{k_n(t)}e_k\sum_{j=0}^{k-1}c_j\Rightarrow W(t),\qquad0\le t\le1.\tag{37}$$

Recall that $\psi_k=\sum_{j=0}^{k-1}c_j$. Because $\max_{1\le k\le n}|\psi_k|/s_n\to0$, we see that for any $\delta>0$,

$$\frac{1}{s_n^2}\sum_{k=1}^n\psi_k^2Ee_k^2I(|\psi_ke_k|\ge\delta s_n)\le Ee_1^2I\Big(|e_1|\ge\delta s_n\big/\max_{1\le k\le n}|\psi_k|\Big)\to0.\tag{38}$$

It follows from Lemma 5.4 that

$$\frac{1}{s_n}\max_{1\le k\le n}|\psi_ke_k|\to0,\quad\text{in probability.}\tag{39}$$

In view of (38) and (39), (37) follows from Prokhorov's theorem (see Rao, 1984, p. 343). This completes the proof of (6).

If $c_k=k^{-1}\ell(k)$, where the positive function $\ell(k)$ is slowly varying at infinity, it is easy to check that $\sum_{k=1}^nk^{-1}\ell(k)$ is still slowly varying at infinity. When $\sum_{k=1}^\infty k^{-1}\ell(k)=\infty$, we obtain (see Bingham et al., 1987, p. 26) that

$$s_n^2=\sum_{j=1}^n\Big(\sum_{k=0}^{j-1}c_k\Big)^2\sim n\Big(\sum_{k=1}^nk^{-1}\ell(k)\Big)^2,$$
$$\sum_{j=0}^n\Big(\sum_{k=j}^\infty k^{-2}\ell^2(k)\Big)^{1/2}\sim\sum_{j=1}^nj^{-1/2}\ell(j)\sim2n^{1/2}\ell(n)=o(s_n).$$

Hence (7) follows from (6).

If $0<|\sum_{k=0}^\infty c_k|<\infty$ and $\sum_{k=1}^\infty kc_k^2<\infty$, then by applying Lemma 5.1 and an argument similar to the proof of (6), it suffices to show that

$$\frac{1}{\sqrt n}\sum_{k=1}^{[nt]}e_k\sum_{j=0}^{k-1}c_j\Rightarrow\Big(\sum_{j=0}^\infty c_j\Big)W(t),\qquad0\le t\le1.\tag{40}$$

By noting that

$$\sum_{k=1}^{[nt]}e_k\sum_{j=0}^{k-1}c_j=\Big(\sum_{j=0}^\infty c_j\Big)\sum_{k=1}^{[nt]}e_k-\sum_{k=1}^{[nt]}e_k\sum_{j=k}^\infty c_j,$$


(40) follows from Donsker's theorem (see Billingsley, 1968, p. 137) and the fact that, as $n\to\infty$,

$$P\Big(\sup_{0\le t\le1}\Big|\sum_{k=1}^{[nt]}e_k\sum_{j=k}^\infty c_j\Big|\ge\delta\sqrt n\Big)\le C(\delta^2n)^{-1}E\Big(\sum_{k=1}^ne_k\sum_{j=k}^\infty c_j\Big)^2\le\frac{C_1}{n}\sum_{k=1}^n\Big(\sum_{j=k}^\infty c_j\Big)^2\to0,$$

where we use the estimate $\sum_{j=k}^\infty c_j\to0$ as $k\to\infty$.

If $\sum_{k=0}^\infty|c_k|<\infty$ and $\sum_{k=0}^\infty c_k\ne0$, the result follows from Hannan (1979). The proof of Theorem 2.1 is complete. ∎

Proof of Theorem 2.2. Generally speaking, (36) fails to hold for martingale differences. To prove Theorem 2.2, we need a different method. For every fixed $l\ge1$, put

$$Z_{1j}^{(l)}=\sum_{k=0}^lc_ke_{j-k}\qquad\text{and}\qquad Z_{2j}^{(l)}=\sum_{k=l+1}^\infty c_ke_{j-k}.$$

From Fuller (1996, p. 320), we obtain that for any $m\ge1$,

$$\sum_{j=1}^mZ_{1j}^{(l)}=\sum_{j=1}^m\sum_{k=0}^lc_ke_{j-k}=\sum_{k=0}^lc_k\sum_{j=1}^me_j+\sum_{s=1}^le_{1-s}\sum_{j=s}^lc_j-\sum_{s=0}^{l-1}e_{m-s}\sum_{j=s+1}^lc_j\equiv\sum_{k=0}^lc_k\sum_{j=1}^me_j+R(m,l),\quad\text{say.}$$

Therefore, it follows that for every fixed $l\ge1$,

$$\frac{1}{s_n^*}\sum_{j=1}^{k_n^*(t)}X_j=\Big(\sum_{k=0}^lc_k\Big)\frac{1}{s_n^*}\sum_{j=1}^{k_n^*(t)}e_j+\frac{1}{s_n^*}R(k_n^*(t),l)+\frac{1}{s_n^*}\sum_{j=1}^{k_n^*(t)}Z_{2j}^{(l)}.\tag{41}$$

Noting that $\sum_{k=0}^lc_k\to b_0$ as $l\to\infty$ and that there exist positive constants $A_1$ and $A_2$ such that $A_1n\le s_n^{*2}\le A_2n$, by applying Theorem 4.1 of Billingsley (1968, p. 25) we only need to show that for any $\delta>0$,

$$\lim_{l\to\infty}\limsup_{n\to\infty}P\Big\{\sup_{0\le t\le1}\Big|\sum_{j=1}^{k_n^*(t)}Z_{2j}^{(l)}\Big|\ge\delta\sqrt n\Big\}=0;\tag{42}$$

$$\limsup_{n\to\infty}P\Big\{\sup_{0\le t\le1}|R(k_n^*(t),l)|\ge\delta\sqrt n\Big\}=0\tag{43}$$


for every fixed $l\ge1$; and

$$\Big(\sum_{j=1}^nEe_j^2\Big)^{-1/2}\sum_{j=1}^{k_n^*(t)}e_j\Rightarrow W(t),\qquad0\le t\le1.\tag{44}$$

In fact, (42) follows from Lemma 5.3 because $\{e_n,\mathcal F_n,-\infty<n<\infty\}$ is a martingale difference sequence. In view of Lemma 5.4, we have that

$$\frac{1}{\sqrt n}\max_{-n\le j\le n}|e_j|\to0,\quad\text{in probability.}\tag{45}$$

By (45), (43) holds because $\sum_{k=0}^\infty|c_k|<\infty$ and hence

$$\frac{1}{\sqrt n}\sup_{0\le t\le1}|R(k_n^*(t),l)|\le\frac{1}{\sqrt n}\max_{-l\le j\le n}|e_j|\sum_{s=0}^l\Big(\sum_{j=s}^l|c_j|+\sum_{j=s+1}^l|c_j|\Big)\to0,\quad\text{in probability.}$$

Finally, (44) follows from Brown (1971) (see also Tanaka, 1996, p. 80) by using (10) and (11). The proof of Theorem 2.2 is complete. ∎

REFERENCES

Billingsley, P. (1968) Convergence of Probability Measures. New York: Wiley.
Bingham, N.H., C.M. Goldie, & J.L. Teugels (1987) Regular Variation. Cambridge, UK: Cambridge University Press.
Brockwell, P.J. & R.A. Davis (1987) Time Series: Theory and Methods. New York: Springer-Verlag.
Brown, B. (1971) Martingale central limit theorems. Annals of Mathematical Statistics 42, 59–66.
Chow, Y.S. & H. Teicher (1988) Probability Theory, 2nd ed. New York: Springer-Verlag.
Davydov, Yu.A. (1970) The invariance principle for stationary processes. Theory of Probability and Its Applications 15, 487–498.
Dickey, D.A. & W.A. Fuller (1979) Distribution of the estimators for autoregressive time series with a unit root. Journal of the American Statistical Association 74, 427–431.
Feller, W. (1971) An Introduction to Probability Theory and Its Applications, 2nd ed. New York: Wiley.
Fuller, W.A. (1996) Introduction to Statistical Time Series, 2nd ed. New York: Wiley.
Hall, P. (1992) Convergence rates in the central limit theorem for means of autoregressive and moving average sequences. Stochastic Processes and Their Applications 43, 115–131.
Hall, P. & C.C. Heyde (1980) Martingale Limit Theory and Its Application. New York: Academic Press.
Hannan, E.J. (1979) The central limit theorem for time series regression. Stochastic Processes and Their Applications 9, 281–289.
Kwiatkowski, D., P.C.B. Phillips, P. Schmidt, & Y. Shin (1992) Testing the null hypothesis of stationarity against the alternative of a unit root: How sure are we that economic time series have a unit root? Journal of Econometrics 54, 159–178.
Liu, M. (1998) Asymptotics of nonstationary fractional integrated series. Econometric Theory 14, 641–662.
Marinucci, D. & P.M. Robinson (1998) Alternative Forms of Fractional Brownian Motion. Discussion paper 354, London School of Economics and Political Science.
McLeish, D.L. (1975) Invariance principles for dependent variables. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete 32, 165–178.
McLeish, D.L. (1977) On the invariance principle for nonstationary mixingales. Annals of Probability 5, 616–621.
Peligrad, M. (1986) Recent advances in the central limit theorem and its weak invariance principle for mixing sequences of random variables. Progress in Probability and Statistics 11, 193–223.
Peligrad, M. (1998) Maximum of partial sums and an invariance principle for a class of weak dependent random variables. Proceedings of the American Mathematical Society 126, 1181–1189.
Phillips, P.C.B. (1987) Time series regression with a unit root. Econometrica 55, 277–302.
Phillips, P.C.B. & P. Perron (1988) Testing for a unit root in time series regression. Biometrika 75, 335–346.
Phillips, P.C.B. & V. Solo (1992) Asymptotics for linear processes. Annals of Statistics 20, 971–1001.
Phillips, P.C.B. & Z. Xiao (1998) A Primer on Unit Root Testing. Manuscript.
Rao, M.M. (1984) Probability Theory with Applications. New York: Academic Press.
Stadtmüller, U. & R. Trautner (1985) Asymptotic behaviour of discrete linear processes. Journal of Time Series Analysis 6(2), 97–108.
Stout, W.F. (1974) Almost Sure Convergence. New York: Academic Press.
Tanaka, K. (1996) Time Series Analysis: Nonstationary and Noninvertible Distribution Theory. New York: Wiley.
Truong-Van, B. (1995) Invariance principles for semi-stationary sequences of linear processes and applications to ARMA processes. Stochastic Processes and Their Applications 58, 155–172.
Yokoyama, R. (1995) On the central limit theorem and law of the iterated logarithm for stationary processes with applications to linear processes. Stochastic Processes and Their Applications 59, 343–351.

APPENDIX

Proof of (16). Recalling (4), we have (noting that $Ee_j=0$ for all $j$) that

$$A_n\equiv\frac1nE\Big|\sum_{r=1}^{l_n}\sum_{t=r+1}^n(X_tX_{t-r}-EX_tX_{t-r})\Big|=\frac1nE\Big|\sum_{k=0}^\infty\sum_{s=0}^\infty c_kc_s\sum_{r=1}^{l_n}\sum_{t=r+1}^n(e_{t-k}e_{t-r-s}-Ee_{t-k}e_{t-r-s})\Big|$$
$$\le\frac1n\sum_{k=0}^\infty\sum_{s=0}^\infty|c_kc_s|\,E\Big|\sum_{t=\lambda}^n(e_{t-k}^2-Ee_{t-k}^2)\Big|+\frac1n\sum_{k=0}^\infty\sum_{s=0}^\infty|c_kc_s|\,E\Big|\sum_{\substack{r=1\\ r\ne k-s}}^{l_n}\sum_{t=r+1}^ne_{t-k}e_{t-r-s}\Big|,$$


where $\lambda=\max\{2,k-s+1\}$. By Markov's inequality, it suffices to show that $A_n\to0$ as $n\to\infty$. This follows from

$$B_n\equiv\sup_{k,s\ge0}\frac1nE\Big|\sum_{\substack{r=1\\ r\ne k-s}}^{l_n}\sum_{t=r+1}^ne_{t-k}e_{t-r-s}\Big|\to0,\tag{A.1}$$

where $l_n=o(n)$, and

$$C_n\equiv\sup_{k,s\ge0}\frac1nE\Big|\sum_{t=\lambda}^n(e_{t-k}^2-Ee_{t-k}^2)\Big|\to0,\tag{A.2}$$

where $\lambda=\max\{2,k-s+1\}$.

Because the $e_k$ are i.i.d. random variables with $Ee_0=0$ and $Ee_0^2<\infty$, it follows that

$$E\Big(\sum_{\substack{r=1\\ r\ne k-s}}^{l_n}\sum_{t=r+1}^ne_{t-k}e_{t-r-s}\Big)^2=\sum_{\substack{r=1\\ r\ne k-s}}^{l_n}\sum_{t=r+1}^nEe_{t-k}^2e_{t-r-s}^2\le nl_n(Ee_0^2)^2.$$

Hence, (A.1) follows because, as $n\to\infty$,

$$B_n\le\frac1n\sup_{k,s\ge0}\Big(E\Big(\sum_{\substack{r=1\\ r\ne k-s}}^{l_n}\sum_{t=r+1}^ne_{t-k}e_{t-r-s}\Big)^2\Big)^{1/2}\le(l_n/n)^{1/2}Ee_0^2\to0.$$

To prove (A.2), for every $j$ let

$$e_{1,j}^*=e_j^2I(|e_j|\le n^{1/4})-Ee_j^2I(|e_j|\le n^{1/4})\qquad\text{and}\qquad e_{2,j}^*=e_j^2I(|e_j|>n^{1/4})-Ee_j^2I(|e_j|>n^{1/4}).$$

After some algebra, we obtain

$$I_{n,k,s}\equiv E\Big(\sum_{t=\lambda}^ne_{1,t-k}^*\Big)^4\le A\big\{n^2\big(Ee_0^4I(|e_0|\le n^{1/4})\big)^2+nEe_0^8I(|e_0|\le n^{1/4})\big\}.\tag{A.3}$$

Relation (A.3) implies that, as $n\to\infty$,

$$H_{n1}\equiv\sup_{k,s\ge0}\frac1nE\Big|\sum_{t=\lambda}^ne_{1,t-k}^*\Big|\le\frac1n\sup_{k,s\ge0}(I_{n,k,s})^{1/4}\le A\big\{n^{-1/4}(Ee_0^2)^{1/2}+n^{-3/8}(Ee_0^2)^{1/4}\big\}\to0,$$

where we use the estimate $E|X|\le(EX^4)^{1/4}$ for any random variable $X$. Therefore, it follows that, as $n\to\infty$,

$$C_n=\sup_{k,s\ge0}\frac1nE\Big|\sum_{t=\lambda}^n\big(e_{1,t-k}^*+e_{2,t-k}^*\big)\Big|\le H_{n1}+2Ee_0^2I(|e_0|>n^{1/4})\to0.$$

The proof of (16) is complete. ∎


Proof of (17). Because $Ee_je_k=0$ for $j\ne k$, we have that

$$\frac1n\sum_{r=l_n+1}^{n-1}\sum_{t=r+1}^nEX_tX_{t-r}=\frac1n\sum_{k=0}^\infty\sum_{s=0}^\infty c_kc_s\sum_{r=l_n+1}^{n-1}\sum_{t=r+1}^nEe_{t-k}e_{t-r-s}=\frac1n\sum_{s=0}^\infty\sum_{k=s+l_n+1}^\infty c_kc_s\sum_{t=k-s+1}^nEe_{t-k}^2.$$

Therefore, as $n\to\infty$,

$$\frac1n\Big|\sum_{r=l_n+1}^{n-1}\sum_{t=r+1}^nEX_tX_{t-r}\Big|\le Ee_1^2\Big(\sum_{k=l_n}^\infty|c_k|\Big)\Big(\sum_{s=0}^\infty|c_s|\Big)\to0.$$

The proof of (17) is complete. ∎
