2) a_1 X^{N_1} + a_2 X^{N_2} and b_1 X^{N_1} + b_2 X^{N_2} are C_2(ε + δ)-independent; 3) E X^{N_j} = 0. Here C_1, C_2 are absolute constants and M_r depends only ...
Lithuanian Mathematical Journal, Vol. 36, No. 3, 1996

ABOUT ALMOST INDEPENDENT LINEAR STATISTICS

R. Yanushkevichius

One of the first results in the area of characterizations by the independence of statistics was S. Bernshtein's theorem [1], proved in 1941. According to this theorem, if independent identically distributed random variables X_1 and X_2 are such that their sum X_1 + X_2 and difference X_1 - X_2 are independent, then X_1 and X_2 are normal random variables. In 1948 B. Gnedenko [2] generalized this result in the following way: if the linear statistics L_1 = a_1 X_1 + a_2 X_2 and L_2 = b_1 X_1 + b_2 X_2, where a_1 b_1 ≠ 0 and a_2 b_2 ≠ 0, are independent, then X_1 and X_2 are normal. By definition, random variables X and Y are called (ρ, ε)-independent (or simply ε-independent) if

\[
\rho\bigl(F_{(X,Y)},\; F_X F_Y\bigr) = \sup_{x,y}\,\bigl|F_{(X,Y)}(x, y) - F_X(x)\, F_Y(y)\bigr| \le \varepsilon .
\]
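The Gaussian case of Bernshtein's condition can be checked by an elementary computation; the following is a minimal verification, offered as an illustrative aside, assuming X_1 and X_2 are i.i.d. N(a, σ²):

\[
\operatorname{Cov}(X_1 + X_2,\; X_1 - X_2) = \operatorname{Var} X_1 - \operatorname{Var} X_2 = \sigma^2 - \sigma^2 = 0 ,
\]

and since (X_1 + X_2, X_1 - X_2) is a linear image of the Gaussian vector (X_1, X_2), it is jointly Gaussian, so zero covariance already yields independence. In the terminology above, such a pair is (ρ, 0)-independent.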

… > 2h_k) + 4δ. Hence, taking into consideration the ε-independence of the statistics X_1' + X_2' and X_1' - θX_2' and the relation of type (1), we have

\[
\begin{aligned}
P(|X_1| > h_{k+1})\, P(|\theta X_2'| \le h_{k+1} - h_k)
 &\le P(|X_1' + X_2'| \ge h_{k+1} - h_k)\, P(|X_1' - \theta X_2'| \ge h_{k+1} - h_k) + 4(\varepsilon + \delta) \\
 &\le P\bigl(\{|X_1'| > h_k\} \cup \{|X_2'| > h_k\}\bigr)\, P\bigl(\{|X_1'| > h_k\} \cup \{|X_2'| > h_k\}\bigr) + 4(\varepsilon + \delta) \\
 &\le \bigl(P(|X_1| > h_k) + P(|X_2| > h_k)\bigr)^2 + 4(\varepsilon + \delta) .
\end{aligned}
\tag{7}
\]

Consequently, since X_2 ∈ 𝒥 and P(|θX_2'| ≤ h_{k+1} - h_k) ≥ 1 - p,

\[
P(|X_1| > h_{k+1}) \le \bigl(P(|X_1| > h_k) + P(|X_2| > h_k)\bigr)^2 / (1 - p) + 4(\varepsilon + \delta)/(1 - p) .
\tag{8}
\]

Now we estimate P(|X_2| > h_{k+1}):

\[
\begin{aligned}
P(|X_2| > h_{k+1})\, P(|X_1'| \le |\theta| h_{k+1} - h_k)
 &\le P(|X_1' + X_2'| \ge h_{k+1} - h_k)\, P(|X_1' - \theta X_2'| \ge h_{k+1} - h_k) + 4(\varepsilon + \delta) \\
 &\le \bigl(P(|X_1| > h_k) + P(|X_2| > h_k)\bigr)^2 + 4(\varepsilon + \delta) ,
\end{aligned}
\]

whence

\[
P(|X_2| > h_{k+1}) \le S^2(h_k)/(1 - p) + 4(\varepsilon + \delta)/(1 - p) ,
\tag{9}
\]

where S(u) = P(|X_1| > u) + P(|X_2| > u). So we get from (8) and (9) that

\[
S(h_{k+1}) \le 2 S^2(h_k)/(1 - p) + 8(\varepsilon + \delta)/(1 - p) .
\]

Further,

\[
P(|X_1'| > 1) \le P\bigl(|X_1| > \min(|a_1/a_2|, |b_1/b_2|)\bigr) = P(|X_1| > m) < p ,
\]
\[
P(|X_2'| > 1) = P(|X_2| > |a_1/a_2|) \le P\bigl(|X_2| > \min(|b_1/b_2|, |a_1/a_2|)\bigr) = P(|X_2| > m) < p .
\]

Consequently, S = S(1) < 2p, and

\[
P\bigl(|X_1| > 3^{k+1}\bigr) \ge P\bigl(|X_1| > 3 (\log_2 \dots)^{\log_2 3}\bigr) + \dots
\]
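To see how a quadratic recursion of this type behaves, here is a minimal sketch under the simplifying assumption ε = δ = 0, with the auxiliary notation T_k = 2S(h_k)/(1 - p) and h_0 denoting the initial level (both conveniences introduced only for this remark):

\[
T_{k+1} = \frac{2}{1-p}\, S(h_{k+1}) \le \Bigl(\frac{2}{1-p}\, S(h_k)\Bigr)^{2} = T_k^{2},
\qquad\text{so that}\qquad
S(h_k) \le \frac{1-p}{2} \Bigl(\frac{2}{1-p}\, S(h_0)\Bigr)^{2^{k}} .
\]

Thus the tails decay doubly exponentially in k as soon as 2S(h_0)/(1 - p) < 1; for instance, if S(h_0) < 2p, this holds whenever p < 1/5. The terms 8(ε + δ)/(1 - p) omitted in this sketch contribute a correction of order ε + δ at each step.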

On the other hand, S(h_{k+1}) … since P(|X^{N_j}| > x) = 0 for x ≥ N_j.

Putting s = max{[log_3 N_1], [log_3 N_2]} and making use of estimate (11), we have

E|X^{N_j}|^r ≤ …
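A brief remark on the role of s, assuming (as the surviving fragments suggest) that the levels are h_k = 3^k and that X^{N_j} denotes the variable truncated at level N_j: since [·] is the integer part,

\[
3^{[\log_3 N_j]} \le N_j < 3^{[\log_3 N_j] + 1} \le 3^{\, s + 1}, \qquad j = 1, 2,
\]

so h_{s+1} exceeds both truncation levels; beyond this level the truncated variables have no mass, and the tail sums entering the moment estimate therefore run only over k ≤ s.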