COMMUN. STATIST.-STOCHASTIC MODELS, 10(2), 499-517 (1994)

EXPLICIT STATIONARY DISTRIBUTIONS FOR SOME GALTON-WATSON PROCESSES WITH IMMIGRATION

Emad-Eldin A.A. ALY and Nadjib BOUZAR

Department of Statistics and Applied Probability

University of Alberta

Edmonton, Alberta, Canada T6G 2G1

Key words: branching process, stationarity, self-decomposability, stability, Poisson geometric, backward regression

ABSTRACT

The purpose of this paper is to generalize the Z+-valued AR(1) model by introducing a class of Galton-Watson processes with stationary immigration that allows a more general branching distribution. Distributional and regression properties of these processes are obtained, their connection with discrete self-decomposability and stability is established, and stationary processes with Poisson geometric, geometric and negative binomial marginal distributions are constructed explicitly.

Regression and correlation properties of stationary processes of type (1.2) are similar to those of the stationary Gaussian AR(1) process (cf. Al-Osh and Alzaid [1] and Alzaid and Al-Osh [3]). Stationary AR(1) processes of type (1.2) with binomial, Poisson, and negative binomial marginals are given in McKenzie [11]. It is clear from (1.1) and (1.2) that Z+-valued AR(1) processes of type (1.2) are Galton-Watson processes with stationary immigration (GWSI) in which the branching occurs according to a Bernoulli (α) distribution (cf. Athreya and Ney [4]). The purpose of this paper is to generalize the model of (1.2) by introducing a class of GWSI processes that has a more general branching distribution. This class contains (1.2) as well as the model of Al-Osh and Aly [2].

The paper is organized as follows. In Section 2 we define 𝒫-GWSI processes and obtain their distributional and regression properties. Stationary 𝒫-GWSI processes are studied and an analogy with stationary Gaussian AR(1) processes is made. Some results that establish the connection between stationary 𝒫-GWSI processes and discrete self-decomposability and stability (cf. van Harn et al. [9]) are given. In Section 3 a stationary 𝒫-GWSI process with a Poisson geometric marginal is constructed and studied. The stationary geometric and negative binomial 𝒫-GWSI processes are presented in Section 4. Finally, in Section 5 stationary 𝒫-GWSI processes for which the property of linear backward regression holds are identified. The notation ᾱ = 1 - α for α ∈ [0,1] will be used throughout the paper.

A GWSI process (X_t, t ≥ 0) evolves according to X_t = Σ_{i=1}^{X_{t-1}} Y_i(t-1) + ε_t, t ≥ 1, where the (Y_i(t), t ≥ 0, i ≥ 1) are iid Z+-valued rv's, (ε_t, t ≥ 1) is a sequence of iid Z+-valued rv's, and the sequences (Y_i(t), t ≥ 0, i ≥ 1) and (ε_t, t ≥ 1) are independent. The branching is characterized by the common probability density function (pdf), (p_k, k ≥ 0), of the offspring Y's, with probability generating function (pgf) f. The immigration at each t ≥ 1 is described by the common pdf, (q_k, k ≥ 0), of the ε's, with pgf h. If g_t denotes the pgf of X_t, then

(2.2.a)    g_{t+1}(s) = g_t(f(s)) h(s),    |s| ≤ 1.

Let

(2.2.b)    m = f'(1) = Σ_{k≥0} k p_k.

We will refer to the following result, which can be found in Athreya and Ney [4], page 264, and Foster and Williamson [7].

Proposition 2.1. Let (X_t, t ≥ 0) be a GWSI process such that m ≤ 1. Then (X_t, t ≥ 0) admits a proper limit distribution as t → ∞ if and only if

(2.3)    ∫_0^1 [1 - h(s)] / [f(s) - s] ds < ∞.

If m = 1 and f''(1) < ∞, the limit distribution, when it exists, is degenerate at zero. If 0 < m < 1, then (2.3) is equivalent to

(2.4)    E(log⁺ ε_1) = Σ_{k≥1} (log k) q_k < ∞.
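The defining recursion and the pgf relation (2.2.a) are easy to check numerically. The following sketch (Python; the Bernoulli(α) branching and Poisson immigration used here are illustrative placeholders, not the offspring law introduced below) compares a Monte Carlo estimate of g_t(s) with the value obtained by iterating (2.2.a).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_gwsi(x0, offspring, immigration, T):
    """One path of X_t = sum_{i=1}^{X_{t-1}} Y_i(t-1) + eps_t, t = 1,...,T."""
    x = x0
    for _ in range(T):
        x = offspring(x) + immigration()
    return x

# Illustrative choices only: Bernoulli(alpha) branching, Poisson(lam) immigration.
alpha, lam = 0.6, 1.0
offspring = lambda n: rng.binomial(n, alpha)       # sum of n iid Bernoulli(alpha) offspring
immigration = lambda: rng.poisson(lam)

f = lambda s: 1.0 - alpha + alpha * s              # offspring pgf
h = lambda s: np.exp(-lam * (1.0 - s))             # immigration pgf

T, reps, s = 5, 20_000, 0.5
samples = np.array([simulate_gwsi(1, offspring, immigration, T) for _ in range(reps)])

g = lambda s: s                                    # g_0(s) = s, since X_0 = 1
for _ in range(T):
    g = (lambda prev: (lambda s: prev(f(s)) * h(s)))(g)   # iterate (2.2.a)

print(np.mean(s ** samples), g(s))                 # the two estimates of g_T(s) should agree
```

The same scheme applies to any choice of offspring and immigration laws, since (2.2.a) only uses their pgf's.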


Throughout the remainder of this paper we will consider GWSI processes whose offspring pdf P = (p_k, k ≥ 0) is the geometric sequence (see Harris [10], page 9)

(2.5)    p_k = ᾱ,                          if k = 0,
         p_k = α(1 - ᾱθ)(ᾱθ)^{k-1},        if k ≥ 1,

where α ∈ (0,1) and θ ∈ [0,1]. For simplicity, we will refer to these processes as 𝒫-GWSI processes. Note that when θ = 0, (2.5) becomes the Bernoulli (α) distribution. The pgf of (p_k, k ≥ 0) is the fractional linear function

(2.6)    f(s) = f_{α,θ}(s) = 1 - α(1 - s)/(1 - ᾱθs),    |s| ≤ 1.

Moreover, m = f'_{α,θ}(1) = α/(1 - ᾱθ), so that m < 1 if and only if θ < 1.

Proposition 2.2. For n ≥ 1, the pdf (p_k^{(n)}, k ≥ 0) with pgf (f_{α,θ}(s))^n is given by

(2.8)    p_0^{(n)} = ᾱ^n,
         p_k^{(n)} = Σ_{j=1}^{min(n,k)} C(n,j) C(k-1,j-1) α^j ᾱ^{n-j} (1 - ᾱθ)^j (ᾱθ)^{k-j},    k ≥ 1,

where C(·,·) denotes a binomial coefficient. When θ = 0, (2.8) reduces to the binomial (n, α) distribution.

Proof. The pgf of (p_k^{(n)}, k ≥ 0) is (f_{α,θ}(s))^n, which is also the pgf of

Y = Σ_{i=1}^{N} Y_i,

where N is binomial (n, α), the Y_i (i ≥ 1) are iid (truncated at zero) geometric with parameter ᾱθ, and N and (Y_i, i ≥ 1) are independent. A standard argument then implies (2.8).
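A quick numerical check of this representation (a sketch, assuming the parametrization of (2.5)-(2.6) as reconstructed above): draw Y = Σ_{i=1}^{N} Y_i with N binomial (n, α) and Y_i zero-truncated geometric (ᾱθ), and compare the empirical pgf with (f_{α,θ}(s))^n.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, theta, n = 0.4, 0.7, 3
gam = (1.0 - alpha) * theta              # parameter of the truncated geometric component

def f(s):
    """Fractional linear offspring pgf f_{alpha,theta}(s) of (2.6)."""
    return 1.0 - alpha * (1.0 - s) / (1.0 - gam * s)

def sample_pn(size):
    """Draw from (p_k^{(n)}, k >= 0), the n-fold convolution of (2.5)."""
    N = rng.binomial(n, alpha, size=size)
    out = np.zeros(size, dtype=int)
    for i, Ni in enumerate(N):
        # numpy's geometric is supported on {1,2,...}: P(Y=k) = (1-gam)*gam**(k-1)
        out[i] = rng.geometric(1.0 - gam, size=Ni).sum()
    return out

s = 0.6
draws = sample_pn(100_000)
print(np.mean(s ** draws), f(s) ** n)    # the two numbers should agree closely
```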

Let (f^{(n)}_{α,θ}, n ≥ 1) be the sequence of iterates of f_{α,θ}, i.e., f^{(n)}_{α,θ}(s) = f^{(n-1)}_{α,θ}(f_{α,θ}(s)) for n ≥ 1, with f^{(0)}_{α,θ}(s) ≡ s. Formulas giving iterates of fractional linear pgf's can be found in Harris [10], page 9. Using these formulas and a suitable reparameterization, we obtain

(2.9)    f^{(n)}_{α,θ}(s) = f_{a_n,θ}(s),    |s| ≤ 1,

where

(2.10)   a_n = a_n(θ) = α^n θ̄ / [(1 - ᾱθ)^n - α^n θ],    if θ ∈ [0,1),
         a_n(1) = (1 + n(ᾱ/α))^{-1},                      if θ = 1.

Further distributional and correlation properties of a 𝒫-GWSI process are gathered in the following Proposition.

Proposition 2.3. Let (X_t, t ≥ 0) be a 𝒫-GWSI process such that var(X_0) < ∞, μ_ε = E(ε_1) < ∞ and σ_ε² = var(ε_1) < ∞. Then

i) The regression of X_t on X_{t-1} is linear:

E(X_t | X_{t-1}) = m X_{t-1} + μ_ε,    t ≥ 1;

ii) The conditional variance of X_t given X_{t-1} is also linear in X_{t-1}:

var(X_t | X_{t-1}) = ᾱ α^{-1} (1 + θ) m² X_{t-1} + σ_ε²,    t ≥ 1;

iii) For any t, k, 0 < k ≤ t, the covariance at lag k, γ_t(k) = cov(X_{t-k}, X_t), of (X_t, t ≥ 0) is

(2.13)   γ_t(k) = m^k var(X_{t-k});

iv) For any t ≥ 0,

E(X_t) = m^t E(X_0) + μ_ε Σ_{j=1}^{t} m^{j-1}

and

var(X_t) = m^{2t} var(X_0) + ᾱ α^{-1} (1 + θ) m² Σ_{j=1}^{t} m^{2(j-1)} E(X_{t-j}) + σ_ε² Σ_{j=1}^{t} m^{2(j-1)}.

Proof. Using the first and second moments of the pdf (p_k^{(l)}, k ≥ 0), given by (2.8) and whose pgf is (f_{α,θ}(s))^l, we have

E( Σ_{i=1}^{X_{t-1}} Y_i(t-1) | X_{t-1} = l ) = ml,    var( Σ_{i=1}^{X_{t-1}} Y_i(t-1) | X_{t-1} = l ) = ᾱ α^{-1} (1 + θ) m² l,

which, together with the independence of the branching and immigration components, imply i) and ii). Conditioning on X_{t-1} also gives cov(X_{t-k}, X_t) = m cov(X_{t-k}, X_{t-1}), and iterating yields

γ_t(k) = m^k var(X_{t-k}),

which is iii). Finally, iv) follows by iterating the relations in i) and ii). □
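The reparameterization in (2.9)-(2.10) can be confirmed numerically; the sketch below (under the same reconstructed form of (2.6)) composes f_{α,θ} with itself n times and compares the result with f_{a_n,θ}.

```python
import numpy as np

alpha, theta, n, s = 0.35, 0.6, 5, 0.3
abar = 1.0 - alpha

def f(a, t, s):
    """f_{a,t}(s) = 1 - a(1-s)/(1 - (1-a)*t*s), the pgf in (2.6)."""
    return 1.0 - a * (1.0 - s) / (1.0 - (1.0 - a) * t * s)

# n-fold composition f^{(n)}_{alpha,theta}(s)
x = s
for _ in range(n):
    x = f(alpha, theta, x)

# a_n from (2.10), branch theta in [0,1)
a_n = alpha**n * (1.0 - theta) / ((1.0 - abar * theta) ** n - alpha**n * theta)

print(x, f(a_n, theta, s))   # both evaluations of the n-th iterate should coincide
```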

The ACRF of a stationary 𝒫-GWSI process thus has the same form as that of a stationary Gaussian AR(1) process. A further similarity is the property of linear regression of X_t on X_{t-1}. Therefore, (X_t, t ∈ Z) can be thought of as a stationary Z+-valued AR(1) process with ACRF given by (2.15). However, unlike the stationary Gaussian AR(1), the ACRF of (X_t, t ∈ Z) is always positive, and the conditional variance of X_t given X_{t-1} is linear in X_{t-1} rather than constant.

3. THE POISSON GEOMETRIC 𝒫-GWSI PROCESS

In this section we construct and study a stationary 𝒫-GWSI process whose marginal is the Poisson geometric distribution with parameters λ > 0 and θ (cf. Patil and Joshi [16]), i.e., the distribution of a Poisson (λ) number of iid (truncated at zero) geometric (θ) summands. It results from the Poisson representation that its pdf (η_k, k ≥ 0) is given by

η_0 = e^{-λ},    η_k = Σ_{j=1}^{k} e^{-λ} (λ^j / j!) C(k-1, j-1) θ̄^j θ^{k-j},    k ≥ 1,

and that its pgf is g(s) = exp(-λ(1 - s)/(1 - θs)). By (2.2.a), stationarity with this marginal requires the immigration pgf h(s) = g(s)/g(f_{α,θ}(s)); here h(s) = exp(-λ(1 - m)(1 - s)/(1 - θs)), so that the immigration distribution is again Poisson geometric, with parameters λ(1 - m) and θ. Regression and correlation properties of the resulting process are similar to those of the models of McKenzie [11], Al-Osh and Alzaid [1] and Alzaid and Al-Osh [3].

Next, we present the limiting behaviour of (X_t, t ∈ Z) as λ → ∞. Let (X_t, t ∈ Z) be a stationary 𝒫-GWSI process with a Poisson geometric marginal distribution, and let Y_t = (X_t - E(X_t)) / (var(X_t))^{1/2}, t ∈ Z, denote the standardized process. Then the finite-dimensional distributions of (Y_t, t ∈ Z) converge, as λ → ∞, to those of the stationary standard Gaussian AR(1) process with ACRF ρ(k) = m^k.

Proof. Since both (Y_t, t ∈ Z) and the standard Gaussian AR(1) process are Markovian, it is enough to prove that the bivariate characteristic function G_{Y_{t-1},Y_t}(u, v) of (Y_{t-1}, Y_t) converges to that of the standard Gaussian AR(1) process.


We have

(3.6)    G_{Y_{t-1},Y_t}(u, v) = φ_1(e^{iu/σ_λ}, e^{iv/σ_λ}) e^{-i μ_λ (u + v)/σ_λ},

where φ_1 is the bivariate pgf of (X_{t-1}, X_t) given by (3.3), μ_λ = E(X_t) and σ_λ² = var(X_t). Expanding the logarithm of the right-hand side of (3.6) and letting λ → ∞, we obtain

G_{Y_{t-1},Y_t}(u, v) = exp{ -(1/2)(u² + v² + 2muv) + O(λ^{-1/2}) },

which implies that

(3.7)    lim_{λ→∞} G_{Y_{t-1},Y_t}(u, v) = exp{ -(1/2)(u² + v² + 2muv) }.

The right-hand side of (3.7) is the bivariate characteristic function of the stationary standard Gaussian AR(1) process with ACRF ρ(k) = m^k. □
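The closure property that makes this construction explicit can be illustrated numerically. In the sketch below, the explicit form exp(-λ(1-s)/(1-θs)) used for the Poisson geometric pgf is an assumption about the parametrization in Patil and Joshi [16]; under that assumption, h = g / (g ∘ f_{α,θ}) is verified to be again of Poisson geometric form with rate λ(1 - m).

```python
import numpy as np

alpha, theta, lam = 0.4, 0.5, 2.0
abar = 1.0 - alpha
m = alpha / (1.0 - abar * theta)          # offspring mean, as in Section 2

def f(s):                                  # offspring pgf (2.6)
    return 1.0 - alpha * (1.0 - s) / (1.0 - abar * theta * s)

def g(s, rate):                            # assumed Poisson geometric pgf with parameters (rate, theta)
    return np.exp(-rate * (1.0 - s) / (1.0 - theta * s))

s = np.linspace(-0.99, 0.99, 7)
h = g(s, lam) / g(f(s), lam)               # immigration pgf of the stationary process, h = g / g(f)
print(np.max(np.abs(h - g(s, lam * (1.0 - m)))))   # ~0: immigration is again Poisson geometric
```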

4. THE GEOMETRIC AND NEGATIVE BINOMIAL GWSI PROCESSES

Assume throughout this section that θ ∈ [0,1). We recall that the pgf of a negative binomial distribution with parameters r > 0 and p ∈ [0,1) (NB(r,p)) is

(4.1)    g(s) = ( p̄ / (1 - ps) )^r,    |s| ≤ 1.

The geometric (p) distribution is the NB(1,p) distribution. The following Lemma is useful.

Lemma 4.1. Any NB(r,p) distribution with θ ≤ p < 1 is F(θ)-self-decomposable. Moreover, with g as in (4.1),

(4.2)    h(s) = g(s) / g(f_{α,θ}(s)) = [ (1 - pᾱ - (pα + p̄ᾱθ)s) / ((1 - ᾱθs)(1 - ps)) ]^r

is the (infinitely divisible) pgf of

(4.3)    ε_1 =_d Σ_{i=1}^{N} Z_i,

where N is Poisson (λ), λ = -r log(1 - pᾱ), and the Z_i (i ≥ 1) are iid, independent of N, with common pdf (a_k, k ≥ 1) given by

(4.4)    a_k = (r / (λk)) [ (ᾱθ)^k + p^k - ( (pα + p̄ᾱθ) / (1 - pᾱ) )^k ],    k ≥ 1.

If r = 1, the following simpler representation for ε_1 exists:

(4.5)    ε_1 =_d B G(1) + (1 - B) G(2),

where B, G(1) and G(2) are independent, B is Bernoulli (pα/(p - ᾱθ)), G(1) is geometric (ᾱθ) and G(2) is geometric (p).

Proof. It is enough to prove that h(s) is a (necessarily infinitely divisible) pgf (cf. van Harn et al. [9], Corollary 6.2). Clearly, h(1) = 1. Moreover, it is easy to see from (4.2) that

(4.6)    log[ h(s)/h(0) ] = r Σ_{k≥1} (1/k) [ (ᾱθ)^k + p^k - ( (pα + p̄ᾱθ)/(1 - pᾱ) )^k ] s^k.

Hence, h(s) = exp(-λ(1 - G(s))), where G is the pgf of (a_k, k ≥ 1) of (4.4). The representation (4.3) follows immediately (cf. Feller [6]). The representation (4.5) when r = 1 follows from

(4.7)    h(s) = a/(1 - ᾱθs) + b/(1 - ps),

where a = pα(1 - ᾱθ)(p - ᾱθ)^{-1} and b = p̄ᾱ(p - θ)(p - ᾱθ)^{-1}. □

Let p ∈ [θ, 1) and r > 0. Then from Proposition 2.5 and Lemma 4.1 we can generate a stationary 𝒫-GWSI process (X_t, t ∈ Z) with a NB(r,p) marginal distribution. The corresponding immigration process is a sequence of iid rv's of the form (4.3) in general, and of the form (4.5) when r = 1.
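As a sanity check on the r = 1 case (a simulation sketch, assuming the parametrization of (2.5)-(2.6) and the mixture representation (4.5) as reconstructed above), one can start a 𝒫-GWSI chain in the geometric (p) law, draw the immigration from (4.5), and verify that the marginal distribution is preserved.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, theta, p = 0.5, 0.3, 0.6          # need theta <= p < 1 (Lemma 4.1)
abar, pbar = 1.0 - alpha, 1.0 - p
gam = abar * theta
w = p * alpha / (p - gam)                # P(B = 1) in (4.5)

def geom0(q, size=None):
    """Geometric on {0,1,2,...} with pgf (1-q)/(1-q s), i.e. NB(1,q)."""
    return rng.geometric(1.0 - q, size=size) - 1

def branch(x):
    """Sum of x iid offspring drawn from (2.5): Bernoulli(alpha) times zero-truncated geometric(gam)."""
    N = rng.binomial(x, alpha)
    return int(rng.geometric(1.0 - gam, size=N).sum())

def immigrate():
    """Immigration drawn according to the r = 1 representation (4.5)."""
    return geom0(gam) if rng.random() < w else geom0(p)

T, reps = 10, 20_000
X = geom0(p, size=reps)                  # start in the claimed stationary law, geometric(p)
for _ in range(T):
    X = np.array([branch(x) + immigrate() for x in X])

print(X.mean(), p / pbar)                # the stationary mean should remain p/(1-p)
print((X == 0).mean(), pbar)             # and P(X = 0) should remain 1-p
```

Exact preservation of the marginal follows from g_{t+1} = (g_t ∘ f) h in (2.2.a), since h was chosen as g/(g ∘ f); the simulation only confirms that the sampled representation (4.5) matches that h.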
