Computational Statistics & Data Analysis 14 (1992) 149-163
North-Holland

Estimators of time series autocorrelation

Wai-Sum Chan, National University of Singapore, Singapore 0511, Singapore
W.W.S. Wei, Temple University, Philadelphia, PA 19122, USA
Received August 1990
Revised April 1991

Abstract: A new α-trimmed estimator is proposed for the autocorrelation function in time series analysis. This estimator is designed to increase the resistance to extreme values in the observations. The performances of the new estimator and some existing estimators are compared in a simulation study. The results indicate that the new estimator is preferable to the other alternatives when the observations are contaminated by outliers. Comments on each individual estimator are also given.

Keywords: Additive outlier; Innovational outlier; ARMA model; Robust estimate; Jackknife estimate; Trimmed estimate.
1. Introduction

Suppose that an outlier-free time series Y_t has the stationary autoregressive-moving average representation

    φ(B) Y_t = θ(B) a_t,                                            (1.1)

where B is the backshift operator such that B^m Y_t = Y_{t−m}, φ(B) = 1 − φ_1 B − ⋯ − φ_p B^p, θ(B) = 1 − θ_1 B − ⋯ − θ_q B^q, φ(B) has all its roots outside the unit circle, and a_t is white noise with zero mean and constant variance σ_a² < ∞. Without loss of generality, we assume that

    E[Y_t] = 0.                                                     (1.2)

Correspondence to: Wai-Sum Chan, Department of Economics and Statistics, National University of Singapore, 10 Kent Ridge Crescent, Singapore 0511.

0167-9473/92/$05.00 © 1992 - Elsevier Science Publishers B.V. All rights reserved
For all integers k, defining the autocovariance γ(k) at lag k by γ(k) = Cov[Y_t, Y_{t+k}], we have, in particular, γ(0) = Var(Y_t). (Should E[Y_t] = μ_Y ≠ 0, then, instead of working with {Y_t}, we work with the mean-corrected process Y_t* = Y_t − μ_Y.) The general process in (1.1) can be fully characterized by the φ's, the θ's and σ_a². Alternatively, it can be represented precisely by (γ(0), γ(1), ..., γ(p+q)). Defining the autocorrelation at lag k by

    ρ(k) = γ(k) / γ(0),                                             (1.5)

the same information as in (γ(0), γ(1), ..., γ(p+q)) is contained in (γ(0), ρ(1), ..., ρ(p+q)). The complete set of autocorrelations ρ(1), ρ(2), ... is termed the autocorrelation function. Therefore, it is important to estimate the autocorrelation function from the observed time series Y_1, Y_2, ..., Y_n.

The most widely used estimator of the autocorrelation function is no doubt the standard sample autocorrelation function recommended by Box and Jenkins (1976, p. 32). This estimator is employed by most of the time series computer packages to compute the sample autocorrelations. Quenouille (1949) proposed the jackknife estimator to reduce the bias in estimation of autocorrelations. Recently, Masarotto (1987) suggested a robust estimator for the autocorrelation function. In this paper, a new α-trimmed estimator of ρ(k) is proposed. This estimator is designed to strengthen the resistance to extreme observations in the data. A Monte Carlo study is performed to compare the new estimator with the existing estimators.
2. The existing estimators

2.1. The standard estimator (SACF)

For a given time series Y_1, ..., Y_n, the standard sample autocorrelation function (SACF) is defined by

    ρ̂(k) = Σ_{t=k+1}^{n} (Y_t − Ȳ)(Y_{t−k} − Ȳ) / Σ_{t=1}^{n} (Y_t − Ȳ)²,   k = 0, 1, ..., (n − 1),   (2.1)

where Ȳ = Σ_{t=1}^{n} Y_t / n is the sample mean of the observed series. The properties of this estimator are discussed by Jenkins and Watts (1968, p. 184) and Anderson (1971, p. 488).
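As a concrete illustration (ours, not part of the paper), the SACF in (2.1) takes only a few lines of NumPy; the function name `sacf` and its interface are hypothetical:

```python
import numpy as np

def sacf(y, max_lag):
    """Standard sample autocorrelation function (2.1): every lag shares
    the overall sample mean and the lag-zero sum of squares."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    denom = np.sum(dev ** 2)
    # lag-k numerator: sum_{t=k+1}^{n} (Y_t - Ybar)(Y_{t-k} - Ybar)
    return np.array([np.sum(dev[k:] * dev[:n - k]) / denom
                     for k in range(max_lag + 1)])
```

For the toy series (1, 2, 3, 4) this gives ρ̂(0) = 1 and ρ̂(1) = 0.25, since the deviations are (−1.5, −0.5, 0.5, 1.5) with lag-one cross-products summing to 1.25 and sum of squares 5.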
W.-s. Chart, W. W.S. Wei / Estimators
of time series autocorrelation
iS1
2.2. The correlation estimator (CACF)

The correlation estimator is another possible estimator of the autocorrelation function. The CACF is defined by

    ρ̃(k) = Σ_{t=k+1}^{n} (Y_{t−k} − Ȳ_(1))(Y_t − Ȳ_(2)) / {[Σ_{t=k+1}^{n} (Y_{t−k} − Ȳ_(1))²][Σ_{t=k+1}^{n} (Y_t − Ȳ_(2))²]}^{1/2},   (2.2)

where Ȳ_(1) and Ȳ_(2) are the averages of the first n − k and the last n − k observations, respectively, and the lag k runs from 0 to n − 2. If we assume that the random vector (Y_{t−k}, Y_t)′ is bivariate normally distributed with correlation coefficient ρ(k), then the CACF is the maximum likelihood estimate of the true autocorrelation function (Jenkins and Watts, 1968, p. 182). The bias in the CACF is given by Kendall, Stuart and Ord (1983, p. 550).

2.3. The jackknife estimator (JACF)

A third method of estimating the autocorrelation function is to use the jackknife method of bias reduction (Quenouille, 1949). In this procedure, the observed time series is first divided into two halves, then the CACF is calculated for each half of the series, giving ρ̃_1(k) and ρ̃_2(k), respectively. The jackknife estimate (JACF) is given by

    ρ̂_J(k) = 2ρ̃(k) − ½[ρ̃_1(k) + ρ̃_2(k)],                          (2.3)

for k = 0, 1, ..., (n − 1)/2. It is believed that the bias in estimation of autocorrelations can be reduced by jackknifing as in equation (2.3). Further details and reviews of the jackknife method are given in Miller (1974) and Efron (1982).

2.4. The robust estimator (RACF)

Masarotto (1987) proposed a robust estimator of autocorrelations by first defining a robust sample partial autocorrelation function. Anderson (1971, pp. 183-188 and 221-223) showed that the lag k partial autocorrelation (P_k) of the general process in (1.1) can be obtained by the following recursion:

    U_{k,t} = Y_t − π_{k−1,1} Y_{t−1} − ⋯ − π_{k−1,k−1} Y_{t−k+1},      (2.4)
    V_{k,t} = Y_{t−k} − π_{k−1,1} Y_{t−k+1} − ⋯ − π_{k−1,k−1} Y_{t−1},  (2.5)
    P_k = Corr[U_{k,t}, V_{k,t}],                                       (2.6)
    π_{k,j} = π_{k−1,j} − P_k π_{k−1,k−j}   (j = 1, ..., k − 1),        (2.7)
    π_{k,k} = P_k.                                                      (2.8)

Note that the π_{k,j}'s are the coefficients of the linear one-step-ahead predictor of
Y_t based on its k past observations, while U_{k,t} and V_{k,t} are the forecast errors of Y_t and Y_{t−k} based on the observations Y_{t−1}, ..., Y_{t−k+1}. Makhoul (1981) discussed Burg's estimator of P_k,

    P̃_k = 2 Σ_{t=k+1}^{n} U_{k,t} V_{k,t} / Σ_{t=k+1}^{n} (U_{k,t}² + V_{k,t}²).   (2.9)

In order to obtain a robust version of (2.9), Masarotto (1987) employed a robust M-estimator to estimate the correlation coefficient in (2.6). The resulting sample partial autocorrelation function is given by

    P̂_k = 2 Σ_{t=k+1}^{n} w_{k,t} U_{k,t} V_{k,t} / Σ_{t=k+1}^{n} w_{k,t} (U_{k,t}² + V_{k,t}²),   (2.10)

which is essentially a weighted version of (2.9). Several appropriate choices of the weights, w_{k,t}, are given in Maronna (1976) and Masarotto (1987). After obtaining the P̂_k, we can define the robust sample autocorrelation function (RACF) through the well-known Durbin-Levinson relationship. That is,

    ρ̂_R(k) = Σ_{j=1}^{k−1} π_{k−1,j} ρ̂_R(k − j) + P̂_k [1 − Σ_{j=1}^{k−1} π_{k−1,j} ρ̂_R(j)],   (2.11)

for k = 1, ..., n − 1. The asymptotic properties of this robust estimator are discussed in Masarotto (1987).
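The correlation estimator (2.2) and its jackknifed version (2.3) can be sketched as follows; this is our illustration, not the authors' code, and the names `cacf` and `jacf` are hypothetical:

```python
import numpy as np

def cacf(y, k):
    """Correlation estimator (2.2): ordinary sample correlation of the
    n-k overlapping pairs (Y_{t-k}, Y_t), each coordinate with its own mean."""
    y = np.asarray(y, dtype=float)
    a, b = y[:len(y) - k], y[k:]          # first n-k and last n-k observations
    da, db = a - a.mean(), b - b.mean()
    return np.sum(da * db) / np.sqrt(np.sum(da ** 2) * np.sum(db ** 2))

def jacf(y, k):
    """Jackknife estimator (2.3): twice the full-series CACF minus the
    average of the CACFs of the two half-series."""
    y = np.asarray(y, dtype=float)
    h = len(y) // 2
    return 2.0 * cacf(y, k) - 0.5 * (cacf(y[:h], k) + cacf(y[h:], k))
```

For a perfectly linear series the lag-one pairs are exactly correlated, so `cacf` returns 1 and the jackknife correction leaves it unchanged.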
3. The trimmed estimator (TACF)

It is well known that the standard sample autocorrelation function in (2.1) is not a resistant statistic. A wild observation can greatly distort the whole picture of the SACF and lead to an erroneous identification of the underlying process. Since trimming is a very simple and efficient method to increase the resistance of an estimator to extreme observations, we consider the α-trimmed sample autocorrelation function (TACF). Given an observed time series Y_1, ..., Y_n, let Y_(1) ≤ Y_(2) ≤ ⋯ ≤ Y_(n) be the corresponding order statistics. The α-trimmed sample autocorrelation function ρ̂_α(k) is defined by

    [equations (3.1)-(3.3), not legible in the source]

with

    L_t^{(α)} = 0 if Y_t ≤ Y_(g) or Y_t ≥ Y_(n−g+1); 1 otherwise,   (3.4)

where g is the integer part of αn and 0 ≤ α < 0.5. Since {L_t^{(α)}} is a deterministic sequence of real numbers (0 or 1), the asymptotic theory for time series containing amplitude modulated observations can be applied to the TACF. Under the white noise model, the ρ̂_α(k), for k ≥ 1, are asymptotically independent normal random variates with asymptotic variances [n ṽ(k)]^{−1}, where

    ṽ(k) = lim_{n→∞} (1/n) Σ_{t=k+1}^{n} L_t^{(α)} L_{t−k}^{(α)}.   (3.5)

The proof of this result can easily be obtained by applying Theorem 2 of Dunsmuir and Robinson (1981). Therefore, we can approximate the standard error of the TACF by

    SE[ρ̂_α(k)] ≈ [n ĉ(k)]^{−1/2},                                  (3.6)

where

    ĉ(k) = (1/n) Σ_{t=k+1}^{n} L_t^{(α)} L_{t−k}^{(α)}.             (3.7)

Note that ĉ(k) < 1 when α > 0. Thus, SE[ρ̂_α(k)] > SE[ρ̂(k)] for the white noise model. This is expected because the information in the trimmed series is always less than that in the original series under the outlier-free situation.

The choice of α is very important for the TACF. If we choose a large value for α, the TACF is no longer an efficient estimator since ĉ(k) becomes smaller as α increases. On the other hand, if we select a very small value for α, the purpose of the estimator (to trim out possible outliers in the series) might not be fully fulfilled. In practice, we recommend using α = 1-2% for general series, α = 3-4% for medium contaminated series, and α = 6-10% for heavily contaminated series.
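Since the display defining the TACF is not legible above, the following sketch is only one plausible reading of it, under the assumption that the TACF is the SACF (2.1) computed on the amplitude-modulated series L_t^{(α)}(Y_t − trimmed mean); it is our code, not the authors' exact definition:

```python
import numpy as np

def trim_indicator(y, alpha):
    """L_t^(alpha) of (3.4): zero out the g smallest and the g largest
    order statistics, where g = int(alpha * n)."""
    y = np.asarray(y, dtype=float)
    n, g = len(y), int(alpha * len(y))
    L = np.ones(n)
    if g > 0:
        order = np.argsort(y)
        L[order[:g]] = 0.0       # Y_t <= Y_(g)
        L[order[n - g:]] = 0.0   # Y_t >= Y_(n-g+1)
    return L

def tacf(y, max_lag, alpha=0.02):
    """Assumed form of the alpha-trimmed ACF: apply (2.1) to the
    amplitude-modulated deviations L_t * (Y_t - trimmed mean)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    L = trim_indicator(y, alpha)
    ybar = np.sum(L * y) / np.sum(L)      # trimmed mean
    dev = L * (y - ybar)
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[k:] * dev[:n - k]) / denom
                     for k in range(max_lag + 1)])
```

With α = 2% and n = 100, g = 2, so four observations are zeroed out; a single wild value is always among them and therefore cannot enter any cross-product.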
4. Monte Carlo comparisons

4.1. Outlier-free case

Time series are generated from an ARMA(1, 1) process

    Y_t = φ Y_{t−1} + a_t − θ a_{t−1}.                              (4.1)
Table 1
Bias, standard error (SE), and root mean squared error (RMSE) of five estimators of the autocorrelation function based on 5000 simulations of size 50, ARMA(1, 1) process

                      (φ, θ) = (0.8, 0)            (φ, θ) = (0, −0.2)
Estimator             k=1      k=2      k=3        k=1      k=2      k=3
SACF      Bias       −0.092   −0.147   −0.160     −0.034   −0.028   −0.024
          SE          0.107    0.160    0.185      0.134    0.137    0.138
          RMSE        0.141    0.218    0.248      0.138    0.140    0.140
CACF      Bias       −0.074   −0.122   −0.154     −0.031   −0.029   −0.025
          SE          0.106    0.165    0.201      0.137    0.143    0.147
          RMSE        0.129    0.205    0.253      0.140    0.146    0.150
JACF      Bias        0.001   −0.003   −0.012      0.000    0.001    0.004
          SE          0.137    0.226    0.287      0.152    0.166    0.173
          RMSE        0.137    0.226    0.288      0.152    0.166    0.173
RACF      Bias       −0.119   −0.162   −0.186     −0.090    0.006    0.003
          SE          0.146    0.206    0.238      0.186    0.184    0.189
          RMSE        0.188    0.262    0.302      0.207    0.184    0.189
TACF      Bias       −0.169   −0.174   −0.183     −0.064   −0.026   −0.024
          SE          0.122    0.170    0.200      0.143    0.150    0.153
          RMSE        0.208    0.243    0.271      0.157    0.152    0.155
We use the IMSL subroutine RNNOA to produce the white noise, a_t, with mean zero and variance one. The first 100 observations are discarded to reduce the effects of starting values. We consider sample size n = 50; parameters (φ, θ) = (0.8, 0), (0.2, 0), (−0.5, 0), (0, 0.8), (0, −0.2) and (0.8, 0.2); and lags k = 1, 2 and 3. The bias, the standard error (SE), as well as the root mean squared error (RMSE) of each estimator of ρ(k) are computed over the 5000 replications. Huber's weights (Huber, 1981) are used to calculate the RACF. We set α = 2% for the TACF. The results for different parameter combinations are very similar and hence we only report the cases of (φ, θ) = (0.8, 0) and (0, −0.2) in Table 1.

The SACF is downward biased, especially when the lag k is large. The standard errors of the SACF increase as the lag k gets larger. However, its relative performance, in terms of root mean squared error, is the best among the five estimators. Since the SACF and CACF are asymptotically equivalent, their performances are quite similar. The jackknife estimator (JACF) effectively reduces the bias but at the expense of its standard error. The overall performance of the JACF is slightly inferior to the SACF and CACF. Finally, the RACF and TACF are designed to deal with aberrant observations. If there is no contamination in the series, these two estimators are usually inferior to the other alternatives.

4.2. Single-outlier case

Time series observations are often influenced by interruptive events such as strikes, outbreaks of wars, sudden political or economic crises, unexpected heat
or cold waves, or even unnoticed errors of typing and recording. The consequences of these interruptive events create spurious observations, which are inconsistent with the rest of the series. Such observations are usually referred to as outliers. There are many different ways to describe the effect of an outlier. In this paper, we only consider two types of outlier that may occur in a time series: the additive outlier and the innovational outlier. These two types of outlier were proposed by Fox (1972) and considered by many authors such as Hillmer (1984), Tsay (1986, 1988), Chang, Tiao and Chen (1989), Abraham and Yatawara (1989) and Abraham and Chuang (1989).

An additive outlier (AO) model is defined as

    Z_t = Y_t + ω I_t^{(T)},                                        (4.2)

where Y_t is an outlier-free time series as described in (1.1), ω represents the magnitude of the outlier, and

    I_t^{(T)} = 1 if t = T; 0 if t ≠ T,                             (4.3)

is the indicator variable representing the presence or absence of an outlier at time T. An innovational outlier (IO) model is defined as

    Z_t = Y_t + (θ(B)/φ(B)) ω I_t^{(T)}.                            (4.4)
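Both models are easy to simulate. The sketch below is ours (the names `simulate_arma11` and `add_outlier` are hypothetical); for the IO case it uses the ARMA(1, 1) ψ-weights of θ(B)/φ(B), namely ψ_0 = 1 and ψ_j = φ^{j−1}(φ − θ) for j ≥ 1:

```python
import numpy as np

def simulate_arma11(n, phi, theta, seed=0, burn=100):
    """Y_t = phi*Y_{t-1} + a_t - theta*a_{t-1}; the first `burn`
    observations are discarded, as in the paper's experiments."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(n + burn)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = phi * y[t - 1] + a[t] - theta * a[t - 1]
    return y[burn:]

def add_outlier(y, T, omega, kind, phi=0.0, theta=0.0):
    """AO model (4.2): shift only Y_T by omega.
    IO model (4.4): spread omega over t >= T through the psi-weights
    psi_0 = 1, psi_j = phi**(j-1) * (phi - theta)."""
    z = np.asarray(y, dtype=float).copy()
    z[T] += omega                      # psi_0 = 1 in both models
    if kind == "IO":
        for j in range(1, len(z) - T):
            z[T + j] += omega * phi ** (j - 1) * (phi - theta)
    return z
```

An AO changes a single observation, while an IO leaves a geometrically decaying imprint on every later observation, mirroring the discussion that follows (4.4).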
Hence, an additive outlier affects only the level of the Tth observation while an innovational outlier affects all observations Z_T, Z_{T+1}, ... beyond time T through the memory of the system described by θ(B)/φ(B).

In this section, we consider the effects of an additive or innovational outlier on the estimation of the autocorrelation function of the underlying outlier-free process. Time series are generated from models (4.2) and (4.4) with ARMA(1, 1) as the outlier-free process. We use the same six parameter combinations for (φ, θ) as in the previous section with sample size n = 50, and magnitude of the outlier ω = 0, 5, 10, 20 and ∞. The case of ω → ∞ is approximated by setting ω = 1000000. The outlier is located at T = 25. The experiment is repeated 5000 times and the bias, the standard error, as well as the root mean squared error of each estimator of ρ(1) are reported in Table 2.

We first consider the additive outlier situation. The estimators SACF, CACF and JACF are seriously downward biased. In each case, the biases increase as ω gets larger. For the limiting situation (ω → ∞), both the SACF and CACF converge to a constant. Under the additive outlier model in (4.2), Chan (1992) showed that

    lim_{ω→∞} ρ̂(k) = −(n + k) / [n(n − 1)],                        (4.5)

for 0 < k < n. For example, we have the bias of −0.821 in the SACF for (φ, θ) = (0.8, 0) from Table 2. This result agrees with equation (4.5). Since
156
W.-s. Ghan, W. W.S. Wei / Estimators of time series autocorrelation
Table 2
Bias, standard error (SE), and root mean squared error (RMSE) of five estimators of the lag one autocorrelation based on 5000 simulations of size 50, ARMA(1, 1) process with an outlier

                      Additive outlier                        Innovational outlier
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞         ω=0    ω=5    ω=10   ω=20   ω=∞
(a) (φ, θ) = (0.8, 0)
SACF      Bias       −0.092 −0.227 −0.435 −0.657 −0.821      −0.092 −0.083 −0.069 −0.055 −0.048
          SE          0.107  0.136  0.147  0.113  0.000       0.107  0.099  0.077  0.045  0.000
          RMSE        0.141  0.265  0.459  0.666  0.821       0.141  0.129  0.103  0.071  0.048
CACF      Bias       −0.074 −0.216 −0.431 −0.656 −0.821      −0.074 −0.070 −0.060 −0.050 −0.045
          SE          0.106  0.138  0.149  0.114  0.000       0.106  0.099  0.078  0.045  0.000
          RMSE        0.129  0.256  0.455  0.666  0.821       0.129  0.121  0.098  0.068  0.045
JACF      Bias        0.001 −0.218 −0.569 −0.944 −1.192       0.001  0.026  0.060  0.112  0.206
          SE          0.139  0.191  0.210  0.150  0.144       0.137  0.125  0.104  0.086  0.118
          RMSE        0.139  0.290  0.607  0.956  1.181       0.137  0.128  0.120  0.141  0.237
RACF      Bias       −0.119 −0.148 −0.150 −0.150 −0.151      −0.119 −0.080 −0.040 −0.001  0.137
          SE          0.146  0.152  0.151  0.151  0.151       0.146  0.131  0.115  0.097  0.016
          RMSE        0.188  0.212  0.213  0.213  0.213       0.188  0.153  0.121  0.097  0.138
TACF      Bias       −0.169 −0.146 −0.121 −0.121 −0.121      −0.169 −0.124 −0.047  0.004  0.039
          SE          0.122  0.130  0.121  0.121  0.121       0.122  0.112  0.084  0.043  0.005
          RMSE        0.208  0.195  0.191  0.191  0.191       0.208  0.169  0.096  0.043  0.039
(b) (φ, θ) = (0.2, 0)
SACF      Bias       −0.036 −0.096 −0.158 −0.200 −0.221      −0.036 −0.034 −0.030 −0.027 −0.025
          SE          0.139  0.133  0.108  0.067  0.000       0.139  0.131  0.104  0.064  0.000
          RMSE        0.143  0.164  0.191  0.211  0.221       0.143  0.135  0.108  0.069  0.025
CACF      Bias       −0.033 −0.095 −0.158 −0.200 −0.221      −0.033 −0.031 −0.029 −0.026 −0.025
          SE          0.142  0.135  0.108  0.067  0.000       0.142  0.133  0.105  0.064  0.000
          RMSE        0.145  0.165  0.192  0.211  0.221       0.145  0.136  0.109  0.069  0.025
JACF      Bias        0.000 −0.105 −0.214 −0.284 −0.311       0.000  0.019  0.034  0.028  0.011
          SE          0.158  0.192  0.150  0.120  0.145       0.158  0.164  0.136  0.106  0.001
          RMSE        0.158  0.201  0.262  0.309  0.343       0.158  0.165  0.140  0.109  0.011
RACF      Bias       −0.093 −0.104 −0.104 −0.104 −0.104      −0.093 −0.089 −0.086 −0.081 −0.028
          SE          0.190  0.182  0.180  0.199  0.199       0.190  0.181  0.178  0.175  0.143
          RMSE        0.212  0.210  0.208  0.209  0.209       0.212  0.202  0.199  0.193  0.145
TACF      Bias       −0.067 −0.051 −0.051 −0.051 −0.051      −0.067 −0.050 −0.049 −0.038    ·
          SE          0.147  0.148  0.148  0.148  0.148       0.149  0.146  0.142  0.129    ·
          RMSE        0.162  0.157  0.156  0.156  0.156       0.162  0.155  0.150  0.134    ·
(c) (φ, θ) = (−0.5, 0)
SACF      Bias        0.019  0.145  0.297  0.414  0.499       0.019  0.013  0.004 −0.005 −0.010
          SE          0.122  0.131  0.117  0.078  0.000       0.122  0.116  0.091  0.056  0.000
          RMSE        0.124  0.196  0.319  0.422  0.499       0.124  0.119  0.091  0.056  0.010
CACF      Bias        0.009  0.140  0.295  0.414  0.479       0.009  0.006  0.000 −0.006 −0.010
          SE          0.124  0.133  0.118  0.079  0.000       0.124  0.117  0.092  0.056  0.000
          RMSE        0.124  0.193  0.327  0.421  0.479       0.124  0.117  0.092  0.056  0.010
[The JACF, RACF and TACF rows of panel (c), and all of panels (d) and (e), are not legible in the source.]
Table 2 (continued)

                      Additive outlier                        Innovational outlier
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞         ω=0    ω=5    ω=10   ω=20   ω=∞
(f) (φ, θ) = (0.8, 0.2)
SACF      Bias        0.019  0.124  0.267  0.392  0.467       0.019  0.014  0.007  0.002 −0.001
          SE          0.101  0.108  0.103  0.074  0.000       0.101  0.096  0.076  0.046  0.000
          RMSE        0.103  0.165  0.286  0.399  0.467       0.103  0.097  0.076  0.046  0.001
CACF      Bias        0.009  0.118  0.265  0.392  0.467       0.009  0.007  0.004  0.001 −0.001
          SE          0.102  0.110  0.104  0.074  0.000       0.102  0.097  0.076  0.046  0.000
          RMSE        0.102  0.161  0.284  0.398  0.467       0.102  0.097  0.076  0.046  0.001
JACF      Bias        0.001  0.168  0.401  0.600  0.681       0.001 −0.091 −0.211 −0.330 −0.487
          SE          0.109  0.138  0.140  0.111  0.126       0.109  0.097  0.056  0.058  0.144
          RMSE        0.109  0.217  0.425  0.610  0.692       0.109  0.133  0.218  0.335  0.508
RACF      Bias       −0.025  0.000  0.002  0.002  0.003      −0.025 −0.025 −0.025 −0.025 −0.025
          SE          0.143  0.142  0.141  0.141  0.141       0.143  0.139  0.137  0.137  0.137
          RMSE        0.145  0.142  0.141  0.141  0.141       0.145  0.141  0.140  0.140  0.139
TACF      Bias        0.083  0.065  0.056  0.056  0.056       0.083  0.010  0.010  0.010  0.037
          SE          0.115  0.116  0.110  0.110  0.110       0.115  0.107  0.107  0.107  0.117
          RMSE        0.142  0.133  0.124  0.124  0.124       0.142  0.108  0.108  0.108  0.123
the SACF and CACF are asymptotically equivalent, their limiting results are the same. On the other hand, the JACF does not converge to a constant when ω → ∞. It has a small standard error but a disastrous bias in the limiting case. The results indicate that the jackknife is not a device for correcting additive outliers. The RACF successfully reduces the adverse effects of the outlier; its RMSE is usually maintained at a constant level for ω ≥ 10. Under the additive outlier model with a large ω, the contaminated observation is most likely the extremum in the series. Therefore, the TACF can totally eliminate the effects of the outlier. The root mean squared error of the TACF is usually lower than that of the RACF. In conclusion, the TACF is preferable to the other estimators when the series is contaminated by an additive outlier.

Second, we discuss the innovational outlier situation. The root mean squared errors of the SACF, CACF and JACF improve when ω increases. For ω → ∞, these estimators converge to constants. As anticipated, the root mean squared errors of the RACF stay at a certain level robustly. The innovational outlier is also not an obstacle for the TACF in the estimation of ρ(k); when ω → ∞, the TACF achieves the smallest RMSE in some cases.

It is interesting to note that none of the estimators is adversely affected by the innovational outlier. In fact, the existence of the innovational outlier actually improves the estimation of ρ(k), especially when the magnitude of the outlier is large. In order to explain this phenomenon, we further investigate the innovational outlier model in (4.4). Let Y_t = ψ(B) a_t, where ψ(B) = θ(B)/φ(B) = 1 + ψ_1 B + ψ_2 B² + ⋯. Since Y_t is a stationary time series as defined in (1.1), we have
Σ_{j=0}^{∞} ψ_j² < ∞. In model (4.4), for large ω, the observations Z_T, Z_{T+1}, ... are dominated by the effects of the outlier, i.e.,

    Z_{T+j} ≈ ω ψ_j,   j = 0, 1, 2, ....                            (4.6)
Thus, the autocorrelation structure is more clear and easy to estimate when ω is large. For example, Chan (1992) showed that

    lim_{n→∞} { lim_{ω→∞} ρ̂(k) } = ρ(k).                           (4.7)

Equation (4.7) implies that the SACF is pushed towards ρ(k) under the innovational outlier model.

4.3. Multiple-outlier case

In practice, a time series might contain several, say m, outliers of different types, and we have the following general time series outlier model

    Z_t = Y_t + Σ_{j=1}^{m} ω_j ν_j(B) I_t^{(T_j)},                 (4.8)

where Y_t = [θ(B)/φ(B)] a_t, ν_j(B) = 1 for an AO, and ν_j(B) = θ(B)/φ(B) for an IO at t = T_j. [...] The bias, the standard error, and the root mean squared error of each estimator of ρ(1) are given in Table 3.

For additive-outlier contamination, the SACF, CACF and JACF are seriously affected. The bias and RMSE of these estimators not only depend on the size of the outliers but also relate to the number of AOs in the series. For example, consider the case of ω = 10. The RMSE of the SACF is 0.627 for (AO, AO, AO); 0.352 for (AO, IO, AO); 0.231 for (IO, AO, IO); and it further slides down to 0.077 for (IO, IO, IO). In fact, the RMSEs of (IO, IO, IO) with ω > 0 are usually smaller than that of the outlier-free case (ω = 0). The results indicate that the SACF, CACF and JACF could be significantly distorted by additive outliers. On the other hand, the estimation of the autocorrelation function is not much affected by innovational outliers. Both the RACF and TACF successfully reduce the adverse effects of the outliers. The RMSE of the TACF is usually lower than that of the RACF. The overall performance of the TACF in this study is reassuring.
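As a quick numerical sanity check (ours, not part of the paper), the single additive-outlier limit (4.5) can be reproduced with the standard SACF of (2.1): one enormous AO drives the lag-k sample autocorrelation to −(n + k)/[n(n − 1)], regardless of the underlying process.

```python
import numpy as np

def sacf_lag(z, k):
    """Standard sample autocorrelation (2.1) at a single lag k."""
    z = np.asarray(z, dtype=float)
    dev = z - z.mean()
    return np.sum(dev[k:] * dev[:len(z) - k]) / np.sum(dev ** 2)

# One additive outlier at T = 25 with omega -> infinity:
# equation (4.5) says sacf_lag -> -(n + k) / (n * (n - 1)).
n, k, T = 50, 1, 25
rng = np.random.default_rng(1)
z = rng.standard_normal(n)
z[T] += 1.0e6                       # "omega = 1000000", as in the paper
limit = -(n + k) / (n * (n - 1))    # = -51/2450, about -0.0208
```

With ω = 10^6 the noise contributes only O(1/ω) relative error, so the computed SACF agrees with the limit to several decimal places.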
Table 3
Bias, standard error (SE), and root mean squared error (RMSE) of five estimators of the lag one autocorrelation based on 5000 simulations of size 50, AR(1) process (φ = 0.8) with three outliers

(a) (AO, AO, AO)
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞
SACF      Bias       −0.092 −0.385 −0.615 −0.747 −0.807
          SE          0.107  0.147  0.121  0.072  0.000
          RMSE        0.141  0.412  0.627  0.750  0.807
CACF      Bias       −0.074 −0.379 −0.614 −0.747 −0.807
          SE          0.106  0.149  0.122  0.072  0.000
          RMSE        0.129  0.407  0.625  0.750  0.807
JACF      Bias        0.001 −0.328 −0.586 −0.729 −0.792
          SE          0.137  0.194  0.131  0.084  0.000
          RMSE        0.137  0.381  0.605  0.734  0.792
RACF      Bias       −0.119 −0.212 −0.216 −0.217 −0.217
          SE          0.146  0.160  0.156  0.155  0.154
          RMSE        0.188  0.265  0.267  0.266  0.266
TACF      Bias       −0.221 −0.191 −0.122 −0.122 −0.122
          SE          0.131  0.146  0.127  0.127  0.127
          RMSE        0.257  0.240  0.176  0.174  0.176

(c) (IO, AO, IO)
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞
SACF      Bias       −0.092 −0.177 −0.219 −0.236 −0.243
          SE          0.107  0.105  0.075  0.043  0.000
          RMSE        0.141  0.205  0.231  0.240  0.243
CACF      Bias       −0.074 −0.167 −0.214 −0.232 −0.240
          SE          0.106  0.106  0.076  0.043  0.000
          RMSE        0.129  0.198  0.227  0.236  0.240
JACF      Bias        0.001 −0.155 −0.248 −0.292 −0.310
          SE          0.137  0.133  0.092  0.051  0.000
          RMSE        0.137  0.204  0.264  0.296  0.309
RACF      Bias       −0.119 −0.084 −0.011  0.052  0.168
          SE          0.146  0.126  0.096  0.067  0.001
          RMSE        0.188  0.152  0.097  0.085  0.168
TACF      Bias       −0.221 −0.159 −0.050  0.016  0.046
          SE          0.131  0.116  0.083  0.042  0.003
          RMSE        0.257  0.196  0.097  0.045  0.046
(b) (AO, IO, AO)
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞
SACF      Bias       −0.092 −0.251 −0.337 −0.372 −0.385
          SE          0.107  0.130  0.104  0.062  0.000
          RMSE        0.141  0.283  0.352  0.377  0.385
CACF      Bias       −0.074 −0.244 −0.334 −0.371 −0.385
          SE          0.106  0.131  0.104  0.062  0.000
          RMSE        0.129  0.277  0.350  0.376  0.384
JACF      Bias        0.001 −0.125 −0.185 −0.204 −0.208
          SE          0.137  0.160  0.121  0.069  0.000
          RMSE        0.137  0.203  0.221  0.215  0.208
RACF      Bias       −0.119 −0.138 −0.098 −0.059  0.117
          SE          0.146  0.143  0.127  0.111  0.023
          RMSE        0.188  0.199  0.160  0.125  0.119
TACF      Bias       −0.221 −0.155 −0.031  0.026  0.057
          SE          0.131  0.132  0.096  0.052  0.000
          RMSE        0.257  0.204  0.101  0.058  0.057

(d) (IO, IO, IO)
Estimator             ω=0    ω=5    ω=10   ω=20   ω=∞
SACF      Bias       −0.092 −0.074 −0.061 −0.055 −0.053
          SE          0.107  0.079  0.047  0.024  0.000
          RMSE        0.141  0.109  0.077  0.060  0.053
CACF      Bias       −0.074 −0.066 −0.058 −0.054 −0.052
          SE          0.106  0.079  0.047  0.024  0.000
          RMSE        0.129  0.103  0.075  0.059  0.052
JACF      Bias        0.001 −0.001 −0.005 −0.007 −0.008
          SE          0.137  0.092  0.053  0.028  0.000
          RMSE        0.137  0.093  0.054  0.028  0.008
RACF      Bias       −0.119 −0.019    ·    0.116  0.170
          SE          0.146  0.102  0.063  0.034  0.000
          RMSE        0.188  0.104  0.090  0.121  0.171
TACF      Bias       −0.221 −0.124 −0.017  0.036  0.058
          SE          0.131  0.104  0.069  0.038  0.000
          RMSE        0.257  0.161  0.071  0.052  0.058
5. Conclusion

In this paper, an α-trimmed sample autocorrelation function is proposed. The new estimator is demonstrated to be useful for estimating the autocorrelation function when there is the possibility of outliers. The proposed estimator is also useful in model identification in the presence of outliers, which is a crucial step for most of the existing outlier detection procedures.
Acknowledgements

The authors are grateful to the associate editor and two referees for their helpful comments on an earlier version of the paper.
References

Abraham, B. and Chuang, A. (1989), Outlier detection and time series modeling, Technometrics, 31, 241-248.
Abraham, B. and Yatawara, N. (1989), A score test for detection of time series outliers, Journal of Time Series Analysis, 9, 109-119.
Anderson, T.W. (1971), The Statistical Analysis of Time Series (Wiley, New York).
Box, G.E.P. and Jenkins, G.M. (1976), Time Series Analysis: Forecasting and Control (Holden-Day, San Francisco).
Chan, W.S. (1992), A note on time series model specification in the presence of outliers, Journal of Applied Statistics, 19, 117-124.
Chang, I., Tiao, G.C., and Chen, C. (1989), Estimation of time series parameters in the presence of outliers, Technometrics, 30, 193-204.
Dunsmuir, W. and Robinson, P.M. (1981), Asymptotic theory for time series containing missing and amplitude modulated observations, Sankhya, Ser. A, 43, 260-281.
Efron, B. (1982), The Jackknife, the Bootstrap and Other Resampling Plans, CBMS-NSF Regional Conference Series in Applied Mathematics, 38 (SIAM, Philadelphia).
Fox, A.J. (1972), Outliers in time series, Journal of the Royal Statistical Society, Ser. B, 34, 350-363.
Hillmer, S.C. (1984), Monitoring and adjusting forecasts in the presence of additive outliers, Journal of Forecasting, 3, 205-215.
Huber, P.J. (1981), Robust Statistics (Wiley, New York).
Jenkins, G.M. and Watts, D.G. (1968), Spectral Analysis and Its Applications (Holden-Day, Oakland).
Kendall, M., Stuart, A., and Ord, J.K. (1983), The Advanced Theory of Statistics, Vol. 3, Fourth Edition (Hafner, New York).
Makhoul, J. (1981), Lattice methods in spectral estimation, in: D.F. Findley (Ed.), Applied Time Series Analysis II, 301-326 (Academic Press, New York).
Maronna, R.A. (1976), Robust M-estimators of multivariate location and scatter, Annals of Statistics, 4, 51-67.
Masarotto, G. (1987), Robust identification of autoregressive moving average models, Applied Statistics, 36, 214-220.
Miller, R.G. (1974), The jackknife - a review, Biometrika, 61, 1-15.
Quenouille, M. (1949), Approximate tests of correlation in time series, Journal of the Royal Statistical Society, Ser. B, 11, 68-84.
Tsay, R.S. (1986), Time series model specification in the presence of outliers, Journal of the American Statistical Association, 81, 132-141.
Tsay, R.S. (1988), Outliers, level shifts, and variance changes in time series, Journal of Forecasting, 7, 1-20.