HIROSHIMA MATH. J.
14 (1984), 215-225
On the distribution of a statistic in multivariate inverse regression analysis

Yasunori FUJIKOSHI and Ryuei NISHII
(Received September 17, 1983)
§ 1. Introduction
In a multivariate inverse regression problem we are interested in making inference about an unknown q × 1 vector x = (x_1, ..., x_q)' from an observed p × 1 response vector y = (y_1, ..., y_p)'. Brown [3] has summarized various aspects of the problem. We assume that y is random, x is fixed and

(1.1)    y = α + B'x + e = Θ'(1, x')' + e,

where Θ' = [α, B']: p × (1 + q) is the matrix of unknown parameters and e is an error vector having a multivariate normal distribution N_p[0, Σ]. Further, suppose that the N informative observations on y and x have been given. When p > q, it is possible to obtain a natural point estimate for x, and to construct a confidence region for x based on a statistic which is a quadratic form in the estimate. To apply the confidence region it is necessary to give the upper percentage point of the statistic.

The purpose of this paper is to study the distribution of the statistic mentioned above. We shall derive an asymptotic expansion for the distribution function of the statistic up to the order N^{-2}, and hence for the upper percentage point of the statistic. In Section 3 we treat the distribution problem in the situation where Θ is known and Σ is unknown. We note that the distribution of the statistic in this case is essentially the same as that of a statistic in the growth curve model. The distribution has been studied by Rao [6] and Gleser and Olkin [4]. The numerical accuracy of our asymptotic approximations is checked by comparison with the exact results of Gleser and Olkin [4]. In Section 4 we treat the distribution problem in the situation where Θ and Σ are unknown. In this case a reduction of the distribution problem is given. By using the reduction and a perturbation method we shall obtain the asymptotic expansion of the distribution function of the statistic. Some formulas used in deriving the asymptotic expansions are summarized in Section 5.
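As a small illustration of the notation in (1.1), the following sketch (with hypothetical dimensions p = 3, q = 2 and arbitrary parameter values chosen only for the example) checks numerically that the mean written as Θ'(1, x')' with Θ' = [α, B'] coincides with α + B'x.

```python
import numpy as np

p, q = 3, 2                                        # hypothetical dimensions
alpha = np.array([1.0, -0.5, 2.0])                 # alpha: p x 1
B = np.array([[0.8, 0.2, -1.0],                    # B: q x p, so B': p x q
              [0.3, 1.5,  0.4]])
Theta_t = np.column_stack([alpha, B.T])            # Theta' = [alpha, B']: p x (1 + q)

x = np.array([0.7, -1.2])                          # the unknown q x 1 vector
mean_direct = alpha + B.T @ x                      # alpha + B'x
mean_theta = Theta_t @ np.concatenate(([1.0], x))  # Theta'(1, x')'
assert np.allclose(mean_direct, mean_theta)
```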
§ 2. A distribution problem
Suppose that the N independent observations on y and x following the model (1.1) have been given, and let these observations be denoted by Y = (y_1, ..., y_N)': N × p and X = (x_1, ..., x_N)': N × q. Then the observations satisfy

(2.1)    Y = j_N α' + XB + E,

where j_N = (1, ..., 1)' and E is an N × p error matrix whose rows are independently distributed as N_p[0, Σ]. The observations y and x for a new object satisfy the model (1.1). We assume that y is observed, but x is unknown. Since x is a fixed variate we may without loss of generality assume that

(2.2)    X'j_N = 0.

Further, we make the usual assumptions:

(2.3)    rank(X) = q  and  n = N − q − 1 > p.
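As an illustration of the observational model (2.1) together with the side condition (2.2) and the assumptions (2.3), the sketch below simulates such a data set; all sizes and parameter values (N = 30, p = 3, q = 2, α, B, Σ) are hypothetical choices made only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, q = 30, 3, 2                        # hypothetical sizes; n = N - q - 1 > p holds
alpha = np.array([1.0, -0.5, 2.0])        # alpha: p x 1
B = rng.normal(size=(q, p))               # B: q x p
Sigma = np.diag([1.0, 0.5, 2.0])          # error covariance Sigma: p x p

X = rng.normal(size=(N, q))
X = X - X.mean(axis=0)                    # center the design so that X'j_N = 0, as in (2.2)
assert np.linalg.matrix_rank(X) == q      # rank(X) = q, as in (2.3)

E = rng.multivariate_normal(np.zeros(p), Sigma, size=N)   # rows of E ~ N_p[0, Sigma]
Y = np.ones((N, 1)) @ alpha[None, :] + X @ B + E          # model (2.1): Y = j_N alpha' + XB + E
```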
If Θ and Σ are known, from (1.1) we may estimate x by

(2.4)    x̂_0 = (BΣ^{-1}B')^{-1}BΣ^{-1}(y − α),

which is obtained by the maximum likelihood method based on (1.1) or by minimizing

(y − α − B'x)'Σ^{-1}(y − α − B'x)

with respect to x. Since x̂_0 is distributed as N_q[x, (BΣ^{-1}B')^{-1}], this suggests using a confidence region for x based on

(2.5)    Q_0 = (x̂_0 − x)'BΣ^{-1}B'(x̂_0 − x).

The confidence region for x of confidence 1 − α is given by the ellipsoid {x | Q_0 < χ²_q(α)}, where χ²_q(α) is the upper α point of a χ²-distribution with q degrees of freedom.
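A minimal numerical sketch of (2.4) and (2.5), assuming Θ and Σ are known; all numerical values (dimensions, parameters, the confidence level 0.95) are hypothetical. It computes x̂_0 for a simulated response y, forms Q_0 at the true x, and checks whether the true x lies inside the confidence ellipsoid.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
p, q = 3, 2
alpha = np.array([1.0, -0.5, 2.0])        # known alpha
B = rng.normal(size=(q, p))               # known B: q x p
Sigma = np.diag([1.0, 0.5, 2.0])          # known Sigma
Sigma_inv = np.linalg.inv(Sigma)

x_true = np.array([0.7, -1.2])
y = alpha + B.T @ x_true + rng.multivariate_normal(np.zeros(p), Sigma)

# (2.4): x0_hat = (B Sigma^{-1} B')^{-1} B Sigma^{-1} (y - alpha)
M = B @ Sigma_inv @ B.T
x0_hat = np.linalg.solve(M, B @ Sigma_inv @ (y - alpha))

# (2.5): Q0 = (x0_hat - x)' B Sigma^{-1} B' (x0_hat - x); Q0 ~ chi^2_q at the true x
d = x0_hat - x_true
Q0 = d @ M @ d
inside = Q0 < stats.chi2.ppf(0.95, df=q)  # is the true x inside the 95% confidence ellipsoid?
```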
If Θ and Σ are unknown, it is natural to replace them by their estimates. The usual estimates of Θ and Σ based on (2.1) are Θ̂' = [α̂, B̂'] and S, given by

α̂ = (1/N) Y'j_N,    B̂ = (X'X)^{-1}X'Y,    S = (1/n)(Y − j_N α̂' − XB̂)'(Y − j_N α̂' − XB̂).
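A sketch of these estimates in code, computed from a simulated data set as above; all generating values are hypothetical, and it is assumed here that S is the residual sum-of-products matrix divided by n = N − q − 1.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, q = 30, 3, 2
n = N - q - 1
alpha = np.array([1.0, -0.5, 2.0])
B = rng.normal(size=(q, p))
Sigma = np.diag([1.0, 0.5, 2.0])
X = rng.normal(size=(N, q))
X = X - X.mean(axis=0)                              # X'j_N = 0
Y = np.ones((N, 1)) * alpha + X @ B + rng.multivariate_normal(np.zeros(p), Sigma, size=N)

# usual estimates based on (2.1); with X'j_N = 0 the fit decouples
alpha_hat = Y.mean(axis=0)                          # alpha_hat = (1/N) Y'j_N
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)           # B_hat = (X'X)^{-1} X'Y
resid = Y - np.ones((N, 1)) * alpha_hat - X @ B_hat
S = resid.T @ resid / n                             # residual sum of products divided by n
```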
We use the following statistics according as Θ is known or not:
(i) The case when Θ is known and Σ is unknown:

(2.6)    Q_1 = (x̂_1 − x)'BS^{-1}B'(x̂_1 − x),

where x̂_1 = (BS^{-1}B')^{-1}BS^{-1}(y − α).
(ii) The case when Θ and Σ are unknown:

(2.7)
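Returning to case (i), a minimal numerical sketch of the statistic Q_1 in (2.6): Θ is treated as known and only Σ is replaced by S. All numerical values are hypothetical, and for brevity S is simulated here directly as a Wishart-based estimate rather than computed from a data matrix.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p, q = 3, 2
n = 27                                    # n = N - q - 1 for a hypothetical N = 30
alpha = np.array([1.0, -0.5, 2.0])        # Theta' = [alpha, B'] treated as known
B = rng.normal(size=(q, p))
Sigma = np.diag([1.0, 0.5, 2.0])

# S plays the role of the estimate of Sigma; here n S ~ W_p(Sigma, n) by construction
S = stats.wishart.rvs(df=n, scale=Sigma, random_state=2) / n
S_inv = np.linalg.inv(S)

x_true = np.array([0.7, -1.2])
y = alpha + B.T @ x_true + rng.multivariate_normal(np.zeros(p), Sigma)

# x1_hat = (B S^{-1} B')^{-1} B S^{-1} (y - alpha); Q1 = (x1_hat - x)' B S^{-1} B' (x1_hat - x)
M1 = B @ S_inv @ B.T
x1_hat = np.linalg.solve(M1, B @ S_inv @ (y - alpha))
d = x1_hat - x_true
Q1 = d @ M1 @ d
```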