Chi-square distribution theory with applications to reliability problems

Retrospective Theses and Dissertations

1965

Chi-square distribution theory with applications to reliability problems Ahmed Hassan El Mawaziny Iowa State University

Recommended Citation: El Mawaziny, Ahmed Hassan, "Chi-square distribution theory with applications to reliability problems" (1965). Retrospective Theses and Dissertations. Paper 4037.


This dissertation has been microfilmed exactly as received

65-12,470

EL MAWAZINY, Ahmed Hassan, 1935-
CHI-SQUARE DISTRIBUTION THEORY WITH APPLICATIONS TO RELIABILITY PROBLEMS.
Iowa State University of Science and Technology, Ph.D., 1965
Mathematics

University Microfilms, Inc., Ann Arbor, Michigan

CHI-SQUARE DISTRIBUTION THEORY WITH APPLICATIONS TO RELIABILITY PROBLEMS

Ahmed Hassan El Mawaziny

A Dissertation Submitted to the Graduate Faculty in Partial Fulfillment of The Requirements for the Degree of DOCTOR OF PHILOSOPHY

Major Subject: Statistics

Approved:

Signature was redacted for privacy.

In Charge of Major Work

Signature was redacted for privacy.

Signature was redacted for privacy.

Iowa State University of Science and Technology
Ames, Iowa
1965


TABLE OF CONTENTS

                                                                        Page
I. INTRODUCTION                                                            1
   1.1 Weighted sums of chi-squares                                        1
   1.2 Inferences using theory of exponential families                     1
   1.3 Applications to reliability theory                                  2
   1.4 Approach by fiducial theory                                         3
   1.5 Large sample and numerical results                                  4
II. REVIEW OF LITERATURE                                                   5
   2.1 Weighted chi-square variates                                        5
   2.2 An inference about gamma parameters                                10
III. CONDITIONAL DISTRIBUTION THEORY                                      12
   3.1 Conditional distribution for inference about φ                     13
   3.2 Conditional distribution for inference about φ_A                   18
IV. LARGE SAMPLE DISTRIBUTION THEORY                                      22
   4.1 Standard large sample theory                                       22
   4.2 Derived fiducial asymptotic normality                              25
   4.3 Quasi-fiducial asymptotic theory                                   27
V. APPLICATIONS TO A RELIABILITY PROBLEM                                  43
   5.1 Fiducial limits for the parameters of series systems               45
   5.2 Inference based on conditional distributions                       47
   5.3 Large sample approximations                                        48
   5.4 Comparison between continuous and discrete cases                   49
VI. SUMMARY                                                               52
VII. LITERATURE CITED                                                     55
VIII. ACKNOWLEDGMENT                                                      57
IX. APPENDIX: DISTRIBUTION OF WEIGHTED SUMS OF CHI-SQUARE VARIATES        58

I. INTRODUCTION

This thesis is partly concerned with a new expression for the distribution of a weighted sum of chi-square variates, and partly with a number of inference problems involving related distribution theory. In particular the results are applicable to the problem of determining confidence limits for the reliability of a series system of k dissimilar components when each component has an exponential failure law.

1.1 Weighted sums of chi-squares.

Consider the distribution of P = Σ α_i χ_i², where the α's are positive constants and the χ_i² (i = 1, 2, ..., k) are independent random variables having chi-square distributions with even numbers of degrees of freedom. A new closed expression for the density function of P is given in theorem (A) in the appendix. Mathematical induction has been used to prove the theorem for an arbitrary number k. The distribution of Q = Σ λ_i X_i², where the λ's are real numbers (not necessarily positive) and the X_i² (i = 1, 2, ..., k) have chi-square distributions with n_i degrees of freedom (not necessarily even), has been obtained by different mathematical methods, ranging from direct integration of the joint distribution (Stacy (1962)) to inversion of the characteristic function (Imhof (1961)). Several expressions for the density function of Q are presented in chapter II. Since most of the results obtained are very lengthy, no explicit form will be given. However, the mathematical technique and the nature of each result will be stated.
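The distribution of such a weighted sum is easy to examine empirically. The following sketch, with illustrative weights and even degrees of freedom that are stand-ins rather than values from the thesis, approximates an upper tail probability of P = Σ α_i χ_i² by simulation; estimates of this kind provide a numerical check on any closed expression, including the one derived in the appendix.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative values (not from the thesis): weights alpha_i and even
# degrees of freedom for each independent chi-square variate.
alpha = np.array([0.5, 1.0, 2.0])
df = np.array([2, 4, 6])

# Simulate P = sum_i alpha_i * chi2_i and estimate an upper tail probability.
n_draws = 200_000
chi2_draws = rng.chisquare(df, size=(n_draws, len(df)))
P = chi2_draws @ alpha

x = 15.0
print("estimated P(P > %.1f) = %.4f" % (x, (P > x).mean()))
```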

1.2 Inferences using theory of exponential families.

We now turn to the problems of inference. Lehmann (1959) describes the theory of exponential families of distributions having the form C(θ) h(x) exp(Σ T_i(x) Q_i(θ)), where x and θ are vectors of observations and parameters respectively, and shows that there exist uniformly most powerful unbiased tests for certain hypotheses about θ. In an application of this theory, Lentner and Buehler (1963) have constructed a conditional distribution which arises from two independent gamma variables with scale parameters (θ_1, θ_2). This conditional distribution is appropriate for inference about the function 1/θ_1 + 1/θ_2. Chapter III is a generalization of the work just mentioned; it deals with the conditional distributions appropriate for inference about linear functions of the form γ = Σ c_i/θ_i, where the c's are real constants and the θ's are the unknown parameters of interest. Because these conditional distributions are independent of the θ's except through γ, we can find confidence limits for γ or equivalently test hypotheses about γ.
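As a point of orientation, the joint density of k independent gamma variates with known shape parameters n_i and scale parameters θ_i fits Lehmann's form with T_i(x) = x_i and Q_i(θ) = -1/θ_i; the routine verification is recorded here for convenience.

```latex
\prod_{i=1}^{k} \frac{x_i^{\,n_i-1} e^{-x_i/\theta_i}}{\Gamma(n_i)\,\theta_i^{\,n_i}}
  \;=\; \underbrace{\prod_{i=1}^{k}\frac{1}{\Gamma(n_i)\,\theta_i^{\,n_i}}}_{C(\theta)}
  \cdot \underbrace{\prod_{i=1}^{k} x_i^{\,n_i-1}}_{h(x)}
  \cdot \exp\!\Bigl(\sum_{i=1}^{k} x_i\bigl(-\tfrac{1}{\theta_i}\bigr)\Bigr),
  \qquad T_i(x) = x_i,\quad Q_i(\theta) = -\frac{1}{\theta_i}.
```

Hypotheses about linear combinations of the natural parameters -1/θ_i therefore fall under the conditional-test machinery just described.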

1.3 Applications to reliability theory.

In many reliability problems one is concerned with a system made up of several components. Such systems can be classified into three categories: the series circuit systems (all components must be working for the system to operate), the parallel circuit systems (at least one component must be working for operation of the system), and combinations of these two. The term "reliability" is a convenient abbreviation for "probability of successful operation."

It has been found that the time to failure of many types of electronic components follows an exponential distribution (that is, a gamma distribution with 2 degrees of freedom). It is well known that series circuit systems composed of different types of components with mean lives (θ_1, θ_2, ..., θ_k) have an exponentially distributed time to failure with mean life 1/φ, where φ = Σ 1/θ_j.

The results obtained in chapter III may be applied to obtain confidence limits for the parameter φ. Two numerical examples to illustrate the use of the formulas obtained are given in section (5.2).
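A minimal computation of these quantities, using made-up component mean lives rather than data from the thesis, is sketched below.

```python
import math

# Illustrative component mean lives (hours); these numbers are not from the thesis.
theta = [1200.0, 800.0, 2000.0]

# For a series system of exponential components the failure rates add:
# phi = sum(1/theta_j), the system life is exponential with mean 1/phi,
# and the reliability to a fixed time t is exp(-phi * t).
phi = sum(1.0 / th for th in theta)
mean_life = 1.0 / phi
t = 100.0
reliability_t = math.exp(-phi * t)

print(f"phi = {phi:.6f} per hour, system mean life = {mean_life:.1f} hours")
print(f"reliability at t = {t:.0f} hours: {reliability_t:.4f}")
```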



1.4 Approach to fiducial theory.

We also consider a different approach to inference about φ based on fiducial theory. This approach involves finding the joint fiducial distribution of the parameters θ_i (i = 1, 2, ..., k), and from this joint distribution, the marginal fiducial distribution of φ.

Suppose x_1, x_2, ..., x_n have the exponential distribution with density function

(1.1)    f(x_i; θ) = (1/θ) exp(-x_i/θ),    x_i > 0, θ > 0.

The sum z of the n independent observations from the above distribution may be shown to have the gamma distribution whose density function is given by

f_n(z; α) = α^n z^(n-1) exp(-αz) / Γ(n),    z > 0,

where α = 1/θ. In order to find the fiducial distribution of θ, we consider the density function f_F(θ; z), where the subscript F stands for "fiducial". Comparing f_F(θ; z) with the chi-square density shows that, under the fiducial distribution, 2z/θ is distributed as chi-square with 2n degrees of freedom.
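Because 2z_j/θ_j is fiducially distributed as chi-square with 2n_j degrees of freedom, fiducial draws of θ_j, and hence of φ for several independent components, can be generated directly. The sketch below uses hypothetical totals z_j and sample sizes n_j, not data from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical observed totals z_j and sample sizes n_j for k components.
z = np.array([950.0, 1400.0, 640.0])
n = np.array([8, 12, 6])

# Fiducial draws: 2*z_j/theta_j ~ chi-square(2*n_j), so theta_j = 2*z_j / (chi-square draw).
n_draws = 100_000
chi2 = rng.chisquare(2 * n, size=(n_draws, len(n)))
theta_fid = 2 * z / chi2

# Marginal fiducial distribution of phi = sum_j 1/theta_j
# (a weighted sum of chi-square variates).
phi_fid = (1.0 / theta_fid).sum(axis=1)
print("fiducial 95% upper limit for phi:", np.quantile(phi_fid, 0.95))
```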

II. REVIEW OF LITERATURE

2.1 Weighted chi-square variates.

Box (1954) expressed the distribution function P(Q ≤ x) as a finite double sum of terms involving the coefficients

S_js = f_j^(g_j - s)(0) / (g_j - s)!,

where f_j^(h)(0) represents the h-th derivative of f_j(y) with respect to y evaluated at y = 0.

Imhof (1961), who used a proof different from that of Box (1954), obtained the density function of Q by inverting the characteristic function. The integral form of the density function, having simple poles only, was simplified and expressed as a finite sum of differential terms similar to Box's result. Imhof pointed out that it is not easily possible to establish the general equivalence of his formula and that of Box mentioned earlier.
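Imhof's approach also lends itself to direct numerical work. The sketch below implements the standard central-case form of Imhof's inversion integral for P(Q > x); the weights and degrees of freedom are illustrative choices, not values from the thesis.

```python
import numpy as np
from scipy.integrate import quad

def imhof_upper_tail(x, lam, df):
    """P(Q > x) for Q = sum_j lam_j * chi2_{df_j} (central case),
    by numerical inversion of the characteristic function (Imhof, 1961)."""
    lam = np.asarray(lam, dtype=float)
    df = np.asarray(df, dtype=float)

    def integrand(u):
        theta = 0.5 * np.sum(df * np.arctan(lam * u)) - 0.5 * x * u
        rho = np.prod((1.0 + (lam * u) ** 2) ** (df / 4.0))
        return np.sin(theta) / (u * rho)

    integral, _ = quad(integrand, 0.0, np.inf, limit=200)
    return 0.5 + integral / np.pi

# Illustrative example: three positive weights, even degrees of freedom.
print(imhof_upper_tail(15.0, lam=[0.5, 1.0, 2.0], df=[2, 4, 6]))
```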

e. Method of sections: A different approach to the distribution of the weighted sum of chi-square variates has been given by Ruben (1960). The density function was considered as a centrally situated ellipsoid in k-dimensional space. By dividing the region by means of a series of parallel and adjoining (k-1)-flats, the distribution function of k linearly independent weighted chi-square variates is represented as an integral function of distributions of (k-1) independent variates. The relationships between the weights in the k-space and those in the (k-1)-space were obtained by the partial fraction decomposition of the characteristic function. The proof given was based on the geometrical analogue of algebraic mathematical induction. To apply Ruben's formula one has to start with the case k = 2, generate the distribution of the next higher dimension, and proceed until the required distribution is achieved. Ruben pointed out that his integral recurrence relationship may be used to extend the Grad and Solomon tables for k > 3.

f. Integration of the joint frequency approach: The distributions of weighted sums of chi-square, chi, gamma and exponential variates may be regarded as special cases of the general distribution worked out by Stacy (1962). Direct integration of the joint frequency of the variates gives the distribution function as an infinite sum of terms. Stacy, who gave an explicit solution to a special case, pointed out that there is no easy general method to evaluate those terms.

2.2 An inference about gamma parameters.

In the paper by Lentner and Buehler (1963) already mentioned in section 1.2, a closed exact form was obtained for the conditional distribution function of

w = z_1 given u, and of w given v, where z_1 and z_2 have the joint distribution of two independent gamma variates with scale parameters θ_1 and θ_2. The conditional distribution function, H(w | u; θ_1, θ_2), of w given u depends only on 1/θ_1 + 1/θ_2, as expected from the theory of exponential families. Similarly, the distribution of w for a given v depends only on 1/θ_1 - 1/θ_2. The simple transformation θ_i' = θ_i/c_i (i = 1, 2) gives the appropriate distributions for testing the hypothesis γ = c_1/θ_1 + c_2/θ_2, where c_1 and c_2 are real constants. The results obtained were applied to obtain confidence limits for γ or for the reliability to a fixed time for a series system. In the present thesis these results are extended to the case of k parameters θ_1, ..., θ_k, so that it is possible (although cumbersome) to set exact confidence limits for any linear function γ = Σ c_i/θ_i. Moreover we consider the large-sample problem, which was not treated by Lentner and Buehler (1963).
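For positive constants c_i the reduction extends to k parameters in the obvious way; writing it out in the notation of the surrounding text, as a small clarifying display:

```latex
\gamma \;=\; \sum_{i=1}^{k} \frac{c_i}{\theta_i}
       \;=\; \sum_{i=1}^{k} \frac{1}{\theta_i/c_i}
       \;=\; \sum_{i=1}^{k} \frac{1}{\theta_i'},
\qquad \theta_i' = \frac{\theta_i}{c_i},
\qquad z_i' = \frac{z_i}{c_i}\ \text{has scale parameter }\theta_i'.
```

Inference about γ thus reduces to inference about a sum of reciprocal scale parameters, with the observed sums rescaled accordingly.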

III. CONDITIONAL DISTRIBUTION THEORY

The present chapter is concerned with conditional distributions which arise from independent gamma variates x_1, x_2, ..., x_k having the joint density function

(3.1)    ∏_{i=1}^{k} f_{n_i}(x_i; θ_i),

where

f_n(z; θ) = z^(n-1) exp(-z/θ) / (Γ(n) θ^n),    z > 0.

The scale parameters (θ_1, θ_2, ..., θ_k) are the unknown parameters of interest and the n's are considered to be known. Lentner and Buehler (1963) have given the conditional distributions appropriate for inference about the sum, 1/θ_1 + 1/θ_2, and the difference, 1/θ_1 - 1/θ_2. The following is a generalization of the work just mentioned, to obtain inference about the functions φ and φ_A, where φ = Σ_{i=1}^{k} 1/θ_i and φ_A is an analogous linear combination in which some of the reciprocals 1/θ_i enter with a negative sign.

It is convenient to give the next lemma, which will be used to evaluate an integral in a later part of this chapter.

Lemma 1 (3.3). The integral over 0 < r < ∞ of a product of the form ∏_{j=1}^{k} (r + s_j)^{a_j}, multiplied by an exponential factor in r, can be expressed as a finite multiple sum over indices i_1, i_2, ..., i_k.
A(u; 4" ) is the normalization

constant obtained by dividing h]_(u) (given in equations (3.9) and (3.10)) k by

K exp The cumulative distribution function of w given u is obtained by

integrating equation (3.11) with respect to

w. If the incomplete gamma

function is defined by ,x (m+1) = I

e~^ t™ dt = ml

1.e-x Z 3=0

the cumulative form corresponding to case (l) is given

"

by

17

H (w \ u; ^) = A ^ (u; ) S Z ... 2 ^2 ^3

~|T j=2

r w ai+ I: 1.

This may be -written in terms of the incomplete gamma function as (3.12)

k r H (w Uî )= A'^ (a;4)^ r ... I TT U?j"l ' l2 I3 Ik j=2 JI

C-u)

(a]+E ij+l) where A(u;

^^ ... ^ ±2 ±3 ik

1

+ 2 ij): j=2 J

k

TT j=2

I])

Similarly, the cumulative form in case (2) can be obtained by integrating the corresponding density function ;d.th respect to

w.

Ws recall the

Uj„ > 0 is the lower limit of integration. A linear transformation of the variable of integration is required to put the result in terms of the in­ complete gamma function. The result has been worked out to be

(3.13) H(w'|U;4) = A*"l(uj t)

2

tjhere

A^ (u;(j) ) is given by

*•* f (ii)

18

A* (u;t) = Ï Ï ... 2 K+ E 1])!

îfCj) jé"

Confidence limits for up can be constructed by solving the following equation for H(w|u;t) where

0 •••» k. k

Confidence limits for the function are positive constants and

where

bg^, bg,

b^

b^^g*

^ are negative constants are k , obtained by applying the results obtained for "y, more precisely, Z ^±1^^ could be written as

where and

= b^ (i = 1, 2, m) , Cj = -bj (j = m+1, m+2,.k)« f , ©J, = 9j,/Cj. for r = 1, 2,.k. This shows that the appropriate

conditional distribution for inference about •where

w* and v'

is that of w' given v'

are as given by equations (3«15) with Zp replaced by z*

22 IV. LARGE SAI'IPIE DISTRIBUTION THEORY

In section (4.1)

WB

shall present the standard large sample theory

in application to the problem under study. In section (4.2) we consider large-sample fiducial theory.

By considering joint and marginal fiducial

distributions it is shown that the marginal fiducial distribution of has the distribution of a -weighted sum of chi-square variates and conse­ quently is asymptotically normal.

This distribution ivill be referred to

as "the derived fiducial distribution."

The rest of this chapter is devoted

to the asymptotic behavior of the conditional distribution of w given u obtained in chapter III, which arise in the theory of exponential families. The so-called "quasi-fiducial asymptotic normality" of the parameter

c|>

id.ll be obtained 'oy inversion of the standardized w. 4.1 Standard large sample theory. According to Cramer (1946), from the theoretical point of view, the method of maximum likelihood is the most important general method of esti­ mation. Let Lj(x;0j) be the likelihood function of a sample of nj values xij, X2j, ...,

j, from an exponential distribution with parameter ©j

for j = 1,2;...,k. The function Lj is defined by exp [. E

.

A

The maximum likelihood estimator 6j of the parameter 8j is given by the equation l)Lj/^@j = 0, VThich has the solution (4.1)

= % xij/nj = Zj/nj

where

Zj = Z Xj_j

I^t

Ij (@j) be defined as

»

23

Ij (Sj) = R°°^ ^ w a,

-à 92

1, dx , ^

which is easily found to equal (4.2)

Ij (8j) = nj/9j

for

j = 1, 2, ..., k.

Well-known large sample theory shows that l/ïjTÔj) (8j - 9j) is asymptotically normal with zero mean and unit variance. If the joint distribution of the statistics k-variate normal with mean values

@2*

T2,

tend to

and dispersion matrix

n"^((TT .), where O""-;are finite and n is the sample size, and if

^J

^J

P (TT , Tg,

T|^) is a continuous function with continuous partial

derivatives, then it can be shown (see for example Rao (1952)) that p(Tl, Tg,

Tj^) is asymptotically normal with mean (3(6^, ©g* •••» 8^)

and variance

(4^3)

°

^ crlj ^

^0j

.

Here we use usual definition that if the distribution of a random variable x depends on a parameter n, and if there exist two quantities and cr

p

which may depend on n such that the distribution function of the

variable (x - /^)/cr tends to the standard normal distribution function as n approaches infinity, x is said to be asymptotically normal ([^, cT" )< If TO let Dj = n Xj, the maxirdum likelihood estimator Oj is then given by = Sj/n'Xj The sbove asymptotic normality result may be applied lAiors A A /\ , (®.l) ®2î ^ l/^-i —

Tj = Gj ,

24 and /3 (@2^ 62» «

* ®k) - ^ l/8j -

^

Since the z*s are independent,( CTJ^J) is the diagonal matrix with ele­ ments 8j/ "iXj (j = 1, 2, ..., k), 0*

= - l/@j*

hence the

variance given ty equation (4.3) takes the form var

(©1, ©2» •••» ®k^ ~

^ (9^/Oi j)(-l/0j)^

= n"^ Z 1/(9^\) 3 y j = S i/(*j "j)

'

In finding approximate confidence limits based on maximum likelihood estimators, the variance is usually replaced by its estimator. Using . A2 var |3 = Ç 1/(0^ ny instead of var |3 , an approximate upper confidence limit

say, corresponding to confidence level

based on the large

sample theory can be shown to have the form (4.4) where

^^1 = % nj/zj + ^ ^

»

is chosen so that

©xp (41^) dt =

.

-00

In the next two sections w© shall substitute n=n]_ and make the transforma­ tion z^/n = t

and

Zj = n(t-rj)

for

j = 2, 3, ..., k.

Therefore equation (4.4) may be written in terms of the new variables defined above as follows (4.5)

k A 1 K ^ r 1 k ^.1
