ISSN 0735-2727, Radioelectronics and Communications Systems, 2011, Vol. 54, No. 7, pp. 373–383. © Allerton Press, Inc., 2011. Original Russian Text © I.I. Gorban, 2011, published in Izv. Vyssh. Uchebn. Zaved., Radioelektron., 2011, Vol. 54, No. 7, pp. 31–42.

Peculiarities of the Large Numbers Law in Conditions of Disturbances of Statistical Stability

I. I. Gorban

Institute of Mathematical Machines and Systems Problems of NAS of Ukraine (IMMSP NASU), Kyiv, Ukraine

Received in final form January 31, 2011

Abstract—Peculiarities of the law of large numbers under disturbances of statistical stability are studied. It is shown that the sample mean of a random sequence may converge to a finite number, diverge to positive or negative infinity, or fluctuate within a fixed interval. A series of theorems describing the law of large numbers for hyper-random sequences is proven. It is demonstrated that the sample mean of a hyper-random quantity may converge to a finite number, converge to a set of finite numbers, fluctuate within non-intersecting intervals bounded by conditional boundaries, fluctuate within the interval of unconditional boundaries, or diverge to positive or negative infinity. These differences in the convergence behavior of random and hyper-random sequences should be taken into account when studying radio-engineering devices and systems. DOI: 10.3103/S0735272711070053

INTRODUCTION

The surrounding world obeys a set of definite physical laws, among which the statistical stability of mass events holds a special place. Statistical stability of mass events is a remarkable physical phenomenon that served as the basis for probability theory, which is now widely used in many areas of science and technology, in particular in radio engineering and radio electronics. The mathematical part of modern probability theory rests on Kolmogorov's abstract axiomatics [1], while its physical part is grounded on the hypothesis that physical events are statistically stable and may be adequately described by stochastic models [2-4].

As a rule, the statistical stability hypothesis agrees well with experimental results on relatively small time, space, and space-time observation intervals, creating an illusion of perfect statistical stability. However, studies of various physical quantities and events on large observation intervals show [4-6] that, regardless of the physical nature of the studied phenomenon, significant disturbances of statistical stability occur. Such disturbances are illustrated in Fig. 1, which presents the intrinsic noise spectra of a low-frequency amplifier averaged over 2, 8, 32, 128, 256, 512, 1024, and 2048 instantaneous spectra (Fig. 1a-h, respectively).1 Statistical stability is disturbed because the surrounding world is an open system whose characteristics and parameters, including the statistical ones, change continuously. The presence of statistical stability disturbances complicates the probabilistic description of physical events, in particular of the processes taking place in high-frequency radio-engineering devices and systems.

Studies of statistical stability disturbances in physical events, and the development of effective means of representing such events adequately while accounting for these disturbances, led to the creation of a new physical-mathematical theory, the so-called theory of hyper-random events [2-4]. In probability theory the basic mathematical objects (models) are the random event, quantity, and function. In the theory of hyper-random events this role is played by the hyper-random event, quantity, and function, each represented by a set of independent random events, quantities, and functions.

1 A low-frequency amplifier was intentionally chosen for this study, because in such devices disturbances of statistical stability begin to manifest themselves on relatively small observation intervals.

Fig. 1. Intrinsic noise spectra N, dB of a low-frequency amplifier versus frequency f, kHz, averaged over 2, 8, 32, 128, 256, 512, 1024, and 2048 instantaneous spectra (a-h, respectively).


Compared with classical probability theory, the theory of hyper-random events offers improved capabilities for describing physical events under disturbed statistical stability. Probability theory, however, retains some capabilities of its own. The law of large numbers (LNL), first formulated by Bernoulli, plays an important role in both theories. Despite the three-hundred-year history of this law, its peculiarities under disturbed statistical stability remain poorly studied even for random quantities and processes, let alone hyper-random ones. This article aims to close these gaps.

STATISTICALLY UNSTABLE RANDOM SEQUENCES AND PROCESSES

We call a sequence (random sample) $X_1, X_2, \ldots$ of random quantities statistically stable [5, 6] if, as the sample volume $N$ tends to infinity, the expected value of the sample dispersion

$$D_{Y_N} = \frac{1}{N} \sum_{n=1}^{N} (Y_n - m_{Y_N})^2$$

of the fluctuation of the sample mean $Y_n = \frac{1}{n} \sum_{i=1}^{n} X_i$ ($n = \overline{1, N}$) converges to zero, where $m_{Y_N} = \frac{1}{N} \sum_{n=1}^{N} Y_n$ is the sample mean of the mean's fluctuation. Sequences that do not satisfy this condition we call statistically unstable.

Similarly, we call a random process $X(t)$ statistically stable if, as the observation time $T$ tends to infinity, the expected value of the integral $\frac{1}{T} \int_0^T (Y(t) - m_{yT})^2 \, dt$ converges to zero, where $Y(t) = \frac{1}{t} \int_0^t X(t_1) \, dt_1$ is the accumulated mean and $m_{yT} = \frac{1}{T} \int_0^T Y(t) \, dt$ is the mean of the accumulated mean. Processes that do not satisfy this condition we call statistically unstable. The type of convergence does not play an essential role here; however, to make the definition mathematically rigorous, we assume convergence in probability.
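The definition above is easy to probe numerically. The sketch below is an illustration, not taken from the paper: the drift law sin(ln n) is a hypothetical choice of a mean that never settles. It estimates the sample dispersion of the accumulated mean for an i.i.d. sequence and for a sequence with a drifting expected value:

```python
import math
import random

def fluctuation_dispersion(xs):
    """Sample dispersion D_{Y_N} of the accumulated means Y_n = (1/n)*sum(x_1..x_n)."""
    ys, total = [], 0.0
    for n, x in enumerate(xs, 1):
        total += x
        ys.append(total / n)
    m = sum(ys) / len(ys)                      # m_{Y_N}
    return sum((y - m) ** 2 for y in ys) / len(ys)

random.seed(0)
N = 20_000
stable = [random.gauss(0.0, 1.0) for _ in range(N)]        # i.i.d.: statistically stable
unstable = [random.gauss(math.sin(math.log(n + 1)), 1.0)   # mean drifts without a limit
            for n in range(N)]

d_stable = fluctuation_dispersion(stable)
d_unstable = fluctuation_dispersion(unstable)
```

For the i.i.d. sequence the dispersion of the accumulated mean is close to zero, while for the drifting sequence it stays markedly larger; in the sense of the definition, the latter sequence is statistically unstable.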
Note that a deterministic quantity $x_0$ may be considered a degenerate random quantity with a step distribution function at the point $x_0$ [1, 2], $F(x) = \mathrm{sign}[x - x_0]$, while a deterministic function $x_0(t)$ may be considered a degenerate random function with the distribution function $F(x, t) = \mathrm{sign}[x - x_0(t)]$, where

$$\mathrm{sign}[x] = \begin{cases} 0, & x \le 0, \\ 1, & x > 0. \end{cases}$$

Hence the notions of statistical stability and instability also apply to sequences of deterministic quantities and functions.

The degree of statistical instability of sequences of random quantities and random events may differ. In one case, as the sample volume $N$ tends to infinity, the expected value of the sample dispersion of the sample mean's fluctuations diverges to infinity; in another it remains bounded. The degree of instability is characterized by the upper boundary $C$ of the expected value of the fluctuation's sample dispersion.

LNL FOR STOCHASTIC EVENTS IN CONDITIONS OF DISTURBED STATISTICAL STABILITY

The well-known Chebyshev theorem, which states the law of large numbers for a sequence $X_1, \ldots, X_N$ of pairwise independent random quantities with finite dispersions and expected values $m_{x_1}, \ldots, m_{x_N}$, asserts [7] that as $N$ tends to infinity the sample mean

Fig. 2. Realization of white Gaussian noise U, V versus t, h. Fig. 3. Realization of a harmonic oscillation. Fig. 4. Accumulated mean of the white Gaussian noise. Fig. 5. Accumulated mean of the harmonic oscillation.

$$Y_N = \frac{1}{N} \sum_{n=1}^{N} X_n$$

converges in probability to the mean

$$m_{y_N} = \frac{1}{N} \sum_{n=1}^{N} m_{x_n}$$

of the expected values $m_{x_1}, \ldots, m_{x_N}$.

Note that this theorem says nothing about the convergence of either the sample mean $Y_N$ or the expected value mean $m_{y_N}$ by itself; it only states the convergence of these quantities to each other, i.e., the convergence of their difference to zero. This means that both $Y_N$ and $m_{y_N}$ may be unbounded: they may, for example, fluctuate around a constant, but they change synchronously.

It follows from the Chebyshev theorem that, when its conditions are satisfied, a sequence of random quantities is statistically stable if and only if, as the sample volume $N$ tends to infinity, the sample dispersion of the fluctuation of the expected value mean $m_{y_N}$ converges to zero. For sequences satisfying the theorem's conditions this statement may serve as a definition of statistical stability.

For a random process the Chebyshev theorem may be stated as follows. Let the cross-sections $X(t_1), X(t_2), \ldots$ of a random process $X(t)$ be pairwise independent random quantities with finite dispersions and expected values $m_x(t_1), m_x(t_2), \ldots$. Then, as the observation time $t$ tends to infinity, the process mean $Y(t) = \frac{1}{t} \int_0^t X(t_1) \, dt_1$ converges in probability to the mean $m_y(t) = \frac{1}{t} \int_0^t m_x(t_1) \, dt_1$ of the expected values.
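The distinction the theorem draws, convergence of the difference between the sample mean and the expected value mean rather than of either quantity alone, can be checked numerically. In this sketch (illustrative; the drifting mean sin(ln n) is a hypothetical choice) the expected value mean has no limit, yet the gap shrinks as N grows:

```python
import math
import random

def gap(N, seed):
    """|Y_N - m_{yN}| for independent X_n ~ Normal(sin(ln n), 1)."""
    rng = random.Random(seed)
    means = [math.sin(math.log(n)) for n in range(1, N + 1)]
    xs = [rng.gauss(m, 1.0) for m in means]
    return abs(sum(xs) / N - sum(means) / N)

g_small = gap(100, seed=1)      # modest sample: gap of order 1/sqrt(100)
g_large = gap(100_000, seed=1)  # large sample: gap of order 1/sqrt(100000)
```

Here the variance of the gap is 1/N, so the difference vanishes even though neither Y_N nor the expected value mean converges to a fixed number.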

Fig. 6. Distribution functions $F_{m^*_{xr_1}}(x)$, $F_{m^*_{xr_2}}(x)$ of the sample mean for two sample sets whose expected values $m_{xr_1}$, $m_{xr_2}$ differ. Fig. 7. The same for coinciding expected values ($m_x = m_{xr_1} = m_{xr_2}$).

If the limit of the expected value mean $m_y(t)$ exists, then as $t \to \infty$ the fluctuations of the process $Y(t)$ decay and $Y(t)$ converges to this limit. If the limit does not exist, then as $t \to \infty$ the process $Y(t)$ and the expected value mean $m_y(t)$ fluctuate synchronously, so that $Y(t) \to m_y(t)$, i.e., the value of the process $Y(t)$ converges to the value of the expected value mean $m_y(t)$.

EXAMPLES OF STATISTICALLY STABLE AND UNSTABLE RANDOM SEQUENCES AND PROCESSES

A homogeneous sequence of random quantities with bounded first two moments, and a heterogeneous sequence of random quantities with bounded first two moments whose expected value mean has a limit, are statistically stable. A periodic deterministic process is statistically stable. Realizations of two statistically stable processes, white Gaussian noise and a harmonic oscillation, are depicted in Figs. 2 and 3, while the corresponding accumulated means are presented in Figs. 4 and 5, respectively.

If, as the sample volume tends to infinity, the expected value mean of the random quantities has no limit (for example, fluctuates), the sequence is statistically unstable. A numerically non-converging series is statistically unstable.

Note that for random processes the notions of nonstationarity and statistical instability are not equivalent. Stationary processes that are ergodic with respect to the expected value [8] are statistically stable, while among nonstationary processes there are both statistically stable and unstable ones. Thus, statistically unstable random processes form a special class of nonstationary random processes.

PECULIARITIES OF THE LNL IN CONDITIONS OF DISTURBED STATISTICAL STABILITY

It is worth noting one essential circumstance: the law of large numbers for a sequence of random quantities does not guarantee that either the sample mean $m^*_x = \frac{1}{N} \sum_{n=1}^{N} X_n$ or the expected value mean $m_x = \frac{1}{N} \sum_{n=1}^{N} m_{x_n}$ has a limit. The law only states the convergence of the sample mean to the expected value mean, without requiring convergence to any fixed value. Two cases are possible: (1) the sample mean $m^*_x$ and the expected value mean $m_x$ converge to a fixed value, or (2) no such convergence takes place. Let us consider these two cases.

First, note that for a finite sample volume the quantity $m^*_x$ is random and may be described by a distribution function $F_{m^*_x}(x)$.

In the first case the limit of the expected value mean $m_x$ may be described by a step distribution function with a jump at the point $m_x$. The distribution function $F_{m^*_x}(x)$ of the sample mean $m^*_x$ converges to this step function

when $N \to \infty$ (Figs. 6, 7).

The expected values of the sample mean corresponding to different sample sets may differ (Fig. 6) or coincide (Fig. 7). They coincide when the sample set is homogeneous, or when it is heterogeneous but the expected values of its elements coincide. In the figures the curves $F_{m^*_{xr_1}}(x)$ and $F_{m^*_{xr_2}}(x)$ represent the distribution functions of the mean for two sets of different volumes, while the points $m_{xr_1}$ and $m_{xr_2}$ on the $x$ axis are the corresponding expected values.

Fig. 8. Distribution functions $F_{m^*_{xr_1}}(x)$, $F_{m^*_{xr_2}}(x)$ of the sample mean for a sequence whose expected value mean fluctuates within the interval $[m_{ix}, m_{sx}]$.

In the second case, as $N \to \infty$, the sample mean $m^*_x$ and the expected value mean $m_x$ may either diverge to positive or negative infinity or fluctuate within certain intervals. Of special interest is the case when, as $N \to \infty$, the quantities $m^*_x$ and $m_x$ are intervals. These intervals may be finite or infinite. If they are finite, there exist boundaries $m^*_{ix}$, $m^*_{sx}$ of the sample mean $m^*_x$ and $m_{ix}$, $m_{sx}$ of the expected value mean $m_x$. These boundaries may be described by step distribution functions with jumps at the points $m^*_{ix}$, $m^*_{sx}$ and $m_{ix}$, $m_{sx}$. According to the law of large numbers, $m^*_{ix}$ and $m^*_{sx}$ converge to $m_{ix}$ and $m_{sx}$, respectively (Fig. 8). The interval $[m_{ix}, m_{sx}]$ is the region within which the sample mean fluctuates as $N \to \infty$.

Thus, the sample mean of random quantities may converge to a finite number, diverge to positive or negative infinity, or fluctuate within a bounded interval. In the latter case we may speak of convergence of the sample mean to the interval [4].

LNL FOR HYPER-RANDOM SEQUENCES

The following theorem, analogous to the Chebyshev theorem above, holds for hyper-random quantities.

THEOREM 1. Let a hyper-random quantity $X = \{X/g, g \in G\}$ be a set of random quantities $X/g$ corresponding to different statistical conditions $g \in G$, with conditional expected values $m_{x/g}$ and bounded conditional dispersions $D_{x/g}$; let $m_{ix}$ and $m_{sx}$ be the lower and upper boundaries of the expected value of the hyper-random quantity $X$. From the general set of the hyper-random quantity, a hyper-random sample $\{X_1, \ldots, X_N\}$ of volume $N$, with elements mutually independent under all conditions, is obtained under uncontrolled changing statistical conditions, and the hyper-random sample mean $m^*_x = \frac{1}{N} \sum_{n=1}^{N} X_n$ is calculated from it.

Then, as the sample volume tends to infinity ($N \to \infty$), the hyper-random sample mean $m^*_x$ converges in probability to the set $m_x = \{m_{x/\vec{g}}, \vec{g} \in \vec{G}\}$ of means $m_{x/\vec{g}} = \frac{1}{N} \sum_{n=1}^{N} m_{x/g_n}$ of the conditional expected values $m_{x/g_1}, \ldots, m_{x/g_N}$ of the random quantities $X/g_1, \ldots, X/g_N$ corresponding to all possible condition sequences $g_n \in G$, $n = \overline{1, N}$. At the same time, the lower and upper boundaries of the sample mean converge in probability to the lower and upper boundaries of the expected value of the hyper-random quantity $X$:


$$\lim_{N \to \infty} P\left\{ \left| \inf_{\vec{g} \in \vec{G}} m^*_x - m_{ix} \right| > \varepsilon \right\} = 0, \qquad \lim_{N \to \infty} P\left\{ \left| \sup_{\vec{g} \in \vec{G}} m^*_x - m_{sx} \right| > \varepsilon \right\} = 0, \quad (1)$$

where $P(z)$ is the probability of the event $z$ and $\varepsilon$ is an arbitrarily small positive number.

To prove the theorem, consider a random sample $\{X/g_1, \ldots, X/g_N\}$ obtained for a fixed sequence of conditions $(g_1, \ldots, g_N) \in \vec{G}$. By the Chebyshev theorem for random quantities, the sample mean $m^*_{x/\vec{g}} = \frac{1}{N} \sum_{n=1}^{N} X/g_n$ calculated from this sample converges in probability, as the sample volume $N$ tends to infinity, to the mean of the conditional expected values $m_{x/\vec{g}}$:

$$\lim_{N \to \infty} P\{ | m^*_{x/\vec{g}} - m_{x/\vec{g}} | > \varepsilon \} = 0.$$

Convergence of $m^*_{x/\vec{g}}$ to $m_{x/\vec{g}}$ for every $\vec{g} \in \vec{G}$ means uniform convergence in probability of $m^*_{x/\vec{g}}$ to $m_{x/\vec{g}}$ with respect to $\vec{g}$, and hence convergence in probability of the hyper-random sample mean $m^*_x = \{m^*_{x/\vec{g}}, \vec{g} \in \vec{G}\}$ to the set $m_x = \{m_{x/\vec{g}}, \vec{g} \in \vec{G}\}$.

For any fixed sequence of conditions and $N \to \infty$, the sample mean $m^*_{x/\vec{g}}$ is bounded by the interval $[m^*_{ix}, m^*_{sx}]$, where $m^*_{ix} = \inf_{\vec{g} \in \vec{G}} m^*_{x/\vec{g}}$, $m^*_{sx} = \sup_{\vec{g} \in \vec{G}} m^*_{x/\vec{g}}$, while the expected value mean $m_{x/\vec{g}}$ is bounded by the interval $[m_{ix}, m_{sx}]$. The lower and upper boundaries $\inf_{\vec{g} \in \vec{G}} m^*_x$ and $\sup_{\vec{g} \in \vec{G}} m^*_x$ of the sample mean then correspond to the minimal and maximal values $m_{ix}$ and $m_{sx}$ of the expected value mean, i.e., equality (1) holds.

It is easy to check that the known modifications of the law of large numbers for sequences of random quantities admit generalizations to hyper-random sequences. A particular case is the theorem that gives necessary and sufficient conditions for convergence in probability of such sequences. For a heterogeneous sample of hyper-random quantities it is stated as follows.

THEOREM 2. Let $X_1, \ldots, X_N$ be a heterogeneous hyper-random sample, $m^*_{ix}$, $m^*_{sx}$ the boundaries of the hyper-random sample mean $m^*_x = \frac{1}{N} \sum_{n=1}^{N} X_n$, and $m_{ix}$, $m_{sx}$ the boundaries of the hyper-random quantity $m_x = \{m_{\vec{x}/\vec{g}}, \vec{g} \in \vec{G}\}$, which is the set of means $m_{\vec{x}/\vec{g}} = \frac{1}{N} \sum_{n=1}^{N} m_{x_n/g_n}$ of the conditional expected values $m_{x_1/g_1}, \ldots, m_{x_N/g_N}$ of the random quantities $X_1/g_1, \ldots, X_N/g_N$.

Then the necessary and sufficient condition for convergence in probability of the sample mean $m^*_x$ to the set $m_x$ of expected value means, and of the boundaries $m^*_{ix}$, $m^*_{sx}$ of this sample mean to the corresponding boundaries $m_{ix}$, $m_{sx}$, as the sample volume $N$ tends to infinity, is the convergence to zero, for all $\vec{g} \in \vec{G}$, of the quantities


$$M\left[ \frac{\mathring{m}^{*2}_{x/\vec{g}}}{1 + \mathring{m}^{*2}_{x/\vec{g}}} \right], \quad (2)$$

where $\mathring{m}^*_{x/\vec{g}} = \frac{1}{N} \sum_{n=1}^{N} (X_n/g_n - m_{x_n/g_n})$ is the mean of the centered conditional random quantities $X_1/g_1, \ldots, X_N/g_N$. The proof of Theorem 2 is similar to that of Theorem 1.

Fig. 9. Distribution functions $F_{m^*_{\vec{x}_1/\vec{g}_1}}(x)$, $F_{m^*_{\vec{x}_2/\vec{g}_2}}(x)$ of the mean for two sample sets when the conditional expected values densely fill the interval $[m_{ix}, m_{sx}]$ (dark area). Fig. 10. The degenerate case $m_x = m_{\vec{x}_1/\vec{g}_1} = m_{\vec{x}_2/\vec{g}_2}$.

Note that the limits of the conditional expected value means $m_{\vec{x}/\vec{g}}$ for fixed $\vec{g} \in \vec{G}$ may or may not exist. If they exist for all $\vec{g} \in \vec{G}$, the hyper-random sample mean $m^*_x$ converges to a set of deterministic quantities $m_x$. The absence of a limit for some $\vec{g} \in \vec{G}$ means that the corresponding conditional expected value mean either diverges to positive or negative infinity or converges to an interval.

If the limits of the conditional expected value means $m_{\vec{x}/\vec{g}}$ exist for all $\vec{g} \in \vec{G}$, the set $m_x$ may be described by a set of conditional step distribution functions with jumps at the points $m_{\vec{x}/\vec{g}}$. The boundaries of this set, $m_{ix}$ and $m_{sx}$, are described by step distribution functions with jumps at the points $m_{ix}$, $m_{sx}$, which coincide with the expected values $m_{Sx}$, $m_{Ix}$ of the boundaries ($m_{ix} = m_{Sx}$, $m_{sx} = m_{Ix}$) (Fig. 8). In Fig. 9 the curves $F_{m^*_{\vec{x}_1/\vec{g}_1}}(x)$ and $F_{m^*_{\vec{x}_2/\vec{g}_2}}(x)$ represent the distribution functions of the mean for two different finite sample sets under conditions $\vec{g}_1$ and $\vec{g}_2$, and the points $m_{\vec{x}_1/\vec{g}_1}$, $m_{\vec{x}_2/\vec{g}_2}$ on the $x$ axis are the corresponding expected values.

When the conditional expected values $m_{\vec{x}/\vec{g}}$, $\forall \vec{g} \in \vec{G}$, densely fill the interval $[m_{ix}, m_{sx}]$, the hyper-random sample mean converges, as $N \to \infty$, to the interval quantity $[m_{ix}, m_{sx}]$ depicted in Fig. 9 by the dark area.

It is interesting to consider the degenerate case when the conditional expected values $m_{\vec{x}/\vec{g}}$, $\forall \vec{g} \in \vec{G}$, are equal ($m_{\vec{x}/\vec{g}} = m_x$). In this case the boundaries of the expected values coincide ($m_{ix} = m_{sx} = m_x$) and, as the sample volume tends to infinity, the sample mean of the hyper-random quantity converges to the deterministic quantity $m_x$ (Fig. 10).

If all conditional expected value means $m_{\vec{x}/\vec{g}}$ converge (as $N \to \infty$) to intervals (Fig. 11), the hyper-random quantity $m_x$ is a multi-interval, i.e., a set of finite intervals [9], depicted in the figure by dark areas. If separate intervals within the multi-interval overlap, the multi-interval $m_x$ degenerates into the interval $[m_{ix}, m_{sx}]$. The sample mean then fluctuates, as $N \to \infty$, within this interval without crossing its boundaries.
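The degeneration of a multi-interval into a single interval when its component intervals overlap is the usual union-of-intervals computation. A small sketch (the function name is ours):

```python
def merge_intervals(intervals):
    """Union of finite intervals (lo, hi): returns disjoint components sorted
    by lower bound. A single component means the multi-interval has
    degenerated into one interval [m_ix, m_sx]."""
    merged = []
    for lo, hi in sorted(intervals):
        if merged and lo <= merged[-1][1]:
            # Overlaps the previous component: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return merged

disjoint = merge_intervals([(0.0, 1.0), (2.0, 3.0)])  # a true multi-interval
single = merge_intervals([(0.0, 2.0), (1.5, 3.0)])    # degenerates to [0, 3]
```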


Fig. 11. Distribution functions $F_{m^*_{\vec{x}_1/\vec{g}_1}}(x)$, $F_{m^*_{\vec{x}_2/\vec{g}_2}}(x)$ of the mean for the case when the conditional expected value means converge to intervals; the dark areas form a multi-interval.
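Theorem 1's claim, that conditional sample means converge to the set of conditional expected value means while their infimum and supremum converge to m_ix and m_sx, can be illustrated with a toy two-condition model (the conditions g1, g2 and their means -1 and 2 are hypothetical values of ours):

```python
import random

random.seed(3)

cond_means = {"g1": -1.0, "g2": 2.0}      # conditional expected values m_{x/g}
N = 50_000

sample_means = {}
for g, m in cond_means.items():
    # Sample set of volume N observed under the fixed condition g
    sample_means[g] = sum(random.gauss(m, 1.0) for _ in range(N)) / N

m_ix_est = min(sample_means.values())     # approaches m_ix = -1
m_sx_est = max(sample_means.values())     # approaches m_sx = 2
```

The hyper-random sample mean thus converges not to a number but to the set {-1, 2}, and its boundaries recover the expected value boundaries of the hyper-random quantity.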

Thus, as $N \to \infty$ the sample mean of a hyper-random quantity may converge to a fixed value, converge to a set of fixed values, fluctuate within the interval $[m_{ix}, m_{sx}]$, or diverge to positive or negative infinity. The difference between the types of quantity to which the sample means of random and hyper-random quantities converge plays an important role in applications.

THEOREM ON CONVERGENCE OF BOUNDARY ESTIMATES FOR A HYPER-RANDOM SAMPLE MEAN

THEOREM 3. Let $X_1, \ldots, X_N$ be a generally heterogeneous hyper-random sample, and $m_{ix}$, $m_{sx}$ the boundaries of the hyper-random quantity $m_x = \{m_{\vec{x}/\vec{g}}, \vec{g} \in \vec{G}\}$, which is the set of means $m_{\vec{x}/\vec{g}} = \frac{1}{N} \sum_{n=1}^{N} m_{x_n/g_n}$ of the conditional expected values $m_{x_1/g_1}, \ldots, m_{x_N/g_N}$ of the random quantities $X_1/g_1, \ldots, X_N/g_N$, and let the quantities described by expression (2) converge to zero for all $\vec{g} \in \vec{G}$. Let $L \ge 2$ non-intersecting hyper-random sample sets $(X_{11}, \ldots, X_{N1}), \ldots, (X_{1L}, \ldots, X_{NL})$, each of volume $N$, be formed from the general set under uncontrolled changing statistical conditions, and assume that as $L \to \infty$ and $N \to \infty$ the corresponding hyper-random sample means

$$m^*_{x1} = \frac{1}{N} \sum_{n=1}^{N} X_{n1}, \quad \ldots, \quad m^*_{xL} = \frac{1}{N} \sum_{n=1}^{N} X_{nL}$$

densely fill the interval $[m_{ix}, m_{sx}]$. Then, under unbounded growth of the number of sample sets and of the volume of each set, the boundary estimates of the sample mean

$$m^*_{ix} = \inf_{l=\overline{1,L}} m^*_{xl}, \qquad m^*_{sx} = \sup_{l=\overline{1,L}} m^*_{xl} \quad (3)$$

converge in probability to the corresponding expected value boundaries $m_{ix}$, $m_{sx}$ of the hyper-random quantity $m_x$:

$$\lim_{L \to \infty} \lim_{N \to \infty} P\{ | m^*_{ix} - m_{ix} | > \varepsilon \} = 0, \qquad \lim_{L \to \infty} \lim_{N \to \infty} P\{ | m^*_{sx} - m_{sx} | > \varepsilon \} = 0.^2$$

2 V. N. Tutubalin drew the author's attention to the necessity of including the condition of dense filling of the interval by the sample means in the statement of the theorem.
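Theorem 3 suggests a practical recipe: collect many sample sets, compute each set's mean, and take the extremes as boundary estimates. A sketch under an illustrative model of ours, where the uncontrolled conditions have expected values filling [-1, 1]:

```python
import math
import random

random.seed(4)

# L sample sets, each of volume N, collected under uncontrolled conditions
# whose expected values drift over [m_ix, m_sx] = [-1, 1] (illustrative model;
# sin(7*l) fills that range densely as l varies).
L, N = 200, 2_000
cond_means = [math.sin(7.0 * l) for l in range(L)]
set_means = [sum(random.gauss(m, 1.0) for _ in range(N)) / N for m in cond_means]

m_ix_est = min(set_means)   # boundary estimates (3)
m_sx_est = max(set_means)
```

With growing L and N the extremes of the per-set means approach the expected value boundaries -1 and 1, as the theorem asserts.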


The proof of the theorem is as follows. According to Theorem 2, for any $l$th sample set the boundaries of the hyper-random sample mean $m^*_{xl}$ converge in probability, as the set's volume tends to infinity, to the expected value boundaries $m_{ix}$, $m_{sx}$ of the hyper-random quantity $m_x$. This means that the domain of values of a hyper-random sample mean $m^*_{xl}$ is bounded by an interval that converges to the interval $[m_{ix}, m_{sx}]$. Since, as $L \to \infty$ and $N \to \infty$, the sample means densely fill the interval $[m_{ix}, m_{sx}]$ and expressions (3) hold, under unbounded growth of the number of sample sets and of each set's volume the boundary estimates $m^*_{ix}$, $m^*_{sx}$ of the sample mean converge in probability to the boundaries $m_{ix}$, $m_{sx}$.

As follows from the theorem, consistent estimates of the boundaries of the expected value of a hyper-random quantity may be obtained by calculating the sample means of the sample sets and then computing the sought boundaries from them. It is easy to check that consistent boundary estimates $m^*_{ixn} = \inf_{l=\overline{1,L}} m^*_{xln}$, $m^*_{sxn} = \sup_{l=\overline{1,L}} m^*_{xln}$ of initial moments of any order $n$ may be calculated in the same way, where $m^*_{xln}$ is the estimate of the $n$th-order initial moment corresponding to the $l$th sample set. Note that the boundary estimates of the central moments cannot be obtained in this way, because the required information on the estimates of the conditional expected values is absent.

THEOREM SIMILAR TO THE BERNOULLI THEOREM

THEOREM 4. Let a series of $N$ experiments be conducted under uncontrolled changing statistical conditions. In each experiment a hyper-random event $A = \{A/g, g \in G\}$, represented by a set of random events $A/g$ under conditions $g \in G$, may occur. The probability of the random event $A/g$ under fixed conditions $g \in G$ equals $p_{a/g}$; the lower and upper probability boundaries of the hyper-random event $A$ equal $P_{Ia}$, $P_{Sa}$, respectively. The occurrence frequency of the event $A$ in the series of experiments is $X = N_a / N$, where $N_a$ is the number of experiments in which $A$ occurred. This frequency is a hyper-random quantity represented by a set of random quantities $X/g$ ($g \in G$): $X = \{X/g, g \in G\}$. Then the boundaries $F_{Ix}(x)$, $F_{Sx}(x)$ of the distribution function $F_x(x)$ of the frequency $X$ converge in probability, as $N \to \infty$, to step functions with jumps at the points $P_{Ia}$, $P_{Sa}$.

The proof is based on Theorem 1. The hyper-random event $A$ may be regarded as a hyper-random quantity $X$ that equals unity when the event $A$ occurs and zero otherwise. The conditional expected value $m_{x/g}$ of the random quantity $X/g$ equals $p_{a/g}$, while the dispersion $D_{x/g} = (1 - p_{a/g})^2 p_{a/g} + (0 - p_{a/g})^2 (1 - p_{a/g}) = p_{a/g}(1 - p_{a/g})$ is bounded. The expected value boundaries of the hyper-random quantity $X$ equal $P_{Ia}$, $P_{Sa}$, and the sample mean of $X$ is the occurrence frequency $X = N_a / N$ of the event $A$. The stated theorem then follows from Theorem 1.
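Theorem 4 can be illustrated numerically: the frequency observed under a fixed condition converges to that condition's probability, so under changing conditions the observed frequencies keep ranging over [P_Ia, P_Sa] instead of converging to a single value. A sketch with two hypothetical conditions (the probabilities 0.2 and 0.7 are ours):

```python
import random

random.seed(5)

# Hyper-random event A with conditional probabilities between P_Ia = 0.2 and
# P_Sa = 0.7; the uncontrolled condition changes from block to block.
P_I, P_S = 0.2, 0.7
n_per_block = 10_000
freqs = []
for block in range(100):
    p = P_I if block % 2 == 0 else P_S       # condition in effect for this block
    n_a = sum(random.random() < p for _ in range(n_per_block))
    freqs.append(n_a / n_per_block)

lo, hi = min(freqs), max(freqs)              # span of the observed frequencies
```

The per-block frequencies pile up near the two conditional probabilities, and the overall spread of observed frequencies recovers the interval [P_Ia, P_Sa].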
As follows from the theorem, as the number of experiments increases ($N \to \infty$) the occurrence frequency of the event $A$ remains within the interval $[P_{Ia}, P_{Sa}]$ regardless of the number of conditions in $G$ (provided $G$ contains more than one condition); in this case the frequency does not converge to any specific value.

CONCLUSIONS

Peculiarities of the law of large numbers under disturbed statistical stability have been studied.


It has been shown that the sample mean of a random sequence may converge to a finite number, diverge to positive or negative infinity, or fluctuate within a fixed interval. A series of theorems describing the law of large numbers for hyper-random sequences has been proven. It has been demonstrated that the sample mean of a hyper-random quantity may converge to a finite number, converge to a set of finite numbers, fluctuate within non-intersecting intervals bounded by conditional boundaries, fluctuate within the interval of unconditional boundaries, or diverge to positive or negative infinity. These differences in the convergence behavior of random and hyper-random sequences should be taken into account when studying radio-engineering devices and systems.

REFERENCES

1. A. N. Kolmogorov, Basic Notions of Probability Theory (ONTI, Moscow, 1936; 1974) [in Russian].
2. I. I. Gorban, Theory of Hyper-Random Events (IPMMS NANU, Kyiv, 2007) [in Russian], http://ifsc.ualr.edu/jdberleant/intprob/.
3. I. I. Gorban, "Hyper-Random Phenomena: Definition and Description," ITA 15, No. 3, 203 (2008).
4. I. I. Gorban, Theory of Hyper-Random Events: Physical and Mathematical Basics (Naukova Dumka, Kyiv, 2011) [in Russian].
5. I. I. Gorban, "Disturbances of Statistical Stability in Physical Processes," Matematicheskie Mashiny i Sistemy, No. 1, 171 (2010).
6. I. I. Gorban, "Disturbance of Statistical Stability," Information Models of Knowledge (ITHEA, Kiev-Sofia, 2010), pp. 398-410.
7. B. V. Gnedenko, Lectures on Probability Theory (IFML, Moscow, 1988) [in Russian].
8. I. I. Gorban, Probability Theory and Mathematical Statistics for Scientific Workers and Engineers (IPMMS NANU, Kyiv, 2003) [in Ukrainian], http://www.immsp.kiev.ua/perspages/gorban_i_i/index.html.
9. S. P. Sharyi, Finite-Dimensional Interval Analysis (Institute of Computational Technologies, 2010) [in Russian], http://www.nsc.ru/interval.
