On Queues with Interarrival Times Proportional to Service Times

Israel Cidon†
Sun Microsystems Labs Inc., Mountain View, CA 94043-1100

Asad Khamisy‡
Department of Electrical Engineering, Technion, Haifa 32000, Israel

Roch Guerin
IBM T.J. Watson Research Center, Yorktown Heights, NY 10598

Moshe Sidi§
Department of Electrical Engineering, Technion, Haifa 32000, Israel

Abstract

We analyze a family of queueing systems where the interarrival time I_{n+1} between customers n and n + 1 depends on the service time B_n of customer n. Specifically, we consider cases where the dependency between I_{n+1} and B_n is a proportionality relation, and B_n is an exponentially distributed random variable. Such dependencies arise in the context of packet-switched networks which use rate policing functions to regulate the amount of data that can arrive to a link within any given time interval. These controls result in significant dependencies between the amount of work brought in by customers/packets and the time between successive customers. The models developed in the paper and the associated solutions are, however, of independent interest and are potentially applicable to other environments. Several scenarios, which consist of adding an independent random variable to the interarrival time, allowing the proportionality to be random, and the combination of the two, are considered. In all cases we provide expressions for the Laplace-Stieltjes Transform (LST) of the waiting time of a customer in the system. Numerical results are provided and compared to those of an equivalent system without dependencies.

† This work was done while at the IBM T.J. Watson Research Center. Also: Department of Electrical Engineering, Technion, Haifa 32000, Israel.
‡ Part of the work was done while visiting the IBM T.J. Watson Research Center.
§ Part of the work was done while visiting the IBM T.J. Watson Research Center.

1 Introduction

This paper deals with the analysis of a family of queueing systems, where the interarrival time between two customers depends on the service time of the first customer. The motivation for studying these queueing systems originated in the context of high-speed packet-switched networks. The issue of interest was the effect of input policing functions that have been proposed to control the flow of packets in such networks. The basic goal of these policing functions is to ensure adequate network performance by regulating the amount of data that can arrive to a link within any given time interval. These controls result in significant dependencies between the amount of work brought in by packets and the time between the arrival of successive packets. Such dependencies have a significant impact on system performance [1, 2, 3, 4, 5], and the reader is referred to [5, 6] for discussions of this topic and reviews of related works. While numerous previous papers have studied the effect of many different dependencies in queueing systems, very little work seems to have been done on the type of dependencies that are the focus of this paper. In particular, the case where the service time of a customer depends on the time since the previous arrival has been thoroughly studied [1, 2, 4, 6, 7, 8, 9, 10, 11]. In contrast, this paper addresses the converse problem, where the time to the next arrival depends on the service time of the arriving customer. To the best of our knowledge, previous work on this problem has been essentially limited to the study of general conditions for either stability [12] or finite moments of the busy period [13]. The main contribution of this paper is, therefore, to analyze the waiting time of a customer in a single-server FIFO queue, when the interarrival time I_{n+1} between customers n and n + 1 depends on the service time B_n of customer n.
Specifically, we consider cases where the dependency between I_{n+1} and B_n is a proportionality relation, and B_n is an exponentially distributed random variable. The solution method is based on the application of the standard Wiener-Hopf method, e.g., see [18]. However, we exploit the particular structure of the system to provide a computationally much more efficient solution than is available by applying standard techniques. This enables us to study a number of configurations which provide useful insight into the effect of new control mechanisms being used in high-speed networks. Specifically, some policing functions now used in packet-switched networks introduce the type of proportional dependency mentioned above.

[Figure 1: Workload in queue with interarrival time proportional to service time; the workload W(t) is plotted versus time t for the case I_{n+1} = B_n.]

For instance, let us illustrate how a simple spacer controller that is used to limit the peak rate at which a source can generate data into a network (see [14, 15, 16]) introduces such dependencies. The enforcement of a maximum rate is achieved by requiring that after sending a packet of size B, a space of duration B/R be inserted before the next packet can be sent. The rate R is then the maximum allowable rate for the source. (Note that the existence of a maximum network packet size is assumed here.) This rate R is typically equal to the source peak rate, but can be set to a lower value when low-speed links are present in the connection's path [14, 15] or if the network traffic has to be smoothed [17]. Assuming that the above spacer is saturated by a source of rate R whose traffic is fed to a link of speed C, interarrival and service times at the link are then proportional, with I_{n+1} = αB_n and α = C/R. As shown in Fig. 1, which plots the evolution of the workload at the network link for the extreme case α = 1, i.e., equal source and link rates (stability requires α ≥ 1), the analysis of this simple case is of little interest. However, there exist a number of extensions to this basic model which make it non-trivial, although still tractable, and more important, useful in modeling actual systems. The simplest extension consists of adding an independent random variable to the interarrival time. The presence of such a random component in the interarrival time allows us to model more accurately how the traffic generated by a spacer controller arrives at an internal

network link. First, such a model can capture the effect of interactions between packets from a given source and other traffic streams inside the network. In particular, the gaps that the spacer initially imposes between packets are modified according to the different delays that consecutive packets observe through the network. The arrival process at a link can then be modeled as consisting of a deterministic component (the spacing imposed by the spacer at the network access), to which a random network jitter has been added. Second, the addition of a random component also allows us to relax the assumption of a saturated spacer queue, since it can be used to model the time between packets that arrive to the spacer. Finally, another useful application is when the spacer itself randomizes the gaps between successive packets. This randomization in the spacer may be useful to avoid correlation between traffic streams of distinct sources. In particular, it helps to prevent (malicious) sources from harming network performance by cooperating to generate a large burst of data into the network. Another extension of the basic model is to allow the proportionality constant to be itself a random variable. Specifically, we consider cases where the proportionality factor is randomly chosen from a finite set of values. This allows the modeling of a generalized spacer, where the factor used to compute the enforced spacing is allowed to vary. For example, the arrival of a high-priority packet that is sensitive to access delay could be handled by allowing earlier transmission. The interarrival times at the network link would then depend on both packet priorities and the size of the previous packet. This model also has applications to other environments, such as manufacturing or repair station systems. The organization of the paper is as follows. In Section 2 we introduce the notations and models studied in the paper.
In Section 3 we derive the LST W(s) of the waiting time in steady state for the simple case of deterministic proportional dependencies with an additional, positive, exponentially distributed and independent jitter. Appendix A shows how this can be extended to allow for both positive and negative jitters. The case where the proportionality factor is allowed to be itself a random variable is treated in Section 4. The analysis of this system requires the use of the spectral analysis method typical of G/G/1 queues. We provide some numerical examples that compare the performance of the system with dependencies to that of an equivalent system without dependencies. Section 5 covers the addition of a random delay similar to that of Section 3 to the scenario of Section 4. Finally, Section 6 summarizes the findings of this paper.

2 The Model

2.1 General

We consider queueing systems in which the interarrival time between the n-th and the (n + 1)-st customers (packets) depends on the service time of the n-th customer. We focus on proportional dependency, which is very natural in packet-switching networks. Here the interarrival time between two consecutive packets arriving over a communication link is proportional to the size (in bits) of the first packet, and consequently to the time it will take to forward this packet over the next link. As was previously discussed, this model (interarrival proportional to previous service time) captures the effect of a rate control mechanism which "spaces" packets apart as a function of their size. In addition to the proportional dependency, it is very helpful to allow the addition of an independent random component (to the interarrival time) which models a variable delay (jitter) caused by travel through the network, or additional delays between packets imposed at the source. In what follows, we describe the three scenarios which are considered in this paper. The first and simplest one assumes deterministic proportional dependency with an additive interarrival random delay. The second scenario explores the effect of random proportional dependency without additive delay. The third is a combination of the first and the second scenarios, namely, random proportional dependency with additive interarrival random delay. Our system consists of a single-server queue with an infinite buffer. The service time of the n-th packet (n ≥ 1) is denoted by B_n and is assumed to be an independent exponentially distributed random variable (RV) with parameter μ. The interarrival time between the n-th and the (n + 1)-st packets is denoted by I_{n+1}. (Without loss of generality we assume that the first arrival occurs at time zero.) The workload in the system just before the n-th packet arrival is a RV denoted by W_n, with a probability density function (pdf) f_{W_n}(w) and LST W_n(s) = E[e^{−sW_n}].
We also define W(s) = lim_{n→∞} W_n(s) when the limit exists. The evolution of the workload at arrival epochs is given by

W_{n+1} = (W_n + B_n − I_{n+1})^+,   n ≥ 1   (1)

where X^+ = max(0, X) and W_1 is usually assumed to be zero. Obviously, W_n is also the waiting time of the n-th packet when packets are served according to the first-in-first-out (FIFO) order.


The evolution described in (1) is typical of G/G/1 queues and is very well known [18]. In particular, when B_n, I_{n+1}, n ≥ 1, are independent random variables and B_n is exponentially distributed, (1) describes the evolution of the workload in a GI/M/1 system. The feature which characterizes the queueing systems considered in this paper is that I_{n+1} is allowed to depend on B_n.
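As a concrete illustration, recursion (1) is easy to simulate. The sketch below is our own (it is not part of the paper): it iterates the recursion for the dependent-arrival model I_{n+1} = αB_n + J_n introduced in Section 2.2 below; the names alpha, mu, nu stand for the parameters α, μ, ν, and the example values are arbitrary.

```python
import random

def simulate_waits(alpha, mu, nu, n_customers, seed=1):
    """Iterate the Lindley recursion W_{n+1} = (W_n + B_n - I_{n+1})^+ of (1),
    with the dependent interarrival time I_{n+1} = alpha*B_n + J_n,
    where B_n ~ exp(mu) and J_n ~ exp(nu) are independent."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n_customers):
        waits.append(w)                        # waiting time of the current customer
        b = rng.expovariate(mu)                # service time B_n
        j = rng.expovariate(nu)                # independent jitter J_n
        w = max(0.0, w + b - (alpha * b + j))  # next customer's wait
    return waits

# Stability needs alpha + mu/nu > 1; for alpha = 0.5, mu = nu = 1 the closed
# form of Section 3 gives P(W = 0) = 1/2 and E[W] = 1/2.
waits = simulate_waits(0.5, 1.0, 1.0, 200_000)
print(sum(waits) / len(waits))
```

With these parameters the sample mean settles near the value predicted by the closed form of Section 3.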

2.2 Deterministic Proportional Dependency with Additive Delay

We first characterize the most basic system of deterministic proportional dependency without additive delay. In this system the interarrival time I_{n+1} is proportional to the previous service time of the n-th packet, B_n, with a (fixed) proportional dependency parameter α, i.e., I_{n+1} = αB_n. Therefore, the workload in the system just before the arrival epochs of packets evolves according to:

W_{n+1} = (W_n + (1 − α)B_n)^+   (2)

The solution of (2) when n → ∞ is trivial. We have

W = lim_{n→∞} W_n = 0 if α > 1;  W = W_1 if α = 1;  W = ∞ if α < 1   (3)

since the term (1 − α)B_n is always negative, zero, or positive, respectively. For example, the evolution of a system where α = 1 and W_1 = 0 is depicted in Fig. 1. Therefore, we introduce the simplest non-trivial system by adding an independent component to the interarrival time I_{n+1}, namely, I_{n+1} = αB_n + J_n, where J_n is an independent RV. Therefore,

W_{n+1} = (W_n + (1 − α)B_n − J_n)^+   (4)

It should be clear now that if J_n is positive then it must be that α < 1 to avoid a trivial zero solution in steady state. It is also clear that the system is stable if E[(1 − α)B_n − J_n] < 0. Consequently, if the expected value of J_n is 1/ν then the system is stable if and only if α + μ/ν > 1. In Section 3 we analyze the steady-state behavior of the system assuming that J_n is an independent exponentially distributed RV with parameter ν. In Appendix A we extend the results of Section 3 to the case where J_n can also take negative values.

It is also of interest to explicitly identify the resulting arrival process to this system. When I_{n+1} = αB_n + J_n is a sum of two independent exponentially distributed RVs with parameters μ/α and ν, respectively (independent of n), its LST can be expressed as [(μ/α)/(μ/α + s)]·[ν/(ν + s)]. Therefore the pdf of I_n is

f_{I_n}(t) = [(μ/α)ν/(ν − μ/α)] (e^{−(μ/α)t} − e^{−νt})

when ν ≠ μ/α, and f_{I_n}(t) = ν² t e^{−νt} otherwise.
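The two-exponential form of this density is easy to check numerically. The snippet below is our own sketch (not from the paper): it evaluates the stated pdf on a grid and verifies that it integrates to one and reproduces the mean E[I_n] = α/μ + 1/ν; the parameter values are arbitrary.

```python
import math

def interarrival_pdf(t, alpha, mu, nu):
    """pdf of I = alpha*B + J, B ~ exp(mu), J ~ exp(nu), assuming nu != mu/alpha."""
    a = mu / alpha
    return (a * nu / (nu - a)) * (math.exp(-a * t) - math.exp(-nu * t))

alpha, mu, nu = 0.5, 1.0, 3.0
dt = 1e-3
grid = [i * dt for i in range(1, 40_000)]   # truncate the (negligible) tail at t = 40
total = sum(interarrival_pdf(t, alpha, mu, nu) for t in grid) * dt
mean = sum(t * interarrival_pdf(t, alpha, mu, nu) for t in grid) * dt
print(total, mean)   # close to 1 and to alpha/mu + 1/nu
```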

2.3 Random Proportional Dependency

Let us reexamine the system without additive delay as described by (2). In order to avoid a trivial solution of the form of (3), we assume that the proportionality parameter itself is a RV different from a constant, namely,

W_{n+1} = (W_n + (1 − α_n)B_n)^+   (5)

Therefore, I_{n+1} = α_n B_n, where α_n is a RV with a finite support, independent of any other RV in the system. Specifically, we consider the case where α_n = α_i with probability a_i for 1 ≤ i ≤ N + M, for some integers N, M ≥ 1. Clearly, Σ_{i=1}^{N+M} a_i = 1. Furthermore, it is clear that in order to avoid trivial solutions not all the α_i are greater than or equal to 1, and also not all the α_i are equal to or smaller than 1. Without loss of generality assume that 1 < α_1 < α_2 < … < α_N and α_{N+1} < α_{N+2} < … < α_{N+M} ≤ 1. The stability condition for the system is E[(1 − α_n)B_n] < 0 and, therefore, the system is stable if and only if Σ_{i=1}^{N+M} a_i α_i > 1. In Section 4 we analyze the steady-state behavior of this system. The interarrival time I_{n+1} is again independent of n and with probability a_i is an exponentially distributed RV with parameter μ/α_i. Therefore, its LST is Σ_{i=1}^{N+M} a_i (μ/α_i)/(μ/α_i + s) and its pdf is Σ_{i=1}^{N+M} a_i (μ/α_i) e^{−(μ/α_i)t}.

Another possible extension of this model is to assume that the average service time is also random, e.g., with probability a_{ij}, α_n = α_i and μ = μ_j, making μ itself a RV. However, it turns out that this is equivalent to enlarging the support of the RV α_n, as shown in Section 4.


2.4 Random Proportional Dependency with Additive Delay

The third model we consider is simply a combination of the above two scenarios. Here we have I_{n+1} = α_n B_n + J_n, where α_n is a (non-constant) RV and J_n is an additive random jitter to the interarrival period. Hence

W_{n+1} = (W_n + (1 − α_n)B_n − J_n)^+   (6)

where α_n = α_i with probability a_i and J_n is exponentially distributed with parameter ν. We will assume the ordering of the various α_i as defined in the previous model. The stability condition here is Σ_{i=1}^{N+M} a_i α_i + μ/ν > 1. The analysis of this system is similar to that of the previous one and is given in Section 5. The interarrival time I_n is again independent of n and with probability a_i is a sum of two independent exponentially distributed RVs with parameters μ/α_i and ν, respectively. Therefore, its LST is Σ_{i=1}^{N+M} a_i [(μ/α_i)/(μ/α_i + s)]·[ν/(ν + s)] and its pdf is

f_{I_n}(t) = Σ_{i=1}^{N+M} a_i [(μ/α_i)ν/(ν − μ/α_i)] (e^{−(μ/α_i)t} − e^{−νt})

in case that ν ≠ μ/α_i for all 1 ≤ i ≤ N + M. If for some j (1 ≤ j ≤ N + M), ν = μ/α_j, then the j-th term in the sum is replaced by a_j ν² t e^{−νt}.

3 Deterministic Proportional Dependency with Additive Delay

The analysis of this system is relatively simple. We begin by expressing the LST W_{n+1}(s) as follows,

W_{n+1}(s) = E[e^{−s[W_n + (1−α)B_n − J_n]^+}]

where J_n is an exponentially distributed RV with parameter ν. Conditioning on W_n and B_n and integrating over J_n, we obtain

W_{n+1}(s) = ∫_0^∞ f_{W_n}(w) dw ∫_0^∞ μe^{−μx} dx [ ∫_0^{w+(1−α)x} νe^{−νy} e^{−s[w+(1−α)x−y]} dy + ∫_{w+(1−α)x}^∞ νe^{−νy} dy ]

= μν W_n(s) / [(ν − s)(μ + (1 − α)s)] − μs W_n(ν) / [(ν − s)(μ + (1 − α)ν)]   (7)

Rearranging (7) and letting n → ∞ we get:

W(s) = (1 + βs) W(ν) / {(1 + βν)[1 + β(s − ν)]}   (8)

where β = (1 − α)/μ. Note that the stability condition α + μ/ν > 1 (or βν < 1) ensures that the root of 1 + β(s − ν), at s = ν − μ/(1 − α), is negative, and therefore W(s) in (8) is an analytic function for Re(s) > 0, as required. Applying the normalization condition W(0) = 1 we get W(ν) = (1 − βν)(1 + βν) and hence,

W(s) = (1 − βν)(1 + βs) / [1 + β(s − ν)] = (1 − βν) + βν · [(1 − βν)/β] / [s + (1 − βν)/β]   (9)

The quantity W(∞) = 1 − βν is the probability that the waiting time just prior to an arrival in steady state is zero. It is easy to invert W(s) and to realize that with probability 1 − βν the waiting time is zero, and with probability βν it is exponentially distributed with parameter (1 − βν)/β = [μ − (1 − α)ν]/(1 − α). The waiting time in the system in steady state at the arrival instants thus behaves as the waiting time in an M/M/1 queue with arrival rate ν and service rate 1/β = μ/(1 − α). However, the arrival rate to the system is actually λ = 1/E[I_n] = μν/(μ + αν) and the service rate is clearly μ. Comparing our system with an M/M/1 system with parameters λ, μ, we observe that both systems have the same stability condition. However, in the M/M/1 system the term that governs the exponent (with a minus sign) of the pdf is μ − λ = μ[1 − ν/(μ + αν)]. In our system the term is [μ − (1 − α)ν]/(1 − α), which is larger when the stability condition holds. This implies that the tail probability decreases much faster in our system when compared to the corresponding M/M/1 system.

An extension to this system, where J_n can also take negative values (but the interarrival period is, of course, kept positive), appears in Appendix A.


4 Random Proportional Dependency

The analysis of this system is more involved than the previous one, and we use the spectral analysis method described in [18]. We begin by rewriting (5) as

W_{n+1} = (W_n + U_n)^+

where U_n = (1 − α_n)B_n and α_n is a RV that takes the values α_i with probability a_i for 1 ≤ i ≤ N + M, where 1 < α_1 < α_2 < … < α_N and α_{N+1} < α_{N+2} < … < α_{N+M} ≤ 1. Since B_n is exponentially distributed, the LST of U_n, denoted by U(s), is given by

U(s) = Σ_{i=1}^{N+M} a_i / (1 − β_i s)   (10)

where β_i = (α_i − 1)/μ. Following the "spectrum factorization" approach (see eqs. (8.35)-(8.36) in [18]), we need to factor U(s) − 1 as follows,

U(s) − 1 = Ψ+(s) / Ψ−(s)   (11)

where Ψ+(s) is an analytic function of s for Re(s) > 0 with no zeroes in this half-plane, and Ψ−(s) is an analytic function of s for Re(s) < D (D is a positive constant to be determined later) with no zeroes in this half-plane. Once we have the above factorization, we immediately obtain the LST of the waiting time, up to a constant (see eq. (8.41) in [18]), i.e.,

W(s) = L s / Ψ+(s)   (12)

and the constant L is determined from the normalization condition lim_{s→0} W(s) = 1. To form the factorization in (11) we write

U(s) − 1 = N(s) / V(s)

where from (10)

N(s) = Σ_{j=1}^{N+M} a_j Π_{i=1, i≠j}^{N+M} (1 − β_i s) − Π_{i=1}^{N+M} (1 − β_i s)

V(s) = Π_{i=1}^{N+M} (1 − β_i s)

and we have to study the location of the roots of the polynomials N(s) and V(s). Let K be the degree of these polynomials. Then if α_{N+M} ≠ 1 we have K = N + M, and if α_{N+M} = 1 we have K = N + M − 1. From the definition of V(s) it is easy to see that the roots of this polynomial are β_i^{−1}, 1 ≤ i ≤ K, and they are all distinct. Furthermore, for 1 ≤ i ≤ N these roots are positive, while for N + 1 ≤ i ≤ K they are negative. In the following theorem we determine the location of the roots of N(s).

Theorem 1: If the stability condition holds, i.e., Σ_{j=1}^{N+M} a_j α_j > 1, then N(s) has N − 1 distinct positive real roots, a single root at s = 0 and K − N distinct negative real roots.

Proof: For 1 ≤ k ≤ K define

σ_k = β_k^{−1} = μ/(α_k − 1)

Therefore,

σ_1 > σ_2 > … > σ_N > 0 > σ_{N+1} > σ_{N+2} > … > σ_K   (13)

Computing the polynomial N(s) at s = σ_k we obtain, for 1 ≤ k ≤ K,

N(σ_k) = a_k Π_{i=1, i≠k}^{K} (1 − β_i σ_k) = a_k Π_{i=1, i≠k}^{K} (1 − σ_k/σ_i)   (14)

The quantity N(σ_N) is positive since σ_i > σ_N > 0 for any 1 ≤ i ≤ N − 1. The quantity N(σ_{N−1}) is negative, as 1 − σ_{N−1}/σ_N is the only negative term in the above product. In fact, the sign of N(σ_k) for 1 ≤ k ≤ N is (−1)^{N−k}, and therefore N(σ_N), N(σ_{N−1}), …, N(σ_1) have alternating signs. Similarly, N(σ_{N+1}) is positive since σ_i < σ_{N+1} < 0 for any N + 2 ≤ i ≤ K. Again we see that the sign of N(σ_k) for N + 1 ≤ k ≤ K is (−1)^{k−N+1}, and therefore N(σ_{N+1}), N(σ_{N+2}), …, N(σ_K) have alternating signs. Consequently, there are at least N − 1 distinct positive real roots, one root at s = 0, and at least K − N − 1 distinct negative real roots. Since both N(σ_{N+1}) and N(σ_N) are positive and N(0) = 0, it is clear that the additional real root of N(s) is either between σ_{N+1} and 0 or between 0 and σ_N. This is true since N(s) is a polynomial of degree K. The location of that additional root is determined by computing the derivative of N(s) with respect to s at s = 0:

dN(s)/ds |_{s=0} = Σ_{i=1}^{N+M} β_i − Σ_{j=1}^{N+M} a_j Σ_{i=1, i≠j}^{N+M} β_i = Σ_{j=1}^{N+M} a_j β_j = (1/μ) Σ_{j=1}^{N+M} a_j (α_j − 1) > 0

where the last inequality follows from the stability condition Σ_{j=1}^{N+M} a_j α_j > 1.

Since the derivative is positive and N(σ_{N+1}) > 0, the additional root must be between σ_{N+1} and 0, and therefore N(s) has K − N negative roots and N − 1 positive roots, which were already proven to be real and distinct. This completes the proof of Theorem 1.

Let the distinct roots of N(s) be denoted by η_1, η_2, …, η_K, with η_1 > η_2 > … > η_K and η_N = 0. Note that from the proof of Theorem 1 we know that σ_{i+1} < η_i < σ_i for 1 ≤ i ≤ N − 1, that σ_{N+1} < η_{N+1} < 0, and that σ_i < η_i < σ_{i−1} for N + 2 ≤ i ≤ K. Therefore, β_j η_i ≠ 1 for 1 ≤ i, j ≤ K. From Theorem 1 we conclude that U(s) − 1 can be factorized as stated in (11) with

Ψ+(s) = Π_{i=N}^{K} (s − η_i) / Π_{i=N+1}^{K} (1 − β_i s)

Ψ−(s) = − Π_{i=1}^{N} (1 − β_i s) / [ Π_{i=1}^{K} (−β_i) · Π_{i=1}^{N−1} (s − η_i) ]

Since η_i < 0 and β_i < 0 for N + 1 ≤ i ≤ K, it follows that Ψ+(s) is analytic for Re(s) > 0 and has no zeroes in this half-plane, as required. Similarly, since σ_i > 0 for 1 ≤ i ≤ N and η_i > 0 for 1 ≤ i ≤ N − 1, it follows that Ψ−(s) is analytic for Re(s) < β_N^{−1} and has no zeroes in this half-plane (so D = β_N^{−1} in this case). Using (12) we obtain

W(s) = L Π_{i=N+1}^{K} (1 − β_i s) / Π_{i=N+1}^{K} (s − η_i)

and from the normalization condition we have that L = Π_{i=N+1}^{K} (−η_i). Note that L is also the probability that an arriving packet need not queue. In summary, the LST W(s) of the waiting time is given by

W(s) = [ Π_{i=N+1}^{K} (−η_i) ] · Π_{i=N+1}^{K} (1 − β_i s) / Π_{i=N+1}^{K} (s − η_i)   (15)

Note that it is very easy to invert the above LST to obtain the distribution of the waiting time. Taking the n-th derivative of W(s) with respect to s at s = 0 we obtain the n-th moment of the waiting time. In particular, the expected waiting time is given by

E[W] = Σ_{i=N+1}^{K} (β_i − η_i^{−1})

and the second moment is given by

E[W²] = Σ_{i=N+1}^{K} Σ_{j=N+1, j≠i}^{K} (β_i β_j − (η_i η_j)^{−1}) − 2 Σ_{i=N+1}^{K} η_i^{−1} · Σ_{i=N+1}^{K} (β_i − η_i^{−1})

Then, the variance is obtained from the standard formula,

Var(W) = E[W²] − (E[W])²   (16)

Remark 1: The computational effort in determining the LST W(s) lies in the determination of the roots η_i, N + 1 ≤ i ≤ K. However, since each root is known to be real, and since we know that σ_{N+1} < η_{N+1} < 0 and σ_i < η_i < σ_{i−1} for N + 2 ≤ i ≤ K, it is very easy to determine the roots with any simple search procedure.
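The search procedure of Remark 1 can be sketched as follows. This snippet is our own illustration (none of the function names come from the paper): it evaluates N(s), brackets each negative root η_i by bisection in the intervals quoted above, and computes E[W] from the moment formula. The two-point example (a_1 = a_2 = 0.5, α_1 = 2, α_2 = 0.5, μ = 1, N = M = 1) is ours; for it N(s) = s(0.25 + 0.5s), so η_2 = −0.5 and E[W] = β_2 − 1/η_2 = 1.5.

```python
def N_poly(s, a, beta):
    """N(s) = sum_j a_j prod_{i != j}(1 - beta_i s) - prod_i (1 - beta_i s)."""
    prods = [1.0] * len(a)
    total = 1.0
    for i, b in enumerate(beta):
        f = 1.0 - b * s
        total *= f
        for j in range(len(a)):
            if j != i:
                prods[j] *= f
    return sum(aj * pj for aj, pj in zip(a, prods)) - total

def bisect_root(f, lo, hi, iters=200):
    """Plain bisection, assuming f changes sign on (lo, hi)."""
    flo = f(lo)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (f(mid) > 0) == (flo > 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return 0.5 * (lo + hi)

def mean_wait(a, alpha, mu, n_pos):
    """E[W] = sum_{i=N+1}^{K}(beta_i - 1/eta_i); n_pos = N, assumes alpha[-1] != 1."""
    beta = [(al - 1.0) / mu for al in alpha]
    sigma = [1.0 / b for b in beta]
    K = len(alpha)
    f = lambda s: N_poly(s, a, beta)
    eps = 1e-9
    eta = [bisect_root(f, sigma[n_pos] + eps, -eps)]   # eta_{N+1} in (sigma_{N+1}, 0)
    for i in range(n_pos + 1, K):                      # eta_i in (sigma_i, sigma_{i-1})
        eta.append(bisect_root(f, sigma[i] + eps, sigma[i - 1] - eps))
    return sum(beta[i] - 1.0 / eta[i - n_pos] for i in range(n_pos, K))

print(mean_wait([0.5, 0.5], [2.0, 0.5], 1.0, 1))   # 1.5 for the two-point example
```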

Remark 2: The analysis above does not depend on the values of α_i and μ separately, but on the values and the signs of the parameters β_i = (α_i − 1)/μ. Consequently, it is easy to generalize our system to the case where μ itself is a RV, and with probability a_{ij} it takes the value μ_j while α_n takes the value α_i. This is equivalent to the case where with probability a_{ij} we define β_{ij} = (α_i − 1)/μ_j and adjust (10) such that the sum is taken over all values β_{ij}.

Remark 3: Let W = lim_{n→∞} W_n and U = lim_{n→∞} U_n. For a G/G/1 system with ρ → 1 (but ρ remains strictly less than one, preserving stability) one can show that (see [19])

Pr[W ≤ w] ≈ 1 − exp{ 2E[U] w / E[U²] }   (17)

Recall that U = B − I, where B, I are the generic r.v.s of the service time and the interarrival time, respectively. Denote by W^{ind} the generic r.v. of the waiting time in an equivalent GI/G/1 system (in which the r.v.s B and I are independent). From (17), W ≤_{st} W^{ind} (where A ≤_{st} B means that A is stochastically smaller than B) if and only if E[B·I] − E[B]E[I] ≥ 0. That is, in heavy traffic, the waiting time in a G/G/1 system with service and interarrival times positively correlated is stochastically smaller than in the equivalent GI/G/1 system. We conjecture that this fact holds not only in heavy traffic but for any load. For our system with random proportional dependency we have E[B·I] − E[B]E[I] = (1/μ²) Σ_{i=1}^{K} a_i α_i, and in the following numerical examples we show that the average and the variance of the waiting time in our system are smaller than in the equivalent GI/M/1 system for all considered loads. An upper bound on the average wait for a GI/G/1 system was developed in [20]. Using the same techniques we obtain an upper bound for a G/G/1 system (and hence for our system with random proportional dependency). We have

E[W] ≤ Var(U)/(−2E[U]) = [2 Σ_{i=1}^{K} a_i (1 − α_i)² − (Σ_{i=1}^{K} a_i (1 − α_i))²] / [2μ Σ_{i=1}^{K} a_i (α_i − 1)]   (18)

The following lower bound on the average wait was developed in [20],

E[W] ≥ (E[U^+])² / (−2E[U]) = [Σ_{i=N+1}^{K} a_i (1 − α_i)]² / [2μ Σ_{i=1}^{K} a_i (α_i − 1)]   (19)

Another lower bound on the average wait for our system can be obtained using similar techniques as exercise 2.8 in [21],

E[W] ≥ Var(U)/(−2E[U]) − E[U]/2 + β_{N+1} = [2 Σ_{i=1}^{K} a_i (1 − α_i)² − (Σ_{i=1}^{K} a_i (1 − α_i))²] / [2μ Σ_{i=1}^{K} a_i (α_i − 1)] + Σ_{i=1}^{K} a_i (α_i − 1)/(2μ) − (1 − α_{N+1})/μ   (20)

and when (1 − α_{N+1})/μ is small this is a tight bound (see eq. (18)). In Figs. 2-5 we depict the average and the variance of the waiting time of a system with random proportional dependency. For comparison purposes, we also depict the same quantities in an equivalent GI/M/1 system in which the service time is exponentially distributed with parameter μ, and the interarrival times are independent and sampled from a probability distribution whose LST is A(s) = Σ_{i=1}^{N+M} a_i (μ/α_i)/(μ/α_i + s). Namely, with probability a_i, the interarrival time is exponentially distributed with parameter μ/α_i. Recall that the analysis of this GI/M/1 system requires the determination of a single root σ, between 0 and 1, of the equation σ = A(μ − μσ). In all figures we consider a system with N = 3, M = 2 and a_i = 0.2, 1 ≤ i ≤ 5. In Figs. 2 and 3 we use α_1 = 1.4, α_2 = 1.6, α_4 = 0.1, α_5 = 0.2, and we depict the average and the variance of the waiting time versus the largest proportional parameter α_3. As expected, both quantities decrease with increasing α_3. In Figs. 4 and 5 we keep Σ_{i=1}^{N+M} a_i α_i constant. In particular, we use α_1 = 1.2, α_2 = 1.3, α_4 = 0.1, and α_3 + α_5 is kept constant. The average and the variance of the waiting time are depicted as a function of the largest proportional parameter α_3. It is interesting to note that although α_3 + α_5 is kept constant, both the average and the variance increase with increasing α_3 (and decreasing α_5). This implies that decreasing α_5 has a more pronounced effect on the performance of the queueing system. The reason is that increasing α_3 and decreasing α_5, while keeping their sum constant, increases the variability of the arrival process, and hence increases the average and the variance of the waiting time, as shown in Figs. 4 and 5. In all cases we observe that the equivalent GI/M/1 system exhibits much larger averages and variances of the waiting time. This implies that correlations between service times and interarrival times have a smoothing effect on the system.
This has also been observed in [1, 2, 6, 7, 8] for different types of correlations.
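The single root σ of σ = A(μ − μσ) for the equivalent GI/M/1 system can be obtained by a simple fixed-point iteration, after which the mean wait follows from the standard GI/M/1 formula E[W] = σ/(μ(1 − σ)). The snippet below is our own sketch (not from the paper); the two-point example parameters are ours.

```python
def gi_m1_mean_wait(a, alpha, mu, iters=200):
    """Equivalent GI/M/1 system: interarrival LST
    A(s) = sum_i a_i (mu/alpha_i)/(mu/alpha_i + s).
    Solve sigma = A(mu - mu*sigma) by fixed-point iteration (monotone for a
    stable queue), then use E[W] = sigma/(mu*(1 - sigma))."""
    A = lambda s: sum(ai * (mu / al) / (mu / al + s) for ai, al in zip(a, alpha))
    sigma = 0.0
    for _ in range(iters):
        sigma = A(mu - mu * sigma)
    return sigma / (mu * (1.0 - sigma))

# Two-point example of ours: a = (0.5, 0.5), alpha = (2, 0.5), mu = 1.
print(gi_m1_mean_wait([0.5, 0.5], [2.0, 0.5], 1.0))
```

For this example the GI/M/1 mean wait comes out to about 5.6, several times larger than the mean wait of 1.5 in the corresponding dependent system, in line with the comparisons reported in Figs. 2-5.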

[Figure 2: Average waiting time versus largest proportional parameter α_3: a_i = 0.2, 1 ≤ i ≤ 5; α_1 = 1.4, α_2 = 1.6, α_4 = 0.1, α_5 = 0.2. Dashed curve: equivalent GI/M/1 system; solid curve: proportional dependency.]

[Figure 3: Variance of waiting time versus largest proportional parameter α_3: a_i = 0.2, 1 ≤ i ≤ 5; α_1 = 1.4, α_2 = 1.6, α_4 = 0.1, α_5 = 0.2. Dashed curve: equivalent GI/M/1 system; solid curve: proportional dependency.]

[Figure 4: Average waiting time versus largest proportional parameter α_3: a_i = 0.2, 1 ≤ i ≤ 5; α_1 = 1.2, α_2 = 1.3, α_4 = 0.1, α_3 + α_5 = 2.6. Dashed curve: equivalent GI/M/1 system; solid curve: proportional dependency.]

[Figure 5: Variance of waiting time versus largest proportional parameter α_3: a_i = 0.2, 1 ≤ i ≤ 5; α_1 = 1.2, α_2 = 1.3, α_4 = 0.1, α_3 + α_5 = 2.6. Dashed curve: equivalent GI/M/1 system; solid curve: proportional dependency.]

5 Random Proportional Dependency with Additive Delay

We now combine the two models of Sections 3 and 4. As before, we start by rewriting (6) as W_{n+1} = (W_n + U_n)^+, where U_n = (1 − α_n)B_n − J_n, α_n is defined in Section 4 and J_n is an independent exponentially distributed RV with parameter ν. The LST of U_n, denoted by U(s), is given in this case by (recall that β_i = (α_i − 1)/μ)

U(s) = [ν/(ν − s)] Σ_{i=1}^{N+M} a_i / (1 − β_i s)   (21)

We now need to factorize U(s) − 1 as in (11) and then to obtain the LST of the waiting time as in (12). To that end we write

U(s) − 1 = N(s) / V(s)

where from (21)

N(s) = ν Σ_{j=1}^{N+M} a_j Π_{i=1, i≠j}^{N+M} (1 − β_i s) − (ν − s) Π_{i=1}^{N+M} (1 − β_i s)

V(s) = (ν − s) Π_{i=1}^{N+M} (1 − β_i s)

and we have to study the location of the roots of the polynomials N(s) and V(s). Let K be the degree of these polynomials. Then if α_{N+M} ≠ 1 we have K = N + M + 1, and if α_{N+M} = 1 we have K = N + M. From the definition of V(s) it is easy to see that the roots of this polynomial are β_i^{−1}, 1 ≤ i ≤ K − 1, and ν. Furthermore, for 1 ≤ i ≤ N the roots β_i^{−1} are positive, while for N + 1 ≤ i ≤ K − 1 the roots β_i^{−1} are negative. The root at ν is also positive. In the following theorem we determine the location of the roots of N(s).

Theorem 2: If the stability condition holds, i.e., Σ_{j=1}^{N+M} a_j α_j + μ/ν > 1, then N(s) has N distinct positive real roots, a single root at s = 0 and K − N − 1 distinct negative real roots.

Proof: We can apply exactly the same technique as in the proof of Theorem 1 to show the existence of at least N - 1 distinct positive real roots, a single root at s = 0, and at least K - N - 2 distinct negative real roots of the polynomial N(s), since the last term of N(s) becomes zero when s = ζ_k (recall that ζ_k = β_k^{-1}). Similarly, the location of another negative real root between ζ_{N+1} and 0 can be proved, since the derivative of N(s) is positive at s = 0 when the stability condition holds. Therefore, we only need to show that there exists another positive root of N(s) which is different from the ones exhibited before. Recall that the sign of N(ζ_k) is (-1)^{N-k} for 1 ≤ k ≤ N. This is true here since the sign of N(ζ_k) is determined only by the sign of the first term of N(s), namely λ ∑_{j=1}^{N+M} a_j ∏_{i=1, i≠j}^{N+M} (1 - β_i s), since the last term, (λ - s) ∏_{i=1}^{N+M} (1 - β_i s), vanishes at all these points. Therefore, the sign of N(ζ_1) (where ζ_1 is the largest) is (-1)^{N-1}. Next, we explore the sign of N(s) as s → ∞. By rewriting N(s) as

\[
N(s) = \prod_{i=1}^{N+M} (1 - \beta_i s) \left[ s - \lambda + \sum_{j=1}^{N+M} \frac{a_j \lambda}{1 - \beta_j s} \right]
\]

it becomes clear that the sign of N(s) as s → ∞ is (-1)^N, since β_i < 0 for N + 1 ≤ i ≤ K - 1 and β_i > 0 for 1 ≤ i ≤ N. Therefore, there must be another positive real root of N(s) between ζ_1 and ∞. There is only one root in this region since the total number of roots of N(s) is K.

Let the distinct roots of N(s) be denoted by σ_1, σ_2, ..., σ_K, with σ_1 > σ_2 > ... > σ_K and σ_{N+1} = 0. Note that from the proof of Theorem 2 we know that β_i σ_j ≠ 1 for 1 ≤ i ≤ K - 1 and 1 ≤ j ≤ K. From Theorem 2 we conclude that U(s) - 1 can be factorized as stated in (11) with

\[
U^+(s) = \frac{\prod_{i=N+1}^{K} (s - \sigma_i)}{\prod_{i=N+1}^{K-1} (1 - \beta_i s)}
\]

\[
U^-(s) = -\,\frac{(\lambda - s) \prod_{i=1}^{N} (1 - \beta_i s)}{\left( \prod_{i=1}^{K-1} (-\beta_i) \right) \prod_{i=1}^{N} (s - \sigma_i)}
\]

Using (12) we obtain

\[
W(s) = L \, \frac{\prod_{i=N+1}^{K-1} (1 - \beta_i s)}{\prod_{i=N+2}^{K} (s - \sigma_i)} \tag{22}
\]

and from the normalization condition we have that L = ∏_{i=N+2}^{K} (-σ_i). Note that L is also the probability that an arriving packet need not queue. In summary, the LST W(s) of the waiting time is given by

\[
W(s) = \left( \prod_{i=N+2}^{K} (-\sigma_i) \right) \frac{\prod_{i=N+1}^{K-1} (1 - \beta_i s)}{\prod_{i=N+2}^{K} (s - \sigma_i)} \tag{23}
\]

Note that this expression is very similar to (15), except that the locations of the roots σ_i will be different in this case. Taking the corresponding derivatives we obtain the expected waiting time

\[
E[W] = \sum_{i=N+1}^{K-1} \beta_i \;-\; \sum_{i=N+2}^{K} \frac{1}{\sigma_i}
\]

and the second moment

\[
E[W^2] = \left( E[W] \right)^2 \;-\; \sum_{i=N+1}^{K-1} \beta_i^2 \;+\; \sum_{i=N+2}^{K} \frac{1}{\sigma_i^2}
\]

Both expressions follow by differentiating (23) at s = 0.
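The closed-form moments can be cross-checked against numerical derivatives of the LST (23). The sketch below uses illustrative parameter values (not taken from the paper): it recomputes the roots σ_i, assembles W(s) as in (23), and compares the closed-form E[W] and E[W^2] with finite differences at s = 0.

```python
# Cross-check the closed-form E[W] and E[W^2] against finite differences of (23).
# A sketch: the parameters mu, lam, alphas, probs are illustrative assumptions.
import functools
import numpy as np
from numpy.polynomial import polynomial as P

mu, lam = 2.0, 1.0
alphas = np.array([1.6, 1.3, 0.5, 0.4])   # N = 2 values above 1, M = 2 below 1
probs  = np.array([0.25, 0.25, 0.25, 0.25])
betas  = (alphas - 1.0) / mu
N = 2
K = len(alphas) + 1                        # K = N + M + 1

lin  = [np.array([1.0, -b]) for b in betas]
prod = lambda fs: functools.reduce(P.polymul, fs, np.array([1.0]))

# N(s) = lam * sum_j a_j prod_{i != j}(1 - beta_i s) - (lam - s) prod_i(1 - beta_i s)
N_poly = -P.polymul(np.array([lam, -1.0]), prod(lin))
for j, a in enumerate(probs):
    N_poly = P.polyadd(N_poly, lam * a * prod(lin[:j] + lin[j + 1:]))

sigma = np.sort(P.polyroots(N_poly).real)[::-1]   # sigma_1 > ... > sigma_K
neg   = sigma[N + 1:]                             # sigma_{N+2}, ..., sigma_K (negative)
bneg  = betas[N:]                                 # beta_{N+1}, ..., beta_{K-1} (negative)

L = np.prod(-neg)                                 # normalization constant of (23)

def W(s):
    """The waiting-time LST, equation (23)."""
    return L * np.prod(1.0 - bneg * s) / np.prod(s - neg)

EW  = bneg.sum() - (1.0 / neg).sum()              # closed-form mean
EW2 = EW ** 2 - (bneg ** 2).sum() + (1.0 / neg ** 2).sum()

h = 1e-4
EW_fd  = -(W(h) - W(-h)) / (2 * h)                # numerical first derivative at 0
EW2_fd = (W(h) - 2 * W(0.0) + W(-h)) / h ** 2     # numerical second derivative at 0
```

For this parameter set the closed-form moments agree with the finite-difference values, and W(0) = 1 confirms the normalization.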

6 Summary

This paper presents the analysis of the customer waiting time in systems where the interarrival time I_{n+1} between customers n and n+1 depends on the service time B_n of customer n through a proportionality relation, and B_n is an exponentially distributed random variable.

We considered different scenarios of increasing complexity and provided efficient computational methods for their analysis. The general conclusion, which confirms the initial intuition, is that such correlations have a smoothing effect on the behavior of the queueing systems; this reinforces the general consensus that independence assumptions are usually pessimistic in practical environments. Our motivation for investigating such systems originated from rate control policies which are popular in new high-speed network architectures. However, the models developed in the paper and the associated solutions are of general interest and potentially applicable to other environments. Extensions of these models to service times with general distributions and to proportionality parameters that are continuous random variables are currently under study.

References

[1] B. W. Conolly. The waiting time process for a certain correlated queue. Operations Research, 16:1006-1015, 1968.
[2] B. W. Conolly and N. Hadidi. A comparison of the operational features of conventional queues with a self-regulating system. Appl. Stat., 18:41-53, 1969.
[3] O. J. Boxma. On a tandem queueing model with identical service times at both counters, Parts I and II. Adv. Appl. Prob., 11:616-659, 1979.
[4] N. Hadidi. Further results on queues with partial correlation. Operations Research, 33:203-209, 1985.
[5] K. W. Fendick, V. R. Saksena, and W. Whitt. Dependence in packet queues. IEEE Trans. Commun., COM-37(11):1173-1183, November 1989.
[6] I. Cidon, R. Guerin, A. Khamisy, and M. Sidi. Analysis of a correlated queue in communication systems. In Proc. INFOCOM'93, pp. 209-216, San Francisco, March 1993. (See also IBM Research Report RC 17585, IBM, January 1992.)
[7] B. W. Conolly and N. Hadidi. A correlated queue. J. Appl. Prob., 6:122-136, 1969.


[8] B. W. Conolly and Q. H. Choo. The waiting time process for a generalized correlated queue with exponential demand and service. SIAM J. Appl. Math., 37(2):263-275, October 1979.
[9] N. Hadidi. Queues with partial correlation. SIAM J. Appl. Math., 40(3):467-475, June 1981.
[10] C. Langaris. A correlated queue with infinitely many servers. J. Appl. Prob., 23:155-165, 1986.
[11] C. Langaris. Busy-period analysis of a correlated queue with exponential demand and service. J. Appl. Prob., 24:476-485, 1987.
[12] R. M. Loynes. The stability of a queue with non-independent inter-arrival and service times. Proc. Cambridge Philos. Soc., 58(3):497-520, 1962.
[13] S. Ghahramani and R. W. Wolff. A new proof of finite moment conditions for GI/G/1 busy periods. Queueing Systems Theory Appl. (QUESTA), 4(2):171-178, June 1989.
[14] K. Bala, I. Cidon, and K. Sohraby. Congestion control for high-speed packet switch. In Proc. INFOCOM'90, San Francisco, 1990.
[15] I. Cidon, I. Gopal, and R. Guerin. Bandwidth management and congestion control in plaNET. IEEE Commun. Mag., 29(10):54-63, October 1991.
[16] A. I. Elwalid and D. Mitra. Analysis and design of rate-based congestion control of high-speed networks, I: Stochastic fluid models, access regulation. QUESTA, 9:29-64, September 1991.
[17] R. Guerin and L. Gun. A unified approach to bandwidth allocation and access control in fast packet-switched networks. In Proc. INFOCOM'92, pp. 1-12, Florence, Italy, 1992.
[18] L. Kleinrock. Queueing Systems, Volume 1: Theory. John Wiley & Sons, New York, 1975.
[19] J. F. C. Kingman. On queues in heavy traffic. Journal of the Royal Statistical Society, 24:383-392, 1962.
[20] J. F. C. Kingman. Some inequalities for the queue GI/G/1. Biometrika, 49:315-324, 1962.

[21] L. Kleinrock. Queueing Systems, Volume 2: Computer Applications. John Wiley & Sons, New York, 1976.

APPENDIX A

Proportional Dependency with Additive and Subtractive Delay

In this Appendix we extend the results of Section 3 to the case where the packet delay jitter, caused by previously traveling through the network, can take both positive and negative values. This means that packets can arrive sooner or later than expected based on the service time alone. However, since most networks preserve a first-come-first-served ordering, we do not allow the interarrival time to be negative. Since in our case I_{n+1} = αB_n + J_n, the LST W_{n+1}(s) can be expressed as

\[
W_{n+1}(s) = E\!\left[ e^{-s\left[ W_n + B_n - (\alpha B_n + J_n)^+ \right]^+} \right]
\]

where α < 1 and J_n is a random variable to be defined. As long as J_n takes only positive values, the term (αB_n + J_n) is always positive and hence identical to (αB_n + J_n)^+. Here we allow J_n to take negative values as well (but I_{n+1} is kept non-negative). To that end we assume that, with probability a, J_n is an additive exponentially distributed RV (denoted by Ĵ_n) with parameter λ (as in Section 3), and with probability (1 - a) a subtractive exponentially distributed RV with parameter δ (denoted by J̃_n). Note that since α < 1, we have [W_n + B_n - (αB_n - J̃_n)^+]^+ = W_n + B_n - (αB_n - J̃_n)^+. Hence,

\[
W_{n+1}(s) = a\,E\!\left[ e^{-s\left[ W_n + (1-\alpha)B_n - \hat J_n \right]^+} \right]
+ (1-a)\,E\!\left[ e^{-s\left[ W_n + B_n - (\alpha B_n - \tilde J_n)^+ \right]} \right] \tag{24}
\]

The second expectation is computed by conditioning on B_n and J̃_n:

\[
E\!\left[ e^{-s\left[ W_n + B_n - (\alpha B_n - \tilde J_n)^+ \right]} \right]
= \int_0^\infty f_{W_n}(w)\,dw \int_0^\infty \mu e^{-\mu x}\,dx
\left[ \int_0^{\alpha x} \delta e^{-\delta y}\, e^{-s[w + (1-\alpha)x + y]}\,dy
+ \int_{\alpha x}^{\infty} \delta e^{-\delta y}\, e^{-s(w+x)}\,dy \right]
\]
\[
= \frac{\delta\mu}{(\delta+s)(\mu+(1-\alpha)s)}\, W_n(s)
- \frac{\delta\mu}{(\delta+s)(\mu+\alpha\delta+s)}\, W_n(s)
+ \frac{\mu}{\mu+\alpha\delta+s}\, W_n(s)
\]
\[
= \frac{\mu s (1+\beta s) + \delta(\mu+\alpha\delta+s)}{(\delta+s)(\mu+\alpha\delta+s)(1+\beta s)}\, W_n(s) \tag{25}
\]

where β = (1 - α)/μ. Applying (25) and (7) in (24) and letting n → ∞ we get

\[
W(s) = \frac{a s\, W(\lambda)}{(s-\lambda)(1+\beta\lambda)}
- \frac{a\lambda}{(s-\lambda)(1+\beta s)}\, W(s)
+ (1-a)\, \frac{\mu s(1+\beta s) + \delta(\mu+\alpha\delta+s)}{(\delta+s)(\mu+\alpha\delta+s)(1+\beta s)}\, W(s) \tag{26}
\]

Rearranging (26) and applying the normalization condition (W(0) = 1) we get

\[
W(s) = \frac{a\, W(\lambda)\, (1+\beta s)(\mu+\alpha\delta+s)}
{(1+\beta\lambda)\left[ (\mu+\alpha\delta)(a-\beta\lambda) - (1-a)\alpha\lambda
+ \left( 1 + a - a\alpha + \alpha\beta\delta - \beta\lambda \right) s + \beta s^2 \right]} \tag{27}
\]

with

\[
W(\lambda) = \frac{(1+\beta\lambda)\left[ a(1-\beta\lambda)(\mu+\alpha\delta) - (1-a)\lambda(1+\alpha\beta\delta) \right]}{a(\mu+\alpha\delta)}
\]

The stability condition is derived from E[B_n - (αB_n + J_n)^+] < 0 and is β - a/λ + (1 - a)(1/δ - μ/[δ(μ + αδ)]) < 0. It can be verified that this condition is equivalent to W(λ) > 0.
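The closed form (27), the constant W(λ), and the sign equivalence between the drift condition and W(λ) > 0 can be checked numerically against the fixed-point equation (26). The sketch below uses illustrative parameter values (a, α, μ, λ, δ are assumptions of this example, not taken from the paper), and the quadratic-denominator grouping of W(s) in the code is the one written above:

```python
# Verify that (27) satisfies the fixed-point equation (26), that W(0) = 1,
# and that the drift condition agrees in sign with W(lam) > 0.
# A sketch: the parameters a, alpha, mu, lam, delta are illustrative assumptions.
a, alpha, mu, lam, delta = 0.5, 0.6, 2.0, 1.0, 1.5
beta = (1.0 - alpha) / mu
m = mu + alpha * delta                    # shorthand for mu + alpha*delta

# drift E[B_n - (alpha*B_n + J_n)^+]; stability requires drift < 0
drift = beta - a / lam + (1.0 - a) * (1.0 / delta - mu / (delta * m))

# W(lam) from the closed form accompanying (27)
W_lam = ((1.0 + beta * lam)
         * (a * (1.0 - beta * lam) * m - (1.0 - a) * lam * (1.0 + alpha * beta * delta))
         / (a * m))

def W(s):
    """The waiting-time LST of equation (27)."""
    D0 = m * (a - beta * lam) - (1.0 - a) * alpha * lam
    D1 = 1.0 + a - a * alpha + alpha * beta * delta - beta * lam
    return (a * W_lam * (1.0 + beta * s) * (m + s)
            / ((1.0 + beta * lam) * (D0 + D1 * s + beta * s ** 2)))

def rhs(s):
    """Right-hand side of the fixed-point equation (26)."""
    return (a * s * W_lam / ((s - lam) * (1.0 + beta * lam))
            - a * lam * W(s) / ((s - lam) * (1.0 + beta * s))
            + (1.0 - a) * W(s)
              * (mu * s * (1.0 + beta * s) + delta * (m + s))
              / ((delta + s) * (m + s) * (1.0 + beta * s)))
```

For this (stable) parameter set, W(s) and rhs(s) coincide away from s = λ, and W evaluated at s = λ reproduces the constant W(λ), as it must for (26) to be self-consistent.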
