Non-equilibrium thermodynamics of processes in computer systems

Franciszek Grabowski1, Dominik Strzałka1

1 Rzeszow University of Technology, Department of Distributed Systems, ul. W. Pola 2, 35-959 Rzeszow, Poland {fgrab, strzalka}@prz.rzeszow.pl

Abstract. This paper presents a preliminary analysis of the behavior of systems that work far from thermodynamical equilibrium in an environment with limited resources. Real computer systems are examples of such systems. Nowadays the information flow in such systems has a very turbulent character, in contradiction to the existing models. These systems work under constant overload, which from the thermodynamical point of view means permanent thermodynamical non-equilibrium. The classical approach to their modeling is still based on Boltzmann-Gibbs (BG) thermodynamics, which is appropriate only for systems that are in the thermodynamical equilibrium (thermostatic) state or very close to it. The changing number N of tasks in such systems and the limited resources K of the environment cause chaotic behavior and generate processes with long-term dependencies. Such processes lead to the degradation of the system performance X and elongate the response time R; in other words, they degrade the Quality of Service (QoS). To understand the whole behavior of such systems one needs a proper thermodynamical basis, which seems to be the Tsallis formula of non-extensive entropy.

1 Introduction

The mechanistic paradigm that still governs science comes from the ancient Roman rule "divide and conquer". This concerns computer science as well, because algorithms, programming, operating systems, hardware, etc. are still treated separately. Such an approach assumes the simple-system paradigm with short-range dependencies. Meanwhile, one of the most significant features of real systems is long-range dependence in time. Generally, the notion of a system can be interpreted as a structure consisting of a large number of interdependent elements. This means that the nature of complex things cannot be reduced to the sum of simpler or more fundamental things, because the interdependencies of system elements frequently have properties that are long-range in the time and space domains. Moreover, for such systems the existing assumption about the thermodynamical basis is not proper, because their states are not close to thermodynamical equilibrium, i.e. with low or no entropy production (Boltzmann-Gibbs thermodynamics). Such a situation would mean that the system

works in an equilibrium state without congestions, long-term dependencies, power laws, etc. [1]; in other words, it is a simple system. Such systems do exist in the surrounding environment, but they are an exception rather than the rule. In most cases we deal with complex systems, and their analysis requires the system approach proposed by von Bertalanffy [2]. This approach considers the system behavior as a whole, analyzing it in the time and space domains under the influence of long-term dependencies (statistical self-similarity) [3] and the complex nature of its structure (the "small world" property) [4]. But without an appropriate thermodynamical basis such an analysis is still incomplete. The goal of this paper is to propose a new thermodynamical model of computer systems with limited resources. The article consists of five sections. Section 2 introduces the non-extensive definition of entropy. Section 3 concerns self-organization in computer systems. Section 4 presents some non-extensive properties of the Verhulst model. Section 5 concludes the paper.

2 Non-extensive entropy

A new definition of entropy appeared in 1988 [5]. Constantino Tsallis gave a proposition that was, at that time, revolutionary, because he stated that entropy need not be an extensive measure. He used a new definition of the exponential function (called the q-exponent):

$$
e_q^x =
\begin{cases}
[1 + (1-q)x]^{1/(1-q)} & \text{when } 1 + (1-q)x > 0 \\
0 & \text{otherwise}
\end{cases}
\qquad x, q \in \mathbb{R}
\tag{1}
$$

with the inverse logarithm function (called the q-logarithm):

$$
\ln_q x = \frac{x^{1-q} - 1}{1-q} \qquad x \in \mathbb{R}
\tag{2}
$$

to formulate a new definition of non-extensive (non-additive) entropy1:

$$
S_q(p_i = 1/W,\ \forall i) = k \ln_q W = k\,\frac{W^{1-q} - 1}{1-q}
\tag{3}
$$

where W is the number of microstates, p_i is the probability of being in microstate i, k is a positive constant and q ∈ R. In the special case q = 1 the definition (3) becomes the classical Boltzmann-Gibbs definition of entropy [5] (i.e. ln_q(x) = ln(x) when q = 1). There are many interesting properties of this entropy, but the most important one is non-extensivity [8]. This means that when one considers two statistically independent systems A and B, their joint entropy can be superextensive (q < 1), extensive (q = 1) or subextensive (q > 1), i.e.:

1 To be more precise: since q ∈ R, one has an infinite number of entropy definitions, each described by the single parameter q.

$$
S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B)
\tag{4}
$$

This definition of entropy gives a basis for a new statistical mechanics which, in contradiction to traditional BG statistics, assumes that $\sum_i p_i^q \neq 1$ when q ≠ 1. This means that one does not have a full description of the system dynamics, which can exhibit, for example, chaotic or long-range dependent behavior (the additivity property of entropy disappears in the system). Nowadays the behavior of such systems cannot be fully described, because it is very complex, but surprisingly it turns out to be governed by simple isomorphic laws [2].
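The q-deformed functions (1)-(2), the entropy (3) and the pseudo-additivity rule (4) are easy to check numerically. The sketch below (in Python; the function names are ours, k = 1) verifies that the q-logarithm inverts the q-exponent, that q → 1 recovers the Boltzmann-Gibbs entropy k ln W, and that the joint entropy of two independent systems obeys (4):

```python
import math

def q_exp(x, q):
    """q-exponential, eq. (1): [1+(1-q)x]^(1/(1-q)) when the base is positive, else 0."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)  # q -> 1 recovers the ordinary exponential
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_log(x, q):
    """q-logarithm, eq. (2): (x^(1-q) - 1)/(1-q); the inverse of q_exp for x > 0."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def S_q(W, q, k=1.0):
    """Tsallis entropy, eq. (3), for W equiprobable microstates."""
    return k * q_log(W, q)

# The q-logarithm inverts the q-exponent:
print(round(q_log(q_exp(2.0, 0.5), 0.5), 10))  # 2.0

# As q -> 1, S_q approaches the Boltzmann-Gibbs entropy k*ln(W):
print(abs(S_q(100, 0.9999) - math.log(100)) < 0.01)  # True

# Pseudo-additivity, eq. (4): for independent A and B the joint number of
# microstates is W_A * W_B, and S_q(A+B) = S_A + S_B + (1-q)*S_A*S_B.
q, WA, WB = 0.7, 8, 16
lhs = S_q(WA * WB, q)
rhs = S_q(WA, q) + S_q(WB, q) + (1.0 - q) * S_q(WA, q) * S_q(WB, q)
print(abs(lhs - rhs) < 1e-9)  # True
```

The last check illustrates the superextensive case: for q < 1 the correction term (1 − q)S_q(A)S_q(B) is positive, so S_q(A+B) > S_q(A) + S_q(B).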

3 Self-organization in computer systems

To consider the limitations of computer systems scalability, let us take into account the simplest Malthus and Verhulst models, represented by the Taylor series expansion [2]:

$$
f(N) = \sum_{n=0}^{\infty} a_n N^n = a_0 + a_1 N + a_2 N^2 + a_3 N^3 + \cdots
\tag{5}
$$

where N is the number of tasks in the system and a_n denote constants. If one takes into account the first two terms of (5) and assumes that a_0 = 0 and a_1 = r, one gets the simplest Malthus model of task growth under unlimited resources of the environment:

$$
X_n(N) = dN(t)/dt = rN(t)
\tag{6}
$$

where X_n is the performance of the system and r is a growth (or drop) rate. Equation (6) corresponds to the optimistic case in the analysis of the asymptotic system performance [6]:

$$
X_{\max}(N) = N/(D+Z)
\tag{7}
$$

where D is the total service time in the system and Z is the user think time. For a system with unlimited resources the rate r is constant:

$$
r(N) = 1/(D+Z) = X/N = \mathrm{const}
\tag{8}
$$

This means that the system is insensitive to congestion and during normal work stays in a stable equilibrium state. For example, in such a system the information flow has a laminar character. But in a real environment resources are always limited. To take this property into account, the Verhulst model was introduced with the term (1 − N/K), where K denotes the limited system resources. In equation (5) this means that one takes into account three terms, a_0 = 0, a_1 = r and a_2 = −r/K:

$$
X = dN/dt = rN(1 - N/K)
\tag{9}
$$

When N/K → 0, one has laminar flow; the term −rN²/K describes the degradation component of the laminar flow, and when N/K → 1, then X → 0. All these situations are shown in Fig. 1.
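The throughput described by eq. (9) can be illustrated with a minimal sketch; the values of r and K below are illustrative, not taken from the paper:

```python
# Verhulst throughput of eq. (9): X(N) = r*N*(1 - N/K).
def throughput(N, r=1.0, K=100.0):
    # rN is the laminar component; -r*N^2/K is the degradation component
    return r * N * (1.0 - N / K)

for N in (1, 25, 50, 75, 99):
    print(N, throughput(N))
```

Near N/K → 0 the flow is almost laminar (X ≈ rN); near N/K → 1 the degradation term dominates and X → 0, with the curve symmetric around N = K/2.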

When N increases, the number of tasks in the system rises and competition between tasks for the limited resources begins. This leads to system self-organization, but on the other hand it means an increase of queuing and communication costs. In equation (9) one can see two feedback loops:

$$
X_1 = rN
\tag{10}
$$

$$
X_2 = 1 - N/K
\tag{11}
$$

The polarization of these loops can be computed from [7]:

$$
\operatorname{sign}(dX_1/dN) = \operatorname{sign}(r)
\tag{12}
$$

$$
\operatorname{sign}(dX_2/dN) = \operatorname{sign}(-1/K)
\tag{13}
$$

Fig. 1. Behavior of equation (9). The continuous line represents the term rN, the dotted line the term rN^2/K

The first loop (12) has positive polarization; the second one (13) negative. But to understand the behavior of equation (9) one needs to consider the polarization of the loop in the whole equation:

$$
dX/dN = r - 2Nr/K
\tag{14}
$$

Taking into account equation (14), one can see that the loop polarization changes according to the value of N:

$$
\operatorname{sign}(r - 2Nr/K) = +, \quad \text{for } N < K/2
\tag{15}
$$

$$
\operatorname{sign}(r - 2Nr/K) = -, \quad \text{for } N > K/2
\tag{16}
$$

When N increases, the loop polarization changes and the system tunes itself to "the working point", but this point can oscillate when N > K/2 and a bifurcation appears.

The maximum performance X_max can be computed from the condition dX/dN = 0. It is achieved when N = K/2:

$$
X_{\max} = rK/4
\tag{17}
$$
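The sign change (15)-(16) and the maximum (17) can be verified numerically; r and K below are illustrative values:

```python
# Check of eqs. (14)-(17): the slope r - 2*N*r/K changes sign at N = K/2,
# where the throughput rN(1 - N/K) reaches its maximum X_max = r*K/4.
r, K = 2.0, 200.0
X = lambda N: r * N * (1.0 - N / K)
slope = lambda N: r - 2.0 * N * r / K

print(slope(K / 2 - 10) > 0)   # True: positive polarization for N < K/2
print(slope(K / 2 + 10) < 0)   # True: negative polarization for N > K/2

N_star = max(range(int(K) + 1), key=X)  # integer N maximizing X
print(N_star == K / 2)         # True: maximum at N = K/2
print(X(N_star) == r * K / 4)  # True: X_max = r*K/4
```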

Comparing equation (17) with the similar equation in the analysis of the asymptotic system performance [6],

$$
X_{\max} \approx 1/D_{\max}
\tag{18}
$$

one can see that

$$
rK/4 \approx 1/D_{\max}
\tag{19}
$$

which gives the rate for N > K/2:

$$
r_K \approx \frac{4}{K D_{\max}}
\tag{20}
$$

In contrast to the simple system with unlimited resources (where r = const), in a complex system r has a changing value, sensitive to the limited resources K of the environment and to the element with the maximum service time D_max. When the amount of resources is small, this sensitivity is very strong (equation (20) is a classical example of a power law). To determine the system response time R one should use Little's law [6]:

$$
R(N) = N/X = K/[r(K-N)]
\tag{21}
$$

Taking into account equation (20) one can write:

$$
R(N) \approx K^2 D_{\max}/[4(K-N)]
\tag{22}
$$

Equation (22) is proper only for quasi-equilibrium states, when 0 < N < K.
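Equations (20)-(22) can be illustrated as follows; K and D_max are illustrative values, and r follows from eq. (20):

```python
# Response time from Little's law, eqs. (20)-(22).
K, D_max = 100.0, 0.05
r = 4.0 / (K * D_max)                       # eq. (20): r_K ~ 4/(K*D_max)

def R(N):
    return K / (r * (K - N))                # eq. (21)

def R_approx(N):
    return K * K * D_max / (4.0 * (K - N))  # eq. (22)

for N in (10.0, 50.0, 90.0, 99.0):
    print(N, round(R(N), 3), round(R_approx(N), 3))
```

The two forms agree, and R grows without bound as N → K: as the system approaches its resource limit, the QoS degrades.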