BULLETIN OF THE POLISH ACADEMY OF SCIENCES TECHNICAL SCIENCES, Vol. 65, No. 2, 2017 DOI: 10.1515/bpasts-2017-0026
Stable scheduling of single machine with probabilistic parameters

W. BOŻEJKO¹*, P. RAJBA², and M. WODECKI²

¹ Department of Control Systems and Mechatronics, Faculty of Electronics, Wrocław University of Science and Technology, Wyb. Wyspiańskiego, 50-370 Wrocław, Poland
² Institute of Computer Science, University of Wrocław, 15 Joliot-Curie St., 50-383 Wrocław, Poland
Abstract. We consider a stochastic variant of the single machine total weighted tardiness problem in which job parameters are independent random variables with normal or Erlang distributions. Since even the deterministic problem is NP-hard, it is difficult to find the global optimum for large instances in reasonable run time. Therefore, we propose a tabu search metaheuristic in this work. Computational experiments show that solutions obtained by the stochastic version of the metaheuristic are more stable (i.e. resistant to data disturbance) than solutions generated by the classic, deterministic version of the algorithm.

Key words: scheduling, uncertain parameters, tabu search, stability.
1. Introduction
Single machine scheduling problems with cost goal functions, despite the simplicity of their formulation, belong to the most difficult class of (NP-hard) combinatorial optimization problems. The literature describes many such problems, with different parameters of tasks, functional properties of machines and different goal functions: from the simplest problems with one constraint (e.g. the problem denoted by 1||∑wiUi, concerning the tasks' due dates) to complex problems with set-ups and time windows, where in the optimal solution machines may have downtimes. They are important from both the theoretical and the practical standpoint because:
1. they enable modeling and analysis of simple production systems and of individual positions in more complex systems,
2. they are considered as special cases of more general problems,
3. some of their particular properties can often be generalized,
4. they provide a common basis for methods of solving many NP-hard problems,
5. they are not only simple to implement, but there is also a number of test examples in the literature.
Single machine problems are also important from the point of view of theory and practice because: (1) they allow us to study a single production nest (bottleneck) of a manufacturing system, (2) their specific properties can be generalized to multi-machine issues, (3) many transportation and logistic problems (e.g. variations of the traveling salesman problem) are formulated as single machine problems with setup times and additional constraints. Examples of their practical applications were presented by Lann and Mosheiov [14].
We consider the single machine total weighted tardiness problem (in short: TWT), denoted as 1||∑wiTi. A set of n jobs

*e-mail: [email protected]
J = {1, 2, …, n} has to be processed without interruption on a single machine that can handle only one job at a time. All jobs become available for processing at the beginning (time zero). Each job i has an integer processing time pi, a due date di and a positive weight wi. For a given sequence of jobs one can determine the completion time Ci, the tardiness Ti = max{0, Ci − di} and the cost wi·Ti of each job i ∈ J. The objective is to find a job sequence which minimizes the sum of the costs ∑_{i=1}^{n} wi·Ti. This is a classical problem of scheduling theory.
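As a minimal illustration of these definitions (our own sketch, not code from the paper), the cost of a given job sequence can be computed in a single O(n) pass:

```python
def total_weighted_tardiness(perm, p, w, d):
    """Cost f(pi) = sum of w_i * T_i with T_i = max(0, C_i - d_i).

    perm is a sequence of job indices; p, w, d give each job's
    processing time, weight and due date.
    """
    t = 0       # completion time of the job currently finishing
    cost = 0
    for job in perm:
        t += p[job]                          # C_{pi(i)} = sum of processing times so far
        cost += w[job] * max(0, t - d[job])  # add w_i * T_i
    return cost

# Example: 3 jobs processed in the order (0, 1, 2)
p, w, d = [2, 3, 1], [3, 1, 2], [2, 4, 3]
print(total_weighted_tardiness([0, 1, 2], p, w, d))  # C = 2, 5, 6 -> T = 0, 1, 3 -> cost 7
```

The data layout (lists indexed by job) is an assumption made for the example; only the formula itself comes from the problem statement.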
2. Literature review
The total weighted tardiness problem is NP-hard [15]. Enumerative algorithms (which use dynamic programming and branch and bound approaches) for the problem are described in [22, 29]. These algorithms are a significant improvement over exhaustive search, but they remain laborious and are applicable only to relatively small problems, with the number of jobs not exceeding 50 (80 on a multi-processor computer [29]). The enumerative algorithms mentioned above may require considerable computer resources, both in terms of computation time and core storage. Therefore, many algorithms have been proposed to find near-optimal schedules in reasonable time. Local search methods start from an initial solution and repeatedly try to improve the current solution by local changes. The interchanges are continued until a solution that cannot be improved, i.e. a local minimum, is obtained. To increase the performance of local search algorithms, metaheuristics are used: tabu search [2, 7], simulated annealing [23], path relinking [4], genetic algorithms [7], ant colony optimization [9]. A very effective iterated local-search method has been proposed by Kirlik and Oguz [13]. The key aspect of that method is its ability to explore an exponential-size neighborhood in polynomial time by a dynamic programming technique.
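The local-search loop described above can be sketched as follows. This is a generic swap-neighborhood descent of our own, not the authors' tabu search (which additionally maintains a tabu list of forbidden moves to escape local minima):

```python
import itertools

def local_search(perm, cost):
    """Generic descent: keep applying improving pairwise swaps until
    no swap lowers the cost, i.e. a local minimum is reached."""
    perm = list(perm)
    best = cost(perm)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]      # try the swap
            c = cost(perm)
            if c < best:
                best, improved = c, True             # keep the improving move
            else:
                perm[i], perm[j] = perm[j], perm[i]  # undo it
    return perm

# Example cost: total weighted tardiness of a small 3-job instance
p, w, d = [2, 3, 1], [3, 1, 2], [2, 4, 3]
def twt(perm):
    t = s = 0
    for j in perm:
        t += p[j]
        s += w[j] * max(0, t - d[j])
    return s

schedule = local_search([1, 0, 2], twt)   # descends from cost 15 to a local minimum
```

On an instance this small the swap neighborhood happens to reach the global optimum; in general, the descent stops at the first local minimum, which is exactly the weakness that tabu search addresses.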
Bull. Pol. Ac.: Tech. 65(2) 2017
In many applications, serious difficulties occur while indicating parameters, or the process data comes from inaccurate measurement equipment. Due to short realization terms, short series and production elasticity, there are no comparative data and no possibility to conduct experimental studies that would enable one to determine explicit values of certain parameters. Furthermore, in many branches of the economy, like tourism, agriculture, commerce, the building industry, etc., the processes that occur have, by their nature, a random character (they depend on weather, market conditions, accidents, etc.). Making decisions in conditions of uncertainty (lack of exact values of parameters) becomes quotidian. Stochastic scheduling problems, where job durations are random variables with known probability distributions, have been studied in the literature for more than 40 years. A comprehensive review of methods and algorithms for solving combinatorial optimization problems with random parameters was presented by Dean [8], Vondrák [28] and Pinedo [21]. In the case of single machine problems it is usually assumed that the parameters of the process (e.g. release dates, times of task performances, due dates, etc.) are independent random variables. Generally, there are two types of objective function considered in the literature:
1. regular – a non-negative, non-decreasing function (of completion times). Some typical examples are given below:
● expected number of late/tardy jobs (van den Akker and Hoogeveen [27], Elyasi and Salmasi [10]),
● expected weighted number of late/tardy jobs (Li and Chen [17]),
● expected total weighted tardiness (Cai and Zhoi [5, 6]),
● total weight of batches of jobs (Hu et al. [11]);
2. non-regular (in the context of just-in-time scheduling):
● expected number of early and tardy jobs (Soroush [25]),
● expected total weighted number of early and tardy jobs (Soroush [26]),
● expected weighted sum of earliness and tardiness (Alidee and Dragon [1], Mazdeh et al. [18]).
In this paper we present a problem of scheduling on a single machine with due dates and total weighted tardiness cost minimization. We assume that processing times and due dates are deterministic or random variables with normal or Erlang distribution. The stochastic TWT problem will be briefly denoted by STWT. The literature describes certain variants of the STWT problem with different distributions of random variables: Jang et al. [12] (normal distribution), Cai and Zhoi [5] (exponential distribution), Li et al. [16] (Erlang distribution). Here we study the resistance to random parameter changes of solutions constructed according to the tabu search metaheuristics. We also present a certain measure (called stability) that allows one to evaluate the resistance of solutions to random data perturbations.

3. Problem description and enumeration scheme
To formulate the problem, we will use the following notation:
J – set of jobs,
pi, wi, di – processing time, weight and due date of a job i ∈ J,
Ti – tardiness of a job i,
Ci – completion time of a job i,
π – permutation,
f(π) – cost function,
Φ – set of all permutations of elements from J,
π^E, π^T – blocks,
i_k^l, s_k^l – insert and swap moves,
M(π) – set of moves,
N(π) – neighborhood,
f – density function,
F – cumulative distribution.
Each schedule of jobs can be represented by a permutation π = (π(1), π(2), …, π(n)) of elements of the set J. Let Φ denote the set of all n-element permutations. For any permutation π ∈ Φ, let C_{π(i)} = ∑_{j=1}^{i} p_{π(j)} denote the completion time of the job π(i). The total cost of the solution π is

    f(π) = ∑_{i=1}^{n} w_{π(i)} T_{π(i)},    (1)

where the tardiness

    T_{π(i)} = max{0, C_{π(i)} − d_{π(i)}}.    (2)

The job π(i) is considered early if it is completed before its due date (C_{π(i)} ≤ d_{π(i)}) and tardy if it is completed after its due date (i.e. C_{π(i)} > d_{π(i)}).
The total weighted tardiness problem consists in finding such a permutation π* ∈ Φ which minimizes the function f on the set Φ, i.e.

    f(π*) = min{f(π) : π ∈ Φ}.

3.1. Blocks of jobs in permutation. In every permutation π ∈ Φ there are subsequences of jobs such that either:
1. the execution of every job from the subsequence ends before its deadline (all the jobs are not tardy), or
2. the execution of every job from the subsequence ends after its deadline (all the jobs are tardy).
Such subsequences we call blocks.

Block of early jobs. We call a subsequence of jobs π^E from a permutation π ∈ Φ an E-block if:
a) every job j ∈ π^E is early and d_j ≥ C_last, where C_last is the time of finishing the execution of the last job from π^E,
b) π^E is a maximal subsequence fulfilling limitation (a).
It is easy to prove that if π^E is an E-block, then

    min{d_j : j ∈ π^E} ≥ C_last.

So, in any permutation of the jobs from π^E, every job is executed early in the permutation π. The algorithm (AE-block) designating E-blocks is included in the work [2].
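The key consequence above (if the smallest due date in an E-block is at least the block's last completion time, every internal ordering keeps all of its jobs early) can be checked directly. The following is our own illustrative sketch, not the AE-block algorithm from [2]; the function name and data layout are hypothetical:

```python
def satisfies_e_block_condition(jobs, p, d, start=0):
    """Check condition (a) of the E-block definition for a subsequence
    `jobs` whose first job starts executing at time `start`: every job
    must stay early, i.e. min d_j >= C_last, where C_last is the
    completion time of the last job of the subsequence."""
    c_last = start + sum(p[j] for j in jobs)
    # If even the smallest due date is >= C_last, every job of the
    # subsequence is early under any internal ordering of its jobs.
    return min(d[j] for j in jobs) >= c_last

# Jobs 1 and 2 with p = 2, 3 and d = 9, 8; C_last = 0 + 2 + 3 = 5
print(satisfies_e_block_condition([1, 2], {1: 2, 2: 3}, {1: 9, 2: 8}))  # True
```

Note that the check is independent of the order of `jobs`, which is exactly why jobs inside an E-block can be permuted freely without creating tardiness.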
Table 1. Data for the instance

i  |  1   2   3   4   5   6   7   8   9  10
pi |  2   3   1   2   3   2   3   3   2   4
di | 12  19  12   9   5   1  17  24  19   3
wi |  3   1   2   5   3   3   4   2   4   5
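For instance (our own computation, not a result stated in the paper), scheduling the jobs of Table 1 in their natural order 1, 2, …, 10 yields a total weighted tardiness of 172:

```python
# Table 1 instance (jobs listed in index order 1..10)
p = [2, 3, 1, 2, 3, 2, 3, 3, 2, 4]
d = [12, 19, 12, 9, 5, 1, 17, 24, 19, 3]
w = [3, 1, 2, 5, 3, 3, 4, 2, 4, 5]

t, cost = 0, 0
for pi, di, wi in zip(p, d, w):      # natural order 1, 2, ..., 10
    t += pi                          # completion time C_i
    cost += wi * max(0, t - di)      # weighted tardiness w_i * T_i
print(cost)  # 172
```

Only jobs 5, 6, 9 and 10 are tardy in this order (contributing 18, 36, 8 and 110 respectively), which hints at how much a better permutation could save.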
Fundamental block properties of the TWT problem are derived from the following theorem.

THEOREM 1. [2] Let π ∈ Φ be a permutation with blocks B1, B2, …, Bm, and let the jobs of each T-block of π be ordered according to the WSPT rule. If the permutation β has been obtained from π by an interchange of jobs such that f(β) < f(π), then in β at least one job of some block of π was moved before the first or after the last job of this block.

Note that Theorem 1 provides a necessary condition for obtaining a permutation β from π such that f(β) < f(π).

Block of tardy jobs. We call a subsequence of jobs (occurring directly one after another) π^T from a permutation π ∈ Φ a T-block if:
a') every job j ∈ π^T is tardy,
b') π^T is a maximal subsequence fulfilling limitation (a').

Let C̃i denote the random completion time of the task i (11). Then the tardiness of the task i is the random variable

    T̃i = { C̃i − di, if C̃i > di,
           0,        if C̃i ≤ di.    (12)

LEMMA 1. The cumulative distribution of the tardiness, i.e. of the random variable T̃i, i ∈ J, is

    F_{T̃i}(x) = { F_{C̃i}(di + x) − F_{C̃i}(di + x)·F_{C̃i}(di) + F_{C̃i}(di), if x > 0,
                  0,                                                           if x ≤ 0,    (13)

with the corresponding density function (14).

LEMMA 2. If C̃i, i = 1, 2, …, n, is a random variable with the normal distribution N(µ, σ) specifying the date of completion of a task (11), then the expected tardiness equals

    E(T̃i) = (σ/√(2π)) · e^{−(di−µ)²/(2σ²)} + (µ − di)·(1 − F_{N(0,1)}((di − µ)/σ)).

Proof. From the definition of the random variable C̃i and of the tardiness, (15) and (16), it follows that

    ∫_{di}^{∞} y·f_{C̃i}(y) dy = ∫_{di}^{∞} y · (1/(σ√(2π))) · e^{−(y−µ)²/(2σ²)} dy.

We introduce the substitution z = (y − µ)/σ, thus dy = σ dz. By assuming y = di we obtain z = (di − µ)/σ, and y = σz + µ. Then

    ∫_{di}^{∞} y·f_{C̃i}(y) dy = ∫_{(di−µ)/σ}^{∞} (σz + µ) · (1/√(2π)) · e^{−z²/2} dz
                              = (σ/√(2π)) · e^{−(di−µ)²/(2σ²)} + µ·(1 − F_{N(0,1)}((di − µ)/σ)),    (17)

and

    ∫_{di}^{∞} f_{C̃i}(y) dy = 1 − F_{C̃i}(di).    (18)
2016 idensity .221−z zx = (y)dy 1−µ− F .2√ (18) dd−µ −µ√ d0, we obtain and yddAc.: z˜i−z ˜i∞ e∞(1 i+ F ∞ ∞ ∞Then ∞ (x) = FTF yN(0,1) fof (y)dy zµ e∞e1eµ) dz 2 µ) Stable scheduling single machine parameters ii ∞ density function anding density iprobabilistic xof ≤ 0, dfunction ˜if i and ii˜iand scheduling single with probabilistic C ∞ ∞ ∞ ∞ √ −z −z 2with 0, x Stable ≤ 0,xx0,≤ 2+ 2∞ iµ) 2z 2 2µ) √ σ∞ y1µ. f+ (y)dy =√ (σ z∞121dz. + dz T˜and C 0, if yxif= ≤if 2π 2π 21 2di21 2parameters (13) (13) ˜2iN(0,1) We notice that i√ i∞∞e (di +Tx) − Fdensity (d +function x)F −z −z −z −z −z−z −z2+ −z √ √ √ √ −z We notice that 1 1 1 e e e dz dz dz + + µ µ dz. dz. z z z = = = σ σ σ 1 0, 0,function x if ≤ if ≤ 0, 0, 12C y f (y)dy = (σ e C σ σ ˜i ˜i (d σ i density i ) + FC˜i (di ),ififx x≤>0,0, function d −µ ˜ d Cand C 0, if x ≤ 0, ∞ 2 2 2 d 2 2 2π i C (13) ddii −µ −µ di −µ ddiz −µ −µ d(σ −µ d√ √ √ yσzfσC+ y2π fµ) f2π (y)dy (y)dy (y)dy = + zzµ) zµ+ + µ) µ) e√e eeee2dz dz −z −z 22−µ 2dz 2(σ 2 2dz ∞y fC˜ (y)dy =(13)i = 2= 1dz. 1dz σ ee= d= i= i(σ i+ ˜y ˜z˜Ci˜√ 2π i −µ √ √ √ √ √ √ (σ e e e dz + dz + µ µ dz. dz. z z σ = i√ y f (y)dy = (σ + µ) dz z σ C 2π 2π 2π 2π σ d d d −µ −µ d 2π i i (d (d (d + + + x) x) x) − − − F F F (d (d (d ) ) f f ) f (d (d (d + + + x), x), x), if if if x x > x > > 0, 0, 0, f f f ∞ (13) 2 i C i i i ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ i i i i i i i i i 2 2 2 2 d −µ d −µ d d −µ −µ d −µ d d −µ −µ σ σ σ σ σ σ d −µ f (d+ 2 and and density density function σ2π i i∞ 2ii= iσ 2 −z ie 2∞ 222 2di −µ CCi iCi function CCi iCi CCi iCi if x(13) 12π ∞2π i∞ i2π µ dz. 2 z11−zx√ ddd(d −z −z 2π (13) (13) (13) −z ii−z iσ i∞d)i f(σ ≤C˜(d 0, d0, 2π 2π 2π 2π 2π −z −z −ze σ> σσσ22∞ dziσ+i−z 11√ µ) ,C˜single ,(14) (d+ x) +x) −x) − F− F F (d (d )(d fC)˜ii)f(d (di + x), x), ifi . 
xifif> xx(14) > 0, > fdensity andffTdensity (14) (x) (x) (x) = = =function +with x) −probabilistic F if−z 0, d√ d22π if0, + ˜if(d ˜tardiness ˜C ˜Ti˜fwhere ˜x), ˜i (d ˜−z 2, −z i of i+ i+ σx), σ2definition σ2 −z 2 σrandom Stable of machine parameters i −µ i −µ fC= (x) is of 2σ) 2 , Proof. From the definition the rand and function C C˜C C C− Cf˜variable y(13) fiC(14) =C˜i (d zσ dz C> C ˜iis f˜(x) (x) density random Cif .≤ 2e 222∞ tardiness Proof. the of the ∞ ∞ −z (e )σ= =−ze −ze there dz = −e2π ˜i (y)dy i T˜i f where i ˜idensity i f i˜and irandom i ˜iischeduling 12ze 1i+ i2 σ 2 2π 2From 21dz 2of iiPol. (e there ze dz = −e . and density function (dand + x) FCivariable (d + x), if x 0, √ √ √ √ dz + + µ µ dz. dz. e e z z e e = = C˜(x) −z di −µ ∞ ˜i (dii )i fBull. ˜if f f (14) (14) (x) = = 1 (14) f (14) (x) = i Stable 2 σ σ 0, 0, 0, if x x x ≤ ≤ 0, 0, 0, i ˜ ˜ ˜ C C scheduling of single machine with probabilistic parameters Ac.: Tech. XX(Y) 2016 We We We notice notice notice that that that −z −z and and and density density density function function function T T T 2 2 1−z2 dz. 1 T i i ddie −µ −µ 2π dedi −µ parameters parameters √ ∞√ ∞ ∞ ∞2 dz ∞ d i i f ˜ (x)function dz + µz222√ dz. zthat = σ i single Stable scheduling of machine with probabilistic 2with 2 probabilistic 2 −z 22√ 2 Stable scheduling of single machine Stable with scheduling probabilistic of single machine parameters (13) ction andi density i1 i −µ ii+ Stable scheduling of single machine with probabilistic parameters i0, (14) = ∞ ∞ ∞ ∞∞ −z −z −z −z −z 2parameters 1 1 1 1 1 + µ e e = σ , 2π 2π 2π 2π 0, 0, 0, if x if if ≤ x x ≤ 0, ≤ 0, 0, Bull. Pol. Ac.: Tech. 
XX(Y) 2016 0, if x ≤ 0, dσ −µ d −µ and (16) follows that 2 2 Stable Stable scheduling scheduling of of single single machine machine with with probabilistic probabilistic parameters Ti (d (d + x) x) − − F F (d (d ) ) f f (d (d + + x), x), if x > > 0, f f ˜ −z −z −z −z We We We notice notice notice that that 1 1 We notice that √ √ 1 1 and (16) follows that ˜ ˜ ˜ ˜ ˜ ˜ dz + µ e z e = σ Stable scheduling of single machine with probabilistic parameters i i i i i i σ σ σ σ d2i −µ di −µ2 2(15) Proof. From the definition of the random variable C and C C C C C C ˜ ˜ 2 2 2 2π 2π i i i i i i ˜ ˜ i (d + x) − F (d ) f (d + x), if x > 0, f 0, if x ≤ 0, d −µ d −µ √ √ √ √ √ √ dz + dz dz µ + + µ µ dz. dz. dz. e e e z z z e e e = σ = = σ σ 2 2 2 2 2 2 2 2 2 2 2 2 − d , if C > d , C 2−z 22 idz. i − F ˜ (di ) f ˜C(d 2 dz. We notice that σ= σ+ , i σ+ ififd −µ >√ i 2π ffT˜Ti˜We (14) (14) (x) (x) = =i consider id(d i x), ixd> i −z ˜eiσ and Proof. We x) x), xC 0, fcases: √ of −z −z −z −z −z −z −z −z −z−z C C˜i consider C˜i two Cif˜i+ dz + ee2the zif eTherefore = dz2π µ z√ σ nction i)− i(14) i> i , 0, ˜ii (dcases: if + Proof. two σ dµ −µdd d−µ −µ d −µ ddid−µ −µ √ 2π 2π and tardiness (15) Proof. From the definition random variable ∞ ∞ (d + x) − F (d 2 2 C C C ˜ ˜ ˜ ˜ i i i i i i ˜ ˜ ˜ d −µ −µ −µ i i i i f (x) = σ σ 2 2 2 2 2 2 2 2 2 2 2 Therefore −z −z 2 (12) = T 1 1 ˜ C C C where where where ffC˜C˜ifF (x) isisiis density of of of random random random variable variable C .= .+ .i x), 2notice 2,, there 22 2dz −z−z iσi σ22 2π i= 2π 2π 2π 2π (12) and (16) that TC + (d (d + − x) x) F− F )i(d (d fC˜iii)i)(d )ffCf˜˜Cii˜+ (d (d x), if˜ x˜ if > ifxif x0,> 0, 0,0, ffCTdensity ff f˜˜Ciifi0, (14) (x) = ˜i˜(x) iif −z −z −z −z −z−z −z −z −z −zand −z −e x> x≤ ≤ 0,iσProof. i+ (e (e (e222follows )dz )the )= = = −ze ,there there ze ze ze dzσ∞dz = = −e −e222π .CC ∞ ∞ random iC ˜˜i (d ˜0, ˜− ˜ivariable ix) i+ ix), 2π and tardiness Proof. 
From definition of the random variable σ−ze σ 2π C(x) We notice that that ∞(15) ˜˜and (d − (d )density (d + x(d > 0, ˜2= + x) FFrandom (d x), > 0, (14) (x) = C Cof C −ze 2C ˜.ii−e ˜i i≤ 2dz. 2.iC ˜random 2−z ii + ˜z0, ix), i i+ ˜idensity σ−z itardiness ii˜i(d 2π and tardiness Proof. From definition the random variable d Ttardiness and (15) the definition of Proof. the From the definition C the √ if xC 1 1 ˜−µ C C0, 1. x= > then + ethere = σ Cis C˜C Cvariable and tardiness and (15) Proof. From the definition of random variable Ti(d where fC= f(x) (x) density density ofif− random of random variable C .0, Ci0, .iif . fC˜x˜iProof. −z 2(ethat 2of 2of 2of 2the 2ze 2and 2and 2 .2(15) where (x) is of random variable .2π ˜C ˜iσσidz (15) ze d0, .eWe 0, if C We notice that i0, iis iis i ivariable fx) ffwhere fwhere (14) (14) (x) = (x) ˜= ˜(x) i2and xi(x) > 0, (ethe (e )µ = )σ∞)the = −ze = −ze ,2that ,random ,there there zevariable dz = dz = −e −e .2of .y)−z2y2= i−ze ≤ d≤ .From ififdensity (e and (16) follows ˜−z ix√ ithe 0, xC ˜where ness ness (15) Proof. From From the definition definition the variable variable C ˜ C˜f˜then C ∞∞ random ∞ d(14) dof −µ i≤ ithe We notice that 2and (14) (14) T˜i 1. T T rdiness (15) ifC i(x) i Proof. From definition the random variable C √e 0, if ≤ ˜ is density of random variable C . y f (y)dy = i i ii˜i(x) ˜ ˜ 2 where f (x) is density of random variable C . 2 2 2 i σ √. −ze We notice ˜ y f (y)dy = T ˜ i i (e ) = −ze , there ze dz = −e 2 2 2 2 2 2 2 2 ˜ 2π 2π −z − d , if C > d , C σ C i) f C and (16) follows that (d + x) − F (d (d + x), if x > 0, 0, 0, 0, if x if ≤ if x x 0, ≤ ≤ 0, 0, i i i i C i ˜ ˜ −z −z −z −z −z −z −z −z i i i i σ σ We We notice We notice notice that that i and (16) follows that if x ≤ 0, and (16) follows that and (16) follows that C˜iProof. C C 2 0, if x ≤ 0, Tf˜fi˜˜= ˜˜i > ˜density and (16) follows that 2 √−z2 2di −(y−µ) We0)P( that iconsider itwo 2 y We notice that f−zC−ze ˜˜notice d−e Proof. 
Proof.We We We consider consider two two cases: cases: cases: 2222−z2 2 di 22di σ σ2π ˜−ze − d , if C > d , C ∞ 222=,= −z −z ize ˜ ˜ √ (12) ˜ ˜ ˜ 2i (y)dy 2 2π y f (y)dy −e and and (16) (16) follows follows that that where where (x) (x) is is density of of random random variable variable C C . . (14) 22 −z ˜ ˜ ˜ ˜ ˜ i i i ˜ Therefore Therefore Therefore and (16) follows that i i (e (e ) ) = = , there there ze dz dz = = −e −e . . − d , if C d , C − d , if C > d , − d , if C > d , C −z −z ∞ ˜ ˜ 2 2 F (x) = P( T < x) = P( T < x| C − d > C − d > 0) ˜ ˜ ˜ C − d , if C > d , C C C i i i i ˜ ˜ ˜ ˜ d −µ i i i i i i i i ˜ i, if ix| i 0)P( id i 0) oftwo i 2 2−z irandom ii> i≤ i= EMMA 1. distribution ofTherefore tardiness, of (x)is P( T x) =Ctwo P( Tif C diiiT Cdistribution − where fCF˜iTProof. density C ..Cumulative 2 )i.e. ˜L ˜i= Proof. Proof. We consider consider consider cases: cases: 122π ˜0, ˜ii .> Proof. consider cases: TWe −z d2irand.ii−µ22 2−z − diWe di− ,i< ,dfof iftwo C > dvariable d< ,i≤ ,density C C L EMMA 1. Cumulative tardiness, of dz ˜(x) (12) icases: i− i> iWe (e = −ze , there ze = −e 2 i.e. 222ran2 −ze 2−z 222Therefore 2= −e 2 . i= 2 2 ze ˜˜= ˜then ˜ iT i− itwo iif if C > d , C where (x) is density of random variable C d 0, C 22π 2 iWe ˜ , 1. x d ˜ Therefore Therefore ∞ ∞ ˜ (12) = −z −z −z −z −z −z −z −z −z −z −z T i i i −(y−µ) i> 2 2 2 2 (12) (12) = T i 2 2 (e ) = , there dz We notice that i 2 2 (12) T where f (x) is of random variable C . σ 2 2 2 C i 1. 1. x x > x > 0, 0, 0, then then 2 i i Proof. consider two cases: ˜ ˜ 2σ −z −z −z −z −z −z −z −z i density ∞ ∞∞ ∞2∞ ∞ dz −(y−µ) i ˜ ∞∞ i of √ ∞ )−(y−µ) = there =1,−e ∞∞∞ ˜C e2ze y fC22˜2= dy. =2 (e Proof. We consider . σ−e ˜f1. (y)dy ∞,y,−(y−µ) −(y−µ) (12) = Twhere T C˜.irandom 22−(y−µ) Therefore =dz ∞ ≤ d.diidistribution . C˜ . 
(12) 0, ifif CC 1112−ze where where fx˜˜C> (x) (x) is (x) isthen density density of ofcases: random random variable variable C .(12) i i= 2∞ = T ˜˜≤ ˜two 2σ 22,2 = 2 ze 222z .y−µ 2∞ 1√ ˜iC ˜˜ii.i.0, ˜iii ≤ iC (e (e (e )))∞ −ze = −ze ze ze dz dz = −e . y−µ ˜> dz −e ∞= thud We introduce substitutions i (y)dy −z −z density of random variable Cdom 0, if 2= 2there 2 J where is density of variable C dvariable ≤ di .2 ∞,2 there 0, if Ci−ze σ ∞−ze ,∈ithen dom variable T˜i0, 2there −(y−µ) 1. x˜1. xff0, 0, 0, then 0,is then C 2σ 1. xT ,√ thus We z. = ≤ 0,T i˜C J∈ variable distribution 1 1,√ ˜itwo )if = = .√ −z id i(e 2y (e −ze ,σ there ze dz = .= yy∞˜)yfffC(y)dy dy. = yythere e22substitutions i> ii˜> i(x) ∞ ∞−e i.. C˜i − 2 C i ,0). 1introduce ˜ze ˜C ˜random 2σ 2σ dy d−e Proof. Proof. We We consider consider two cases: σ ≤ ddcases: d≤ .iC.≤ 0, if˜if C ∞ ∞ ∞= 2 22−e √ 2σf22−z √ 2(y)dy 2dy. σy ∞σ dy. = y y e dy. = i fy i√ i≤ i− 222ee d . 0, if C 22π < x| d ≤ 0)P( d ≤ +P( (y)dy = ˜ ˜ ˜ ˜ 1. x > 0, then ˜ i ˜f(y)dy −z2 −z −z −z 1. x > 0, then −z −z √ Therefore Therefore σ σ σ −e −e y f (y)dy (y)dy = = = i i i i i i i 2σ 2σ C ∞ C C < x| C − 0)P( C − d ≤ 0). +P( T √ √ ˜ ˜ Proof. We consider two cases: y y f f (y)dy (y)dy = = y y e e dy. dy. C 2σ i − µ d i i ˜ ˜ ˜ ˜ ˜ ˜ ˜ ∞ d −µ i i i i i √ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ C C LFF EMMA 1. Cumulative distribution of tardiness, i.e. of rani= d d 2di −µ y f dy. (y)dy y e σ 2π i Proof. We consider two cases: i i i i ˜ i i C C Therefore − µ d density of random variable C . F (x) (x) (x) = = = P( P( P( T T T < < < x) x) x) = = = P( P( P( T T T < < < x| x| x| C C C − − − d d d > > > 0)P( 0)P( 0)P( C C C − − − d d d > > > 0) 0) 0) 2 2 2 2 −z 2 2 2 σ d d d d −µ −µ d −µ σ 2π d d d d i i σ 2π σ 2π C ˜ ˜ ˜ i i>i 0, i i ii consider i i of i˜itwo itardiness, i i i i.e. i i iof (e ˜ ) = −ze , there ze dz = −e √ √i.√2π We cases: −e y> yd+µ yf(y)dy (y)dy = = =d= (y i(y)dy ii−e i and iddidiTherefore yN(0,1) d0) we = = TTi i TProof. 
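As a quick illustration (not part of the paper), formulas (13) and (14) can be evaluated directly once the distribution of C̃i is known. The sketch below assumes a normally distributed C̃i and checks two basic properties of (13): it is nondecreasing in x and tends to 1 as x grows. All function names and parameter values are illustrative.

```python
import math

# Illustrative sketch: tardiness distribution (13) and density (14)
# for C~i ~ N(mu, sigma) with due date d_i. Names are my own.

def Phi(x, mu, sigma):
    """Cumulative distribution function of C~i ~ N(mu, sigma)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def f(x, mu, sigma):
    """Density of C~i ~ N(mu, sigma)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def tardiness_cdf(x, d, mu, sigma):
    """Formula (13)."""
    if x <= 0:
        return 0.0
    Fdx, Fd = Phi(d + x, mu, sigma), Phi(d, mu, sigma)
    return Fdx - Fdx * Fd + Fd

def tardiness_density(x, d, mu, sigma):
    """Formula (14): the derivative of (13) with respect to x."""
    if x <= 0:
        return 0.0
    return f(d + x, mu, sigma) - Phi(d, mu, sigma) * f(d + x, mu, sigma)

mu, sigma, d = 10.0, 2.0, 11.0          # example parameters (assumed, not from the paper)
xs = [0.5 * k for k in range(1, 21)]
cdf = [tardiness_cdf(x, d, mu, sigma) for x in xs]
# (13) is nondecreasing and tends to 1 as x grows
print(all(a <= b for a, b in zip(cdf, cdf[1:])))
print(abs(tardiness_cdf(50.0, d, mu, sigma) - 1.0) < 1e-9)
```

A useful consistency check is that (14) agrees with a numerical derivative of (13) at any x > 0.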
Lemma 2. If the random variable C̃i has the normal distribution N(µ, σ), then

  ∫_{di}^{∞} y f_{C̃i}(y) dy = (σ/√(2π)) e^{−(di−µ)²/(2σ²)} + µ (1 − F_{N(0,1)}((di − µ)/σ))   (17)

and

  ∫_{di}^{∞} f_{C̃i}(y) dy = 1 − F_{N(0,1)}((di − µ)/σ).                                       (18)

Proof. From the definition of the density of the random variable C̃i,

  ∫_{di}^{∞} y f_{C̃i}(y) dy = ∫_{di}^{∞} y (1/(σ√(2π))) e^{−(y−µ)²/(2σ²)} dy.

We introduce the substitution z = (y − µ)/σ, thus dy = σ dz. For y = di we obtain z = (di − µ)/σ, and y = σz + µ. Then

  ∫_{di}^{∞} y f_{C̃i}(y) dy = ∫_{(di−µ)/σ}^{∞} (σz + µ) (1/√(2π)) e^{−z²/2} dz
                          = σ ∫_{(di−µ)/σ}^{∞} (1/√(2π)) z e^{−z²/2} dz + µ ∫_{(di−µ)/σ}^{∞} (1/√(2π)) e^{−z²/2} dz.

We notice that (e^{−z²/2})′ = −z e^{−z²/2}, therefore ∫ z e^{−z²/2} dz = −e^{−z²/2} and

  σ ∫_{(di−µ)/σ}^{∞} (1/√(2π)) z e^{−z²/2} dz = (σ/√(2π)) e^{−(di−µ)²/(2σ²)},

while from (16) it follows that

  µ ∫_{(di−µ)/σ}^{∞} (1/√(2π)) e^{−z²/2} dz = µ (1 − F_{N(0,1)}((di − µ)/σ)).

Thus

  ∫_{di}^{∞} y f_{C̃i}(y) dy = (σ/√(2π)) e^{−(di−µ)²/(2σ²)} + µ (1 − F_{N(0,1)}((di − µ)/σ)),

which proves the first equality of the thesis. We can now move to the proof of equality (18). Using the designation (15) and the same substitution in the density of the random variable C̃i as in the proof of (17), we obtain

  ∫_{di}^{∞} f_{C̃i}(y) dy = ∫_{(di−µ)/σ}^{∞} (1/√(2π)) e^{−z²/2} dz = 1 − F_{N(0,1)}((di − µ)/σ).
≤ +P( i −z C 2 dy ∞y ∞ function function function results results directly directly directly that that that F F (x) (x) = = = for for for x≤ ≤ 0. σ 2. x˜x) ≤= 0. From of the e∞ = −(y−µ) −(y−µ) −(y−µ) ˜F ˜i+ 2 2 idefinition i< ii)˜0) i− iF ii00 x| d− ≤ ≤ +P( Tx(di≤ i∞ i0. i> i 0)P( ˜T ˜i(x) ˜) ˜1 ∞0). F (x) P( < P( < C − d+ > 0)P( C − = F + x) − F (d (d Fi˜). (d 2d−e += x)(1 F (d )) + (d => F C˜ix Cdistribution ∞ ∞ ∞ ∞ 2σ 21 21√ i˜i− i∞ −(y−µ) 21= di√ −µd 2 T(d −(y−µ) 1∞ 11 e f−z (y)dy = √ σ ˜(d ˜˜˜F C∞ d−e ˜C iresults ix) i(d i0x)F i≤ i). fef1C(y)dy (y)dy = iC i) i ). i− ix)F 2˜2∞ d21 ddz di−e ifunction idirectly iT ix| iC ˜F should only calculate derivative from cumulative −z 2σ µ− d if∞ 21 222π −z −z −z √ √ √ i= 0 C˜for C). C i1 iσ i(y)dy −z dy C˜T C CF d d − − µ − µ µ e e . . . dz dz = = = − 1 − − F F 2π √ 22π e f = √ C−z √ √ ˜ i (x) y f (y)dy = −e = = F = F (d + (d (d x) + + − x) (d F F (d (d x)F + x)F (d ) (d (d + ) F ) + + (d F F (d (d i− ˜ ˜ y f = −e (y)dy = i+ iC i).distribui x) iF i− i= i i i √ 2 2 2 function function results results directly directly that that that F F (x) (x) (x) = = 0 = for 0 0 for x x x ≤ 0. ≤ 0. 0. ˜ function results directly that F (x) for x ≤ 0. y (y)dy = i N(0,1) N(0,1) N(0,1) ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ i i i i i i i i i i i i i ∞ ˜ ˜ ˜ 2 F = P( T < x) = P( T < x| C − d > 0)P( − d > 0) ˜ d d ˜ dFi ˜+ − F (d + x)F (d ) + F (d ). 2π 2 2 = F (d + x) − F (d + x)F (d ) + F (d ). 2σ 2σ 2σ C ˜ ˜ d d −µ −µ d −µ ˜ ˜ ˜ ˜ ˜ ˜ C C C C C C C C C C C C C ˜ ˜ ˜ ˜ 2 C C d −µ 2 √ √ √ ˜ T T T ˜P( ˜i= ˜ix| i(y)dy √ √ σ+µ ˜iiii− ˜0ii= eFF eedσ eddσ ff−(y−µ) f˜˜Ci˜√ (y)dy = = dy dy dy = ˜C ˜C ˜function i< ˜the ˜ii− i− 2.= x< ≤ 0. 0. From definition definition of of the cumulative cumulative distribution distribution yiyifCC fi˜CTiy˜ii(y)dy = 21 C iFrom i= iiC i(x) −N(0,1) F i˜ ii= iii− i0)P( i< id iiii x −z 2(y)dy 2= i√ 2 if= i tion d= µ. .√2π dy di2−e idensity ii˜ F (x) T(13). 
x) P( < diC > 0)P( − > 0) TTo ˜0) 2π F (x) = P( Tfunction x) = Ti(14) < x| C (x) d− T˜(x) C= x) − P( < x| Ci − − ddii(y)dy > C C i(y)dy d√ C< C C +µ 1= −12π F −e fdiC∞2π = √ √ 2π eC2σ e(y)dy e1dz .−µ = dz = =i1 − 1−µ F−µ − dσ −µ √ F (x) = P( T < x) P( T < x| d> > 0)P( C − > 0) e e fC˜di (y)dy = dy = id− (x) we we we To prove prove the formula formula formula (14) (14) on on the the the density density function fCd˜f> fi(x) ˜TTo 2−e idz i −µ dy = ˜> N(0,1) results directly that F for ≤ 0. iix iiP( ˜ ˜ i2. iP( ion ithe i˜iT iC icumulative ithe id i˜0) i > 0)P( i−µ C i ifunction ˜−µ ˜TP( ˜xiprove ˜≤ d2σ −µ 2π 2π ˜Tii˜iT i= i i d d ˜ T iiσ TP( T i N(0,1) N(0,1) N(0,1) i i ∞ d ˜ 2π d i ˜= ˜ 2π 2. ≤ 0. From definition of the distribution i == < < x) x) = P( P( T T < x| x| C C − d d > 0)P( 0)P( C − − d d > > 0) 0) T T i ii˜T T C d iP( i 2π d −µ σ σ σ i d d i 2 ition i i i i i i i i i i i x) = T < x) P( T < x| C − d > 0)P( C − d > 0) i i i (13). √ 2. x˜2. ≤ 0. From definition the cumulative ∞∞dd2π dwe −µ i d2π i1 .σ σ 1 − −z−ze22di dz i i= i did 2π i FTo iformula i on i density i function d(14) di i ion note 2π σσ (x) wedistribution we we distribution To To prove prove the the the formula formula (14) (14) (14) on the on the the density function fcumulative σFN(0,1) idensity i −µ (x) To prove formula function fddTd˜iiii2π i2π We note ˜the ˜that ˜i0. 2π ≤iiWe 0. From definition the di the 2π (dshould + x) (d x)F (d )xderivative + F (d ). ˜i f(x) ddσare 112π −µ µ 22π ˜prove i2 i should i≤ iderivative i that Tthe Tf˜iTd˜(x) ii− 2π < x| C − dderivative ≤ 0)P( − ddensity ≤ 0). +P( T+ function function results results directly that FC F (x) = 00. 0of for for xof xfunction ≤ ≤ 0. 0. σ σ −z 2σ σ 2π . σσ i2. C C C˜i that ii f ˜ f(x) ˜Tii˜i(x) idirectly i= e = 2. − 2. x2. ≤ 0. ≤ From From definition of the of the cumulative cumulative distribution distribution xcalculate 0. 
From definition the cumulative distribution should only only only calculate the from cumulative cumulative distribudistribudistribuσd σσdi − ix ithe we ∞d= To prove the formula (14) on the density function e dy ∞ the Tof THEOREM If completion times independent 1σ √ − µ µ d2iσ− 2π 2∞ 2task ˜definition ˜from ˜< rom definition of the cumulative distribution i N(0,1) 2. xcalculate ≤ 0. From definition of the cumulative distribution √ function results directly that F (x) = 0from for xd ≤ e . dz dz = = 1 1 − − F Ti ˜C˜i (y)dy −z 1 µ 2F σ√ ˜ithe < x| C − d ≤ 0)P( C − d ≤ 0). +P( T ˜ ˜ 2. x ∙ 0. From definition of the cumulative distribution ˜ ˜ ˜ ˜ ˜ ˜ N(0,1) function results directly that F (x) = 0 for x ≤ 0. T ˜ ˜ i i i i i ˜ −z 2 d 1 µ. ˜ To simplify the notation we assume that d d −µ −µ < x| C − d ≤ 0)P( C − ≤ 0). +P( T should should should only only only calculate calculate calculate the the derivative derivative derivative from from from cumulative cumulative cumulative distribudistribudistribu x| C − d ≤ 0)P( C − d ≤ 0). < x| C − d ≤ 0)P( C − d ≤ 0). +P( T +P( T T HEOREM 2. If the task completion times are independent should only calculate the derivative from cumulative distribu√ ∞ ∞ ∞ 2 − µ2µ d2− di ei12π .σ id−− 2=−z 22N(0,1) 21 − F < x|≤ C d˜C ≤ C − dithat 0). +P( T function results that FxdTx0. (x) == 0i ∞for xd1(x) +µ − F i− iformula ii˜that ii 0 ˜C i i˜iresults idirectly ithat ion ix| ifor i0. iT ˜(13). 2i− i≤˜ 0. ˜i≤ ifor i˜d i− iC −(d −µ) ∞> σµk , √ −z2π −z we we To To prove prove the formula (14) on the the density function function ffT˜P( ˜(14) ˜T≤ 2 i x| dF 1dz 11 − − µ edN(0,1) dz =σdzσ−(d 1= ˜the ˜−z µσµ < x| − d− ≤ 0)P( 0)P( C − − d0)P( d− ≤ ≤ 0). 0). +P( +P( T Ti˜that d ˜for ˜dii −µ ˜density iii− id function function results directly F F (x) = = = 00− 0density ≤ ≤ 0. function that Fdirectly tion tion tion (13). (13). i< iresults i the i0 i 0. 
ithat iP( ifdistribution 0) ix| ionly i(14) ii(x) i(x) ixd i≤ i x| i√ d− −µ From definition of the cumulative ii2N(0,1) iN(p − − µ d directly F (x) = for x ≤ function results directly F = for x 0. kii −µ) k ) (k i= +µ F (x) we To prove formula on the function f Ti˜the T T N(0,1) σ iassume i i i 22F ˜ function results directly that ˜T (x) = 0 for x ∙ 0. 2π ˜ i i − µ d ˜ We note that i i d −µ √ √ 2 e e e . . dz = dz dz = = 1 − 1 F 1 − − F F +µ 1 − (x) we To prove the formula (14) on the density function f i 1 − F +µ 1 − F T T 2 2 σ +µ 1 − 2π i 2σ ) i ˜ i density ii tion tion tion (13). (13). (13). √ N(0,1) N(0,1) N(0,1) tion (13). √F normally e Ti−zrandom dz = 1variables FF− , N(0,1) aµ·1.p−k1)F− (kσ p˜ek2σ∼ eσN(0,1) dz =distributed − F√ ) N(p (x).di.−µ we To prove the formula (14) on the function N(0,1) d− −µ −µ 2π k+ = F= ∞ N(0,1) σ N(0,1) 2 f+µ should should only only calculate calculate the the derivative derivative from from cumulative cumulative distribudistribu√ We note that eof +µ id− i1 T˜+µ dn), −µ N(0 N(0,1) N(0,1) σ µ = σindependent σ we (x) we we To prove To Totion prove prove formula the formula formula (14) (14) on the on on the density the density function function function fTp˜i˜d2f(x) f˜˜Ti˜(x) i 2,+µ dN(0,1) − 1(15) σ σ1. value σ (13). N(0,1) i˜f−µ 1, .1,2. then the expected tardiness (12) of N(0,1) task iIf 2π 2π 2π We note that ethat note that We note that σtimes i shat directly that FTµthe (x) 0only for xcalculate ≤ We note that σ σσ (x) we formula (14) on the density fon TTdistribuHEOREM THEOREM HEOREM 2. the the the task task completion completion completion times times are are are independent independent (x) we To prove the formula (14) on the density function To prove formula (14) the density function (x) we should only calculate the from cumulative T ˜should 2π σ2. σσIf σIf 2π ˜iσ σ σ T.σfi− =the p= + .derivative . . function + p(14) ,0. 
and = adensity p21 + .P( .distribu.C + Tthe Ti i< 2π derivative from cumulative 2 2, 2task ˜ σ ithe iassume 1the ote that d < x). = P( C 2π √ . e dz = 1 − F i 1, . . . , n), then the expected value of tardiness (12) of task −(d −µ) − d x). = 2 2 ˜ ˜ should only calculate the derivative from cumulative distribui i ˜ ˜ To To To simplify simplify simplify the the notation notation notation we we we assume assume that that that N(0,1) T HEOREM T T HEOREM HEOREM 2. If 2. If the If the the task task task completion completion completion times times times are are are independent independent independent i i T HEOREM 2. If the i2. tion tion (13). (13). P( T < x| C − d ≤ 0) = 1 and P( T < x| C − d > 0) d −µ µ = p + . . . + p , and σ = a p + . . . + p . (15) i i i i i i 2 should should should only only only calculate calculate calculate the the derivative the derivative derivative from from from cumulative cumulative cumulative distribudistribudistribuσ − µ d i i function 1the ivariables ∈ Jassume isT HEOREM should only calculate the derivative from cumulative i−(d σthetaskp˜p˜kcompletion distributed (x) formula (14) on density fTassume −(d −µ) i distribuculate derivative from cumulative 1C ∼ ∼ N(p N(p N(p ak a,· a·times p2p·kkp))k.(k )(kare (k = ==independ random random variables variables normally normally distributed distributed should only the derivative from cumulative distribu22) 22 2. ion (13). If To simplify To simplify the the the notation notation notation we we we assume that that 2π To simplify notation we that kp˜ k∼ kk,,iµ 2σ P( TT < x| CC − dcalculate ≤ 0) = and P( < x| − dthat > 0) T˜˜iassume −(d −µ) −(d −µ) ˜˜− ˜˜− ˜simplify ˜we ˜i iC ˜−(d ˜˜< ˜˜< (13). σ the ˜the ˜T ˜idistribu˜i 0) i ix|∈ Ji σ is 2−(d (13). 
P( x|≤ ≤ 0) = 1i˜i< and P( T x| C d> > 0) simplify the notation we assume −(d −µ) −µ) iicase, iii1 i− iii > ˜C i˜tion idd i˜iT i0) id We have proved the first of ˜11and N(0,1) σ − µ ∼ N(p ∼ N(p N(p , integral aare ,k·a,integral pa·independen ·p) pk(k )k )of (k = (knorm = =th random random variables normally normally distributed pthe ˜− σ µk2σ dp˜iddσ random variables ition i− i˜x| id σ µ2(12) d −(d have proved first the ˜In i√ i−(d ˜P( ii˜∼ 2σ k− kσ kof k.2are kindependen > 0) 0) this the density function of the deadline for com222))expected ip 2variables i −µ) i −µ) (13). (13). (13). T T HEOREM 2. 2. If If the the task task completion completion times times (13). □ ilculate ition ition i(13). i≤ i i i P( T˜ix|tion σ σ − − µ µ d d = e + µ 1 − F 2σ 2σ ) ) the derivative from cumulative distribuThus ˜ id i0) i i i 2σ ) 1, 1, 1, 2, 2, 2, . . . . . . , . , n), . n), , n), then then then the the the expected expected value value value of of of tardiness tardiness tardiness (12) (12) of of task task task i i σ − µ d ∼ N(p ,are a ·Fpindep random normally distributed p ˜ 2 2 2 2 2 2 2π N(0,1) 22)2. √2σ√ √ = e)If+ + µ(1 11If − F(d .times i √ = = eare + µFF 1− − F µ(18). 1kof − k) ( − dσ= < x). = P( C √ T HEOREM task completion times independent 2.+ = eTµthe .(12) + µ −˜N(0,1) FN(0,1) 2σ ) e2+ −(d2σ −µ) ifunction i= ˜ To To simplify simplify notation notation we we assume assume that that µµ µ= == pcase, p11pthe + + ..the .notation ....+ + . density + p= pithe ,P( and and σ σ = a a a p p p + + + . . . . . . . + + . + p p p . . . (15) (15) (15) N(0,1 this of the deadline for tasks com2σ i.k N(0,1) HEOREM 2. the task completion √ √ i,pthe i ,and 1+ = e = e . . µ 1 1 − − e ) = F )) E( T σ σ ˜ √ i i i 1, 1, 2, 1, 2, . . 2, . . , . n), . , . n), , n), then then then the the the expected expected expected value value value of of tardiness of tardiness tardiness (12) (12) of task of task task move to proof of equality Using t 1 1 1 1, 2, . . , n), then the e= 2 2 2 2 2 2 i i = e . 
+ µ 1 − F 2π 2 2 N(0,1) N(0,1) T HEOREM 2. If the task completion times are move to proof of equality (18). Using the To In simplify we assume that − < x). C Ccompletion σtimes ˜− σ ˜+ pletion ip= ∈ Jp1=simplify σ icompletion 2independent ∼ N(p N(p aa··(12) ppkk))(k (k random random variables normally normally distributed distributed 2π 2π µ== = µP( µP( = p˜+ .˜= .d+ ..P( + .P( pC + piand ,idd,d< and and σx). σ =notation σ= a=awe ap p+ p1 1+ . .+= .assume .2T + . .P( .p+ .ithat (15) µ+ p− +2i (15) . .random .(15) and σ = .avariables pIf + .task .+ pN(0,1) . (15) 2π ˜1i )the To the that − < d− x). < P( C.simplify C.p˜..i˜pi+ kare ka∼ kk,,= 2σ T+ T Tproved HEOREM HEOREM HEOREM 2. 2. 2. If If.the the task task completion times times are are independent independent i˜x). 1˜C 1+ < x). = ix). i+ ∈ ∈ i∈ Jpthe J0) is J,is is i2π 1.˜d 1, 2, .normally .E( , n), then the expected value tardiness of· = t √ σthe ii,ip iiinotation i.C = (1 − F (d )) T0) epk˜p˜of i= To the we 2π 1˜assume i 2π σ ˜i HEOREM 2. If task completion times are independent ˜isi2π T HEOREM 2. If the task completion times are independent iσ ∼ N(p , · p ) (k variables distributed p ˜ − − d < < x). C 2π ˜ ˜ C µ = p + . . . + p , and σ = a p + . p . (15) k k 2 i i i i We have the first integral of thesis. Now we can − d < x). 
= P( C F (x) = P( C − d < x)P( C − d > P( C − d ≤ pletion i ∈ J ∼ N(p , a p random variables normally distributed p ˜ i 1 To simplify To To simplify simplify the the notation the notation notation we we assume we assume assume that that that ˜ F (x) = P( C − d < x)P( C − d > 0) + P( C − d ≤ 0) i i i i i i i ˜ −(x−µ) k k i ∈ i i ∈ J ∈ is J J is i i ∈ J is density function of the random variable ˜ 1 T 22the 2tardiness i that i22for icomiproved i..,,n), N(p random variables normally distributed pvariable ˜pcan he notation we assume that To simplify the notation we assume that To simplify the notation we assume Thus density function of random Cki Tof ithe 1deadline −(d −(d −(d −µ) −µ) kp ∼ 1, 1, 2, 2, . . . . n), then then the the expected expected value value of tardiness (12) (12) of of task tas iof 22.. i We In In In this this this case, case, case, the the the density density density function function function of the the deadline deadline for for tasks tasks tasks comcomWe have the first integral of the thesis. Now we i −µ) i2π iof 2 ∼ N(p ∼ ∼ N(p N(p , a · , p , a a · ) · (k ) ) = (k (k = = random random random variables variables variables normally normally normally distributed distributed distributed p ˜ p ˜ p ˜ i ∈ J is σ σ σ 2 We have proved the first integral of the thesis. Now we can k k k k k k k k k µ µ = = p p + + . . . . . . + + p p , , and and σ σ = = a a p p + + . . . . . . + + p p (15) (15) have proved the first integral We have of the proved thesis. the Now first we integral can of the T HEOREM 2. If the task completion times are independent 2 2 2 −(x−µ) ∼ N(p , a · p ) (k = random variables normally distributed p ˜ ∼ N(p , a · p ) (k = random variables normally distributed p ˜ Thus 1, 2, . . . , n), then expected value of tardiness (12) of task We have proved the first integral of the thesis. Now we can 2σ ifunction i function 11the √2 k− kUsing kthe fThus e.the (x) = . 
for kwe ktardiness k −(d −(d −(d −µ) −µ) 22σ 2 value 2we iithe 12this move to proof equality (18). the designation (15), the 1ppof 1, 2, .(1 .integral .− ,− n), the expected (12) ˜T˜iof Thus 2for 2(16) hus We We have proved proved the the first first integral integral the thesis. thesis. can this In this case, the density density of the of deadline deadline deadline for tasks tasks tasks comcomcom In density deadline for tasks com 2σ 2σ i −µ) ican iof µIn=iIn pi this + .case, .case, .+ pthe , that and σfunction a+function + .the .2σ+ p1(d (15) Thus C˜+ √ √ √ have proved the integral of thesis. Now we can2tardiness proved first of thesis. Now we can σ σdof σ eof eNow eNow )˜the )= = (1 (1 F F (d (d (d ))used E( Tof we used in proof (17) we obtain the notation we assume 1∈ 1, .F.Cof .(18). ,then n), then the expected value of 2function i .= ˜C ˜iof ˜expected we in proof obtain i)T i= ithe i)) i )) µidensity = p(x) .1.+ and a.deadline pcase, .2We .have + phave .+ − µσ i+ 1e i2, i2, Jthe is C 1density 2σ i(17) + x)(1 F(d (d )) F ).J.E( F pletion pletion pletion iµ∈ ∈ JIn JP( Jpthis 22σ2σ 2(12) 2−(d case, of the for tasks commove to of equality Using the designation (15), the √ iexpected imove i −µ) µ = pand .σ .σai.,= + p+ ,=P( σ = a1p.p,− p.− + .iimove .i)) pWe (15) fthe .ii+ i.∈ 1, 2, 1, .(15) .(d ,..n), .i.E( .,,is ,n), then n), then then the the expected the value value value tardiness of tardiness tardiness (12) (12) of task of of task ˜E( ˜2, ˜first σ 2π ˜(16) ˜.2, (d x)(1 F F+ (d )∈∈ = ˜i )task ˜i+ ˜= 22C 2˜˜= 2.+ 2+ 2. .variables i2, i+ 2σ ˜i+ i2F ˜pand ˜(15) ˜.i1, iproof √ √ √ C C C move to proof of equality (18). Using the designation (15), the ∼ N(p , a · p ) (k = random normally distributed p ˜ e e e ) = ) ) = (1 = (1 − (1 − F − F (d F (d )) (d )) )) E( E( T T T to proof of equality (18). Using to the proof designation of equality (15), (18). 
the Using = th (1 E( T 1, n), then the expected value of tardiness (12) of task 1, n), then the expected value of tardiness (12) of task F (x) = C − d < x)P( C − d > 0) − d ≤ 0) 2 2 i C J is 2 1 ˜ move to proof of equality (18). Using the designation (15), the C C C i ˜ ˜ ˜ 2π 2π 2π i i i i i i k k k = µ µ = = p p . + . + . + . . . p . + , p and p , , and σ = = a p a + p . + . . + . . . p . + . (15) (15) ˜ i i i i . +(µ − d ) 1 − F i i 2 C C C and substitution that density function of the random variable C − µ d 2 2 i i i 1 1 1 T i ∈ J is ˜(18). pletion i˜ C ∈˜˜case, i= i∈ J− J toto(15) proof proof of ofequality equality (18). (18). Using designation designation (15), (15), the the . . . +F p˜i (x) , pletion and a1Jd+ p. .1.x)P( + p1function .i −(x−µ) In move to of equality designation µσ = p∈ +.p.˜ .iC,˜˜+ and σ = a(15) p˜1C . .d.i+ ..J˜(15) 2+ 2 12the iUsing i− i N(0,1) 2σ i2π i2π i≤ip∈ imove imove 1˜˜pletion √ move to0) proof of equality the designation (15), the −(d −(d −µ) ipletion TUsing =the (1 FCthe )) e(15), σiddd> 2π i (18). ∈Using JE( is that −(x−µ) −(x−µ) ˜˜i (d In this this the density density function of of deadline for for tasks tasks comcom= P( < − > 0) P( − 0) ii) the i∞ i0) i i −µ) 12C ˜˜−(d the ˜the ˜Jiproof 2substitution 2π ˜case, ˜ii1i1− ˜+ σσ ii− ithe ideadline ∞ ∞ 2 1 −(y− pletion idC ∈ Jx)P( and density function of the random variable C = P( Cdensity < x)P( CC > 0) + P( C d ≤ 0) T˜Ti˜(x) Fthis F (x) = P( C − d < C − d (x) 0) = + P( P( C C − − d d < ≤ x)P( 0) C − d > + P( C − d ≤ 0) . +(µ − d ) 1 − F i ∈ i J i ∈ ∈ is J is is −µ) ∞that F (x) = P( C − d < x)P( − > 0) + P( − d ≤ 2− 2∈ ˜ i ˜F ˜ 2random i i i i i ˜ ˜ ˜ ˜ ˜ ˜ i i i i i i i i i i i In case, the function of the deadline for tasks comi i 1, 2, . . . 
, n), then the expected value of tardiness (12) of task i i i i N(0,1) density function of the variable C and substitution 2 2 2 2 and substitution that density function of the random density variable function C of the random variable C T i J is T T i ∈ J is −(d −µ) ˜ ˜ ˜ 2 2 2 Here we will prove an auxiliary lemma, which we use in σ −(x−µ) −(x−µ) −(x−µ) ˜ ˜ and substitution that density function of the random variable C 1 −(x−µ) ) = = P( P( C C − − d d < < x)P( x)P( C − − d d > > 0) 0) + + P( P( C C − − d d ≤ ≤ 0) 0) 2π i i ˜ ˜ i i i 2σ 2σ In this case, the density function of the deadline for tasks com 2σ i √ √ iii, Cand i i= i(x) i+ icase, iF i.(d =ipP( dσiF< x)P( C − > 0) P( C − dii2x) 0)Fof = F + x) − (d + x)F (d )F+ F(d = = (1 (1 FC F (d ))substitution E( TTF e(y)dy ethat we used (17) we obtain ˜C˜i˜i i(d −(di −µ)2 √ σthat 1= 12σ 1i+ +(x) ...+ = p(x) .i√ .√ + p(d .density faJfCIn eie+ e2σ (x) = =d.− .i˜˜function (16) (16) (16) 1ii(d and substitution substitution that function of the the random variable variable C2− C− ˜density ˜proof (d + − (d + (d +itasks ).Tthe 2C˜ F i− i.(15) ix)F i ). ii))variable ii)) the the deadline com˜this σ 2 −µ) 2 2 σ i√ i iand and density function of random ˜ithe ˜function ˜in ˜random C Cdensity C2˜for C C 2σf2−(d + x)(1 F F )≤ f 1fi= pletion iiCwill ∈(d J˜Ci˜ifiiprove C C Cof ii (d iitasks ii )used −µ) −(d 2deadline 2 .which √ 2− = (1E( (d )) E( eµ idensity ˜∈ ˜i2π ˜−(d (y)dy i )) i −(x−µ) ˜ ifor iin i C In this In In this this case, case, case, the the density the density function function function of the of of deadline the deadline for for tasks tasks comcomcom− − − µ µ d d d ˜obtain we an auxiliary lemma, we use 1 2σ 2 √2σe2 e2σ 2σ 2σ 2σ iC ii−µ) i =2 = i )(17) iT ˜i2π 2σ C C we in proof (17) we obtain −(d −µ) −(d −µ) √ √ √ √ √ ) = (1 F (d )) σ σ σ E( e f f e e e (x) (x) (x) = = = . . (16) (16) (16) i i i f (x) = e . 
(16) C i i C ˜ ˜ i density function of the deadline for tasks com˜ 2π In this case, density function the deadline for tasks completion iHere ∈pletion J== i ˜ ˜ ˜ ˜ σ σ σ 2π 2π ˜ i i ∈ J is i we used in proof (17) we i we used in proof we obtain we used in proof (17) we obtain σ √ σ 2 and T . calculating the expected value of random variables T (d + x)(1 − F (d )) + F (d ) = F ) = (1 − F (d )) E( T e CJ C C we used in +(µ proof obtain 22 x)(1 pletion i x)(1 ∈x)(1 ˜˜i (d ˜(x) ˜˜F i used i 2π i 2σ222 and id i− i− i fF di 2π 2π ˜E( ˜˜T)i˜)i)we √ − − ddC√ .e.µ2(11) . 2σ − − − d(17) diE( )T 1(1 − 1= −− − FF F− (d F (d F iC +(µ . we (16) (d + F (d )) + (d − Fwe )) Fproof (d F = 2σ C˜e C C C di(12) −(x−µ) −(x−µ) Proof. By definition of the expected value it folused in proof proof (17) (17) we we obtain obtain 2˜N(0,1) pletion ii(d ∈ + − F )) + = F √ ˜C ˜C ˜C = )1= (1 (1 FN(0,1) − (d F F˜˜Ci )) (d )) E( T eddi2i− ˜C ii√ iµ ˜x)(1 ˜J ˜F ˜)eii(d ˜i (d i)i)d iii+ ii= i(d i(d i(d in (17) we obtain N(0,1) ˜σ 2σ i)) i2σ+ 2π 2σ µ σ σi)) 2π 2π 2π σCi˜)+(µ 2π 20. ii)) i)) C 1)+ 1F C ∞ ∞ C∈ C+ C C C˜− √ −(y−µ) √ ˜˜ithe =i+ (1 (d E( T F E( Ti− eiµ (d + − Fx)(1 F− (d (d )) + (d (d )i(d == F F C−µ) iJ ii+ iiFrom iF ithe i2. i F pletion pletion pletion iC˜F iiJ(d ix)(1 ∈ Jof −(x−µ) i2i)cumulative i )) i − i˜C i Civariables i − 2π i∈ i+ (d + x)(1 )) )i C+ = F(d xF ≤ of cumulative distribution and Tused .in calculating expected value random T2˜we −(d ii˜F i(d 1(d By 1 .and . .223 +(µ +(µ +(µ de= − d(1 )i− − 1i 1− FC2Cexpected − F ˜ifi)) ˜i0. 2. xF From the 22 definition C˜Cfunction C˜deadline Ci˜F C σσ−(y−µ) σ2π +(µ −foldi ) i − i∈ i(x) i) e density the for tasks comi of pletion J+ 22π d(11) 1 −(x−µ) i distribution σ 2π i )dof i )i 2in i˜we iwe iF i˜≤ iof C C 2σ 2σdefinition σ N(0,1) N(0,1) N(0,1) Proof. definition the value (12) it = x) − (d + x)F (d F (d ). Bull. Ac.: Tech. 65(2) 2017 2π √ √ ∞ ∞ f (x) = = e e . . (16) (16) i will i 2 1 2 −(x−µ) 2 ˜Pol. 
˜ ˜ ˜ i i i i Here Here Here we will will prove prove prove an an an auxiliary auxiliary auxiliary lemma, lemma, lemma, which which which we we we use use use in in ˜ ˜ 2 2 2σ 2π 2π C C C C ∞∞))fC˜(y)dy ∞ − d−(y−µ) F 2 a random 2 1 ∞ ∞ ∞σ ∞. 1 ∞+(µ 11212 − 2∞i ) lows that √ −(y−µ) −( C√ Ciii = e2−(y−µ) = dy =2µ ˜˜ ii(d ∞ for (1 √ 2σ ˜i )in i (d + i .(d i22(d −(y−µ) σ 2variable 2σ−(y−µ) ∞ N(0,1) fx) (x) = ex)F .lemma, (16) −(x−µ) −(x−µ) −(x−µ) 2C (16) − − µ d d 1 L EMMA 2. If , 1, 2, . . , n is speci1 = − F (d E( T e = F x) − F + x)F ) + F ). −z2 ˜−(x−µ) 2σ 1 ∞ ∞ ∞ 2σ i i ˜ √ ˜ ˜ ˜ 2 i −(x−µ) −z 1 1 1 f (x) = e . i i i i 1 i C ∞ ∞ Here Here Here we we we will will will prove prove prove an an auxiliary an auxiliary auxiliary lemma, lemma, which which which we we we use use use in in function results directly that F (x) = 0 x ≤ 0. σ σ 2π 2π Here we will prove an auxiliary lemma, which we use in −(y−µ) 2σ C = F (d + x) − F (d + x)F (d ) + F (d ). ˜ = F = F (d + − F (d + x)F (d ) + (d F + (d x) ). − F (d + x)F (d ) + F (d ). C C C C 2σ i ˜ √ 1 1 2 function results directly that F (x) = 0 for x ≤ 0. σ 2 = F (d + x) − F (d + (d ) + F (d ). 2 2 2 e f (y)dy = dy = 1 1,˜C˜˜iienauxiliary ˜C ˜Cvalue i idif fC ˜value ˜). ˜= ˜i which − deF2F (x) eand .we (16) ii˜expected i(d ii˜T˜Tand i expected iprove ifF ii22C2˜C˜ii lemma, i√ iT˜T i lows 2∞ ˜˜T ˜ii− ˜= ˜ 2σ iof ivariables 2variables 2σµ ˜and ˜ii2σ C C C ie 2di dz C˜x) C C C C C˜i use C2σ C˜ithat ˜2˜iwill √ TT 2σ .i.fol+(µ +(µ − − dexpected de1 1√ 12− − √ i(16) = dy = Unauthenticated == Ffcalculating F (d (d + + x) − F (d + + x)F x)F ) ) + + F (d ). e (y)dy = dy f = (y)dy = − µ σ 2π √ ii˜C 2= 2σ C ithe ivalue i.i F i(d e = 1 − F ∞ 22π . . . calculating calculating the the expected of of random random random variables e f (y)dy = dy Here we an in i˜ ˜C ie id))i√ 2π ˜Ci˜(x) ˜e(d ˜= ˜(d ˜i (d ˜ ˜ √ √ √ i F i(x) i ... T i (y)dy i i i i i i i i = F (d + x) − F (d + x)F (d ) + ). √ f f f (x) (x) = = e e . (16) (16) L EMMA 2. 
If C , i 1, 2, . . is a random variable speciN(0,1) N(0,1) e dz 1 − F ˜ i i i 2σ 2σ 2σ C ˜ ˜ ˜ ˜ N(0 − µ d ˜ σ 2π 2σ C C C C C C C C i i i i √ √ i i i √ f f (y)dy (y)dy = = dy dy = = √ 2 2 2 = . (16) Proof. Proof. Proof. By By By definition definition definition of of of the the the expected expected value value value (11) (11) (11) and and and (12) (12) (12) it it it folfol2 2σ 2 f (x) = e (16) N(0,1) i i i i i i i i i i . +(µ − d ) 1 − F d −µ C C C C C C C √ fying the date completion the task (11), i= 2.C˜i xcalculating ≤ 0. From definition of the cumulative distribution e+(µ = 2π ˜ . 2function ˜i Tand ˜on ˜iand σ 2πthen ˜(x)dx di −µ icalculating i of ivalue irandom iexpected −(x−µ) i N(0,1) σ σµ ˜i (y)dy C˜Ci˜i fC − − µ dF d1iµ .fol-of. +(µ − 1˜ddy − C˜ii i To and T˜˜density T .˜T calculating the the the expected expected value value ofof of random of random variables variables variables Ton T and T calculating the expected value variables T2π we To prove the formula (14) the density function fTdefinition ddBy ddxdthe 2π dof dthe dvalue d(12) ˜(x) i )− 2π 1 we diirandom ithe 2π N(0,1) Download Date |σ 4/25/17 PM ˜iiT (x) prove the formula (14) fBy id ii ifthe − µ di2π i i it fol∞ )i . F σof σ 2π σauxiliary 2π 2π idefinition iuse i i. in )iid− = x− fN(0,1) (d. and ˜iE( ivalue Here Here we will will prove an an auxiliary lemma, lemma, which which we we use in Proof. By definition of of expected expected value (11) (11) and and (12) itit2π foli i7:48 Proof. By definition σ ˜i− ˜i 2. x≤ ≤ 0. From definition of the cumulative distribution dProof. di i.Proof. ddi−z diwe i+ T1 2π 2π σ 2π σ 2π σd 2 prove and Ti˜that calculating the expected value of random variables Tthe T− 2= C(11) di−∞Proof. 2π ...x)(12) +(µ +(µ +(µ − − )iµ dd∞ d−∞ )of − 11F 1expected F F σ the date of completion of the task (11), then 2. xFrom ≤ 0. From definition cumulative distribution 2σ 2.0. xCfying 0.= From definition of 2. the xthe ≤ cumulative 0. 
From the definition of the cumulative distribution function it follows directly that F_{T̃_i}(x) = 0 for x ≤ 0. For x > 0 we have

$$F_{\tilde{T}_i}(x) = P(\tilde{T}_i < x) = P(\tilde{C}_i - d_i < x)\,P(\tilde{C}_i - d_i > 0) + P(\tilde{C}_i - d_i \le 0) = F_{\tilde{C}_i}(d_i + x)\bigl(1 - F_{\tilde{C}_i}(d_i)\bigr) + F_{\tilde{C}_i}(d_i).$$

To prove formula (14) for the density function f_{T̃_i}(x) we only need to differentiate this cumulative distribution function, which gives

$$f_{\tilde{T}_i}(x) = f_{\tilde{C}_i}(d_i + x)\bigl(1 - F_{\tilde{C}_i}(d_i)\bigr), \qquad x > 0.$$

To simplify the notation we assume that

$$\mu = p_1 + p_2 + \ldots + p_i, \qquad \sigma^2 = a^2\bigl(p_1^2 + p_2^2 + \ldots + p_i^2\bigr). \tag{15}$$

Then the density function of the random variable C̃_i, the completion time (11) of task i, is

$$f_{\tilde{C}_i}(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}. \tag{16}$$

Below we prove an auxiliary lemma, which we use in calculating the expected values of the random variables T̃_i.

LEMMA 2. If C̃_i, i = 1, 2, …, n, is a random variable specifying the date of completion of task (11), then

$$\int_{d_i}^{\infty} y\, f_{\tilde{C}_i}(y)\, dy = \mu\left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right) + \frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}}, \tag{17}$$

$$\int_{d_i}^{\infty} f_{\tilde{C}_i}(y)\, dy = 1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right). \tag{18}$$

Proof. From the definition of the distribution function of the random variable C̃_i, (15) and (16), it follows that

$$\int_{d_i}^{\infty} y\, f_{\tilde{C}_i}(y)\, dy = \int_{d_i}^{\infty} y\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y-\mu)^2}{2\sigma^2}}\, dy.$$

Substituting z = (y − µ)/σ (so that y = σz + µ and dy = σ dz), we get

$$\int_{d_i}^{\infty} y\, f_{\tilde{C}_i}(y)\, dy = \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} (\sigma z + \mu)\, e^{-\frac{z^2}{2}}\, dz = \sigma\,\frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} z\, e^{-\frac{z^2}{2}}\, dz + \mu\,\frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} e^{-\frac{z^2}{2}}\, dz = \frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} + \mu\left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right),$$

where the first term follows from the equality ∫ z e^{−z²/2} dz = −e^{−z²/2}. We have proved the first equality of the thesis; now we can move to the proof of equality (18). Using the designation (15) and the same substitution in the density function of the random variable C̃_i that we used in the proof of (17), we obtain

$$\int_{d_i}^{\infty} f_{\tilde{C}_i}(y)\, dy = \int_{d_i}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y-\mu)^2}{2\sigma^2}}\, dy = \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} e^{-\frac{z^2}{2}}\, dz = 1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right). \qquad\square$$

THEOREM 2. If the task processing times are independent, normally distributed random variables p̃_k ∼ N(p_k, a·p_k) (k = 1, 2, …, n), then the expected value of the tardiness (12) of task i ∈ J is

$$E(\tilde{T}_i) = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\left[\frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} + (\mu - d_i)\left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right)\right].$$

Proof. By the definition of the expected value and from (11) and (12) it follows that

$$E(\tilde{T}_i) = \int_{-\infty}^{\infty} x\, f_{\tilde{T}_i}(x)\, dx = \int_{0}^{\infty} x\, f_{\tilde{C}_i}(d_i + x)\bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\, dx = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr) \int_{0}^{\infty} x\, f_{\tilde{C}_i}(d_i + x)\, dx.$$

We introduce the substitution y = d_i + x:

$$\bigl(1 - F_{\tilde{C}_i}(d_i)\bigr) \int_{0}^{\infty} x\, f_{\tilde{C}_i}(d_i + x)\, dx = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr) \int_{d_i}^{\infty} (y - d_i)\, f_{\tilde{C}_i}(y)\, dy = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr) \left[\int_{d_i}^{\infty} y\, f_{\tilde{C}_i}(y)\, dy - d_i \int_{d_i}^{\infty} f_{\tilde{C}_i}(y)\, dy\right].$$

It remains to apply the equalities (17) and (18) of Lemma 2:

$$E(\tilde{T}_i) = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\left[\mu\left(1 - F_{N(0,1)}\!\left(\tfrac{d_i-\mu}{\sigma}\right)\right) + \frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} - d_i\left(1 - F_{N(0,1)}\!\left(\tfrac{d_i-\mu}{\sigma}\right)\right)\right] = \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\left[\frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} + (\mu - d_i)\left(1 - F_{N(0,1)}\!\left(\tfrac{d_i-\mu}{\sigma}\right)\right)\right]. \qquad\square$$

Here we prove a second auxiliary lemma, which we use in calculating the expected value of the random variable T̃_i².

LEMMA 3. If C̃_i is the time (11) of completion of task i ∈ J, then

$$\int_{d_i}^{\infty} y^2 f_{\tilde{C}_i}(y)\, dy = \bigl(\sigma^2 + \mu^2\bigr)\left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right) + (\mu + d_i)\,\frac{\sigma}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}}.$$

Proof. From the definition of the distribution function of the random variable C̃_i, (15) and (16), it follows that

$$\int_{d_i}^{\infty} y^2 f_{\tilde{C}_i}(y)\, dy = \int_{d_i}^{\infty} y^2\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y-\mu)^2}{2\sigma^2}}\, dy.$$

Introducing the same substitution as in the proof of Lemma 2, we get

$$\int_{d_i}^{\infty} y^2 f_{\tilde{C}_i}(y)\, dy = \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} (\sigma z + \mu)^2\, e^{-\frac{z^2}{2}}\, dz = \sigma^2 \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} z^2 e^{-\frac{z^2}{2}}\, dz + 2\sigma\mu\, \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} z\, e^{-\frac{z^2}{2}}\, dz + \mu^2 \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} e^{-\frac{z^2}{2}}\, dz. \tag{19}$$

Now we can individually calculate each of the above integrals.

The first integral. Since

$$\bigl(-z e^{-\frac{z^2}{2}}\bigr)' = -1 \cdot e^{-\frac{z^2}{2}} + (-z)\, e^{-\frac{z^2}{2}} \cdot (-z) = z^2 e^{-\frac{z^2}{2}} - e^{-\frac{z^2}{2}},$$

the definite integral

$$\int_{\frac{d_i-\mu}{\sigma}}^{\infty} \Bigl(z^2 e^{-\frac{z^2}{2}} - e^{-\frac{z^2}{2}}\Bigr)\, dz = \Bigl[-z e^{-\frac{z^2}{2}}\Bigr]_{\frac{d_i-\mu}{\sigma}}^{\infty} = \lim_{z\to\infty} \bigl(-z e^{-\frac{z^2}{2}}\bigr) + \frac{d_i-\mu}{\sigma}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} = \frac{d_i-\mu}{\sigma}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}}.$$

Therefore

$$\sigma^2 \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} z^2 e^{-\frac{z^2}{2}}\, dz = \sigma^2 \frac{1}{\sqrt{2\pi}} \left[\frac{d_i-\mu}{\sigma}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} + \int_{\frac{d_i-\mu}{\sigma}}^{\infty} e^{-\frac{z^2}{2}}\, dz\right] = \frac{\sigma(d_i-\mu)}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}} + \sigma^2 \left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right).$$

The second integral. Using the equality ∫ z e^{−z²/2} dz = −e^{−z²/2}, we get

$$2\sigma\mu\, \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} z\, e^{-\frac{z^2}{2}}\, dz = \frac{2\sigma\mu}{\sqrt{2\pi}}\, e^{-\frac{(d_i-\mu)^2}{2\sigma^2}}.$$

The third integral, by the definition of the standard normal distribution function, equals

$$\mu^2 \frac{1}{\sqrt{2\pi}} \int_{\frac{d_i-\mu}{\sigma}}^{\infty} e^{-\frac{z^2}{2}}\, dz = \mu^2 \left(1 - F_{N(0,1)}\!\left(\frac{d_i-\mu}{\sigma}\right)\right).$$

After simple transformations (summing the three integrals in (19) and collecting the exponential terms, since σ(d_i − µ) + 2σµ = σ(d_i + µ)) we obtain the equality of the thesis. □
2σ2σ i= σ σ ∞ ∞ di ddii ∞ idiσ−µ∞ 2lim 2σ e = (−ze ) + . 2π −z −z −z 2 z→∞ 1 1 1 1 1 (1− (1− (1 (1 z→∞ σ 2After 2 dz 2 ) ∞ 2∞ 2σ ∞ σ 2 (1 22. 0 −µ) ∞ ∞ 2= 2 ∞ simple σz→∞ √ √ √ z→∞ 2−(d Therefore Therefore z e = e (−e Therefore z→∞ σ 2 2 the 2 2−z σ −(d −(d −µ) −µ) 21 −z 2the 2transformations 2we 2equality 2obtain 2√ −(d −µ) σ 2 2 After simple transformations After After simple After transformations transformations we obtain simple simple transformations equality we we obtain obtain the we equality obtain the the equality equality −(d −µ) −(d −µ) 2σ i i i −(d −µ) √ 2 2 (1−F d −µ √ √ √ √ √ −z −z −z −z −z −z i z e dz = ) z = e e dz = . (1 (−e (−e 1 1 1 1 1 1 1 1 1 1 2 . − F y f (d )) (y)dy − d f (y)dy = (1 i i i−z Using 2i − we d2i d− µ µ2 2π(1 (1− i µ −z −z − µwe d.fi∞ ∞∞ − ∞∞performing ∞ 2π −−z dsimple i and i −integral 2dd idz C˜i Lemma Ci˜)) i −µ 2i d di −µ transformations 22σ 22 .(− 2π 2π zthe dz = = (1 − FC˜i (di )) C˜i i y=fC˜lim (y)dy d2i )=+(1fdC˜− y∞ fsimple (y)dy − (y)dy . lim (y)dy F∞dµ i2σ After After After simple simple transformations transformations we we we obtain obtain obtain obtain the the the equality equality equality 2 2the 2) 2e . id−µ 2 2σ 2σ −µ Finally, equal to 2etransformations ˜2σ ee− Therefore = iσ2 (−ze √ √ √ √ √ √ √ N(0,1 2σ 2 dz zsimple ztransformations zf√ = = zequality )√ )third eequality = dz = e2is (−e 2After C˜ii (d C˜iAfter C zsimple dz = 2π(1 − F= (2σ(−e .F )) + e∞√ 2π 2π 2π 2π d(−e σ2π= ef˜ ˜the e√ .(−ze lim )eie2−µ) .idthe After we obtain the etransformations = )i simple +simple .+2dz ∞we ∞ i di (−ze id i 2−(d 2= simple transformations we obtain the equality N(0,1) i −µ σ σthe de −µ dd dtransformations −µ −µ −µ After obtain the equality y f y y f f y f (d (d (d )) )) )) (y)dy (y)dy (y)dy − − d − d (d f f (y)dy (y)dy )) (y)dy . . . − f (y)dy . 
== = (1(1 (1 −− F −C˜FF = (1 − F d(y)dy −µ −z After simple transformations we obtain the equality − µ 2 di −µ 22 After simple transformations we obtain equality i i i i σ d d −µ −µ d d d d ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ i i i i i i i nally obtain thesis of theorem. After transformations we obtain equality i i i i i −(d −µ) −(d −(d −µ) −µ) −(d −(di 2 2 C C C C C C C C C C C i i ∞ ∞ ∞ ∞ ∞ 2 z→∞ z→∞ σ σ z→∞ 2 2 2 2 2 2π 2π 2π 2π 2π 2π 2π 2π 2π 2π 2π σ √ √ √ √ σ σ σ i ii i ii i i ii i√σ 2 id(d 2 i i i (21) −z −z −z −z −z σ σ σ σ d d d d − µ d − µ − − µ µ d − − − µ µ µ d d − − µ µ 2σ −2d (1−F (1−F (d ) y f ) x f (d +x)dx = (1−F (d 2 2 2 2 (1−F (d ) x f (d +x)d ) +−z i ∞ di ddi i di ddi i di (−ze di e√ ˜2i i 2i2 . ∞ C˜σi ii iσσi1 C˜i −z 2= 2lim i ii i−(d i C˜i Ci2˜ii2)(2 22σ −(d −(d −(d ∞ ∞2π(1 2+ 2i−µ) 2−µ) 22F 22dz C˜2π(1 Ci˜ii 222−(d √ √ √ fi-∞∞∞ 2−z 2−z i(21) i−µ) i2−µ) i (( d 2σ −z ddddi− − − − µµ µ2σ µ)) − − −µ µµ µ µe0)) we Using Lemma 2 and performing simple transformations dz = = = 2π(1 dz dz − − = F = F+ 2π(1 − FFµN(0,1) zperforming e fi-z→∞ zz22222e2e2∞− zthe (C2√ e2π(1 e third (.(µdddddiiii− )) ei− )) + ed−(d )) + + ee(21 −µ) 2z −(d −µ) √ iiequal 22dz σ N(0,1) N(0,1) N(0,1) N(0,1) (20) 0 2 d.i.f i(21) −(d −µ) ∞ i −z d − − µ −(d −µ) ∞ 2 2 2 2 d −µ d d −µ −µ d d −µ −µ √ −z 2 − − sing Lemma 2 and performing simple Using transformations Lemma 2 and we simple transformations we fi√ −µ) i (1−F (d ) x (d ∞ i i Finally, integral is to 2 2 2 i 2 √ 2σ 2σ 2σ 2σ eieeintegral = −(21 iσ iσ i transformations ii we i= dσi(F µ d+ −id− µ ˜µ 2ithe −z dz After simple transformations weLemma obtain equality After simple transformations we the equality iF+ − µ − After simple equality i dz z2zzz−z zobtain ee−z eethe dz dz dz = = = 2π(1 2π(1 2π(1 2π(1 − − − − F F FFN(0,1) (d((− (to )) )) )) )) + + 22e σ= obtain σ σ....C.1˜2σ id µ N iσ iσC 22eσ 2simple 2√ N(0,1) N(0,1) N(0,1) third integral is equal Finally, the third is i+µ e i 2equ id− d+ −µ 22used 2 02σ Below we prove the lemma, which be to 
calculate Using Lemma performing simple transformations we fiz e dz 2π(1 − ( . )) dfiddfidiσ −µ −µ −µ −µ σ and σ σ dz = 2π(1 − F )) + e Using Using Using Lemma Lemma Lemma 2 2and 22and and performing performing performing Using simple simple simple transformations transformations transformations 2the performing we we we fi-Finally, transformations we fiThe second integral. i 2 2σ 2 2σ idσ iThe iwill nally obtain thesis of the theorem. N(0,1) z e dz = 2π(1 − F ( . )) + e 2−µ N(0,1) z e dz = 2π(1 − F ( . )) + e 2σintegral σ σ σ σ σ σ σ σ 2π d second integral. z e dz = 2π(1 − F ( . )) + e −µ N(0,1) N(0,1) i Finally, Finally, Finally, the the the third third third integral integral integral is is equal is equal equal Finally, to to to the third is eq ∞ ∞ σ ∞ ally obtainnally thesis of the theorem. nally obtain thesis of the theorem. di −µ After simple transformations we obtain the equality iiσσ−µ N(0,1) dσ σ σ σ σ σ d −µ 2 (20) (20) (20) Using Lemma 3 U U ∞ 2 σ σ 2 2 2 ˜ i 2σ−µ) σ σ σ thesis of the theorem. σ σ 2 −z −(d −(d −µ) . the expected value of a random variable T 1 − µ d ∞obtain ∞ −(d −µ) ∞ 2 2 √ √ √ nally nally nally obtain obtain obtain thesis thesis thesis of of of the the the theorem. theorem. theorem. nally obtain thesis of the theorem. ∞ ∞ σ i 2 2 i µ y f ˜ (y)dy (20) −µ di − µ2 −zσ i σ 2 i di − µ 1 di−z d1i − −dFµi+ −√−2d µ die− (20) (20) (20) (20) dµi2σi(∞2 −2d f1C˜i (y)dy. −z y2 f∞ (y)U − ˜(20) i 2σ 2 2 dz = i 2 −z2 2di−z C C ). (22) (20) i 2 ∞ ∞ ∞ i 2 2 2 2 2 2σ The second integral. The The second second The integral. The integral. second second integral. integral. 2 1dz We toe the). (19). U− (20) z prove e dzthe = lemma, 2π(1 −which FN(0,1) dz =(. d2π(1 (1− .− ))=+tod2π(1 e FN(0,1) )) z∞( e be dz .+return )) −µ √−µ+dFito −z1 −z −z e calculate e(20) (22) − ( N(0,1) dz =yUsi 1e = 1µµ µ (20) dformulation d√ Below will used calculate √z − d−(d d2 −z N(0,1) i −µ) id iσ i− −z2the i F i− diintegral. 
− d√ µ di −µwe dbe f1U −2d −µ di −µd i2π Below we prove the lemma, which will Below used we prove to lemma, will be used i −µ The The The The second second second second integral. integral. integral. i 2 dz i −µ 2 iscalculate 2dz 2 dz σ (11) C˜ σ(iintegral. σ∈√ 2= σ integral. √ ). ).).ei we (22) (22 (22 dz =6N(0,1) = 1− 1.1σ− F −N(0,1) Ftransformations, FN(0,1) (σd((−µ = σN(0,1) Ifwe C time of task completion i√ J,e σee2 eafter L EMMA 6which 2the 2σ The second 2π 2π 2˜zσ The second integral. iprove e dz = 2π(1 − F )) + σwe σcalculate σ3. Below we prove the lemma, which will be used to calculate ˜ The second σ d −µ d d −µ −µ simple ultim The second integral. d Below Below Below we we prove prove prove the the the lemma, lemma, lemma, which which which Below will will will be be be used used used the to to to calculate lemma, calculate which will be used to calculate N(0,1) the expected value of a random variable T . The second integral. i i i i i 2 σ σ σσ 2π3 we 2π2π i 2value σ2π 3 we e expected value of avalue random variable the variable expected T˜i2 . diσ−µ random variable T˜Wodecki (20)M.We (20) σ σσ σ get theLemma conclusion Using offinally The Le (20)finally Using i ˜.return z˙aejko, P. Rajba, and 2..W. of 2 .Using the expected of random ˜..˜ii26value toLemma the formulation (19). Using (20), (21) and (22), the lemma thethe the expected expected expected value value value ofof of a random aaa then random random the variable variable variable expected T˜T˜ii2TT of random variable T W.Bo Bo z˙aejko, P. Rajba, and M. Wodecki 6 6 6 6 We return to the formulation (19). Using We return (20), (21) to the and formulation (22), (19 i (20) Using Lemma 3 we fin The second integral. ˜ The second integral. The second integral. The second integral. 6 6 6 6 EMMA˜ 3. If Ci is the time∞∞(11) of task completion i ∈6J, We We We return return to thethe the formulation formulation formulation (19). (19). 
We Using return Using Using (20), to (20), (20), the(21) formulation (21) (21) and and (22), (22) (22 ofu( 6after simple transformations, we ultimately obtain theand thesis 2(19). dreturn − µto If Ci is the˜ time (11)of∞Ltask EMMA 3. If C˜i ˜is i2∈the J, 2time (11) of task J,2 ultimately L EMMAL3. to 2we −z2i ∈ −z icompletion 6 after ∞ transformations, 6 2 completion simple transformations, we after simple obtain the thesis of 6 ˜ ˜ σ 2 The second integral. ˜ 2 2 y(11) fof (y)dy =If (σ +µ 1J, FN(0,1) EMMA 3.3. 3. If C(1iis CCis is the the the time time (11) EMMA of task 3.completion completion completion Cii 2 J, is thethen i) time ∈ii ∈ ∈−J, J, (11) ofthe task completion i −z ∈=J, L EMMA LLEMMA L −zsimple − F (dtime (y − )task fC˜task (y)dy Using equality IfIf=CIf (11)(11) of task completion ,,ultimately we get y2obtain ze dz −e we ˜ iof ithe i is ˜time i )) Cd after after after simple simple simple transformations, transformations, transformations, after we ultimately ultimately transformations, obtain thethe the thesis thesis of we oo i )) i the lemma √ fC˜obtain (y)dy =thesis en thenLemma 3.= 2 , we σequality (1i − FC˜iC(d Using the ze 2 dz we = get ddii (y i then i ) fC˜i (y)dy d lemma the−e lemma 6 the − di i then then then then d 2π ∞ 2 Bull. Pol. Ac.: Tech. XX(Y 6 6 i ∞ ∞ di P. W. W. z˙z˙ejko, ejko, P.∞Rajba, Rajba, and and M. M. Wodecki ∞ 1 −(d thethe the lemma lemma lemma 2 ∞ 2 the2 lemma diWodecki −µ i −µ) Bo − −z −z 1 1 2 2 2Bo 2 ∞ ∞ d d µ − µ −z 2 σ 2 2 ∞) ∞ i 2√ 2∞e 2σ 2 2 = 2 2 − µ2 −(di −µ) z √ ∞ e1 2 +− µ ) 1− −µ)2 σ 2 (σ 6∞2F i˜ (y)dy 2 µ2 √ 2 = ∞∞y fC ˜ (y)dy = . −(d −z ∞ σ(−e µ + d i∞ −z 22 dz 1 = √2d−z y2 fC˜i ∞ (y)dy (σ + (σ 1 d− F 2π y2 f d22dz diσi− d2i d= − − µµ µ − z e−zy∞2−z (y)dy 2222e∞ ∞f2σ 2 i 2 y fC2˜i F (y)dy −ydifN(0,1) . i+ =2 (1 −i F ∞y fC ∞∞ ˜i2π 221 (y)dy 22= 2Ci + 22N(0,1) 2 µ e) 2 2σ 2i −µ i ))µ )2∞221 −z 1 d∞ 2− 2 .N(0,1) C˜i (d= C˜i (y)dy iz−µ d2i −µ √ −z 2∞ (y)dy = dz f 2π σ σ σ 2σ= √. 
2σ i∞ √ √ √ √ ˜ ˜ e (−e z dz ) = e −z −z −z −z y y y f f f y f (y)dy (y)dy (y)dy = = (σ (σ (σ + + + µ µ µ ) ) ) 1 − 1 1 − F − F F (y)dy = (σ + µ ) 1 F σ σ σ 2 2 2 2 2 2 2 C C di d ˜ ˜ ˜ ˜ d −µ d 2πσ µ σd −µ N(0,1) N(0,1) N(0,1) 22 i i y yyf ff(y)dy dC C 2 y 2dz 2 dz i i f ˜ (y)dy e 2 dz + µ2 dN(0,1) . √iσ+2σ √ √ (y)dy == = ezi zee√ fidz i −µ 22 ddz − −FFCC˜˜ii(d (dii)) )) d=i dd(1 (y (y− − −iFCdCdCi˜iiii))(d ffCC˜i˜i)) (y)dy (y)dydi y fC˜i (y)dyi − 2π2π ˜i CC ˜˜i i(y)dy Using the equality ze ze dz = =di −e −e we Cget get 2πσ σequality σσ Ci the iiz −µdz 2π ,,σwe 2π C˜i (y)dy = √ di i Using di −µ ddi id−µ −µ ii (21) σ i (21) 2dµ 2π σσ −(di −µ)d2i d d d d 2π 2π 2π i +d d d 2 2 i i i i − µ 2σ i i σ σ σ 2 2 −(d −µ) −(d −µ) Using Lemma i i simple transformations ∞ ∞ 2 i the ∞ Finally, integral + dperforming µ + dfi2σ 2µand 2σ µ2we −z2 to 1 equal 22. i2− i−µ 2σ 2−µ) 2 2 ∞ third ∞ ∞ −z2 is 21 2µ −(dii−µ) −µ)22 2 −(d −µ) −(d Using 2 1∞of 1−z22 22dz−z2(21) +∞ Proof. From definition function the 2σ ithe i i −µ) 1∞ ∞∞−z2∞ 2 2 2−(d −z −z −z −z2µ √ √µthe 11of 11e−z2−z 112i −µ) ed+i d e+2σ . −(d . + thesis + 2σ 2σ µ2√ + µ2πσ + − di2σi− − µ2 µµe−(d +22 ddistribution −µ 2σ 2σ √ √ e z dz + µ +2σ obtain of the theorem. i ∞ ∞ nally ∞ Lemma 2 and performing simple transformations we fi∞ ∞ ∞ 2 2 2 2 2 2 2 2 2 d −µ d −µ √ √ √ dz + µ (−e d .−µ 1dz 11z −zd−z 11 + µ 1−eµ2 −zdz 22 dz 22z ))third 2σ +2σ 2σ 2σ √ √ √ √Finally, isee−z equal i e integral √ √ zz+√ e√ e2πσ dzfollows = = e√ =1√ .. i22−µiµσe √ (−e e ee2σ 2σ2σC˜.,dd.(15) .−µ +2πσ + +ff √ ∞ the 2d i−z 2= di −µ2to di −µ 2π 2π 2 −z 1 2 2 2 2 2 2 −µ i random variable and (16) it that )) )) y y f f (y)dy (y)dy − − d d (y)dy (y)dy . . σ √ √ √ √ √ e e e z z z e e e dz dz dz + + + µ µ µ z dz dz dz e dz +2σ +2σ +2σ µ µ µ +2σ µ i ii ddii−µ −µ2 ˜˜ii ˜˜ii theorem. ii ii of the 2π ) 2 + + σµ () 2π FN(0,1) (σ nally Cobtain thesis C CC 2π 2π 2π 2π 2π 2π 2πσ 2πσ 2πσ 2πσ σ √ di2π σdi −µ ((22) −µ ddiei−µ −µ dz == d1 −µ d− di i−µ −µ ). 
1 − F σ σ i N(0,1) σ σ dProof. dii ddii the lemma, d −µ ∞ 2π 2π 2π 2π 2π σ Below wethe prove which will be usedfunction to calculate From definition of the distribution 2πσ σσ1of 2π σ2π σσ ∞of From −z2 µ di − 2 iσ function 2σσµ ∞ function roof. From the definition of the distribution Proof. of the of the the1 distribution the −(y−µ) −µ)2 (21) (21) 2 .the definition 2 −(di(22) 2 dz ˜ d − µ 2σ + d − µ √ 2 e ). = 1 − F ( −(d −µ) i i the expected value of a random variable T 2 2 ˜ 2 2 N(0,1) i 2 Proof. Proof. From From the the definition definition of of the Proof. the distribution distribution From the function definition function of of the of the the distribution function of the Proof. From the definition of the distribution function of the 2µ d − µ 2σ µ + d − µ − i be di −µ1 − d random variable C , (15) and (16) it follows that 2σ Below we prove the lemma, which will used to calculate nd nd performing performing simple simple transformations transformations we we fifii i i 2σ √ 2 22. i (y)dy = y y f e dy. ˜ ˜ √ 2 2 2 2 + µ ) F ( ) + e = (σ 2 2 2 ˜ σ −(d −(d −(d −µ) −µ) 2 µ ndom variable Ci , (15)˜and random that variable Ci ,˜ (15)the (16) it follows and (16) it + follows that 2πN(0,1) )(19). CiFinally, 2σ id i i−µ) third integral is equal to Finally, the third integral is equal to Weµ return to2 F the Using (20), (21) and (22), √ ) 1 − ( + µ ) 1 − F ( . + e ) = (σ = (σ d d d − − − µ µ µ 2σ 2σ 2σ µ µ + µ + d + d − d − − µ µ − µ σ formulation i i i i i i i N(0,1) N(0,1) ˜ ˜ 2 2 2 2 2 2 2 2 σ d d 2π 2 22 . 2πσ ˜ i variable iand (16) it random random variable variable C C random C , (15) , , (15) (15) and and and (16) (16) (16) it it follows it follows follows that that that , (15) follows that random variable C ˜ 2σ 2σ 2σ i i i i . the expected value of a random variable T of of the the theorem. theorem. √ √ √ 2 + + µ + µ µ ) ) ) 1 − 1 1 − F − F F ( ( ( µ ) 1 − F ( ) ) ) + + + e e e = = = (σ (σ (σ = (σ σ σ EMMA 3. 
If C is the time (11) of task completion i ∈ J, L 2πσ 224 Bull. Pol. Ac.: Tech. 65(2) 2017 ∞ ∞ i i −(y−µ) N(0,1) N(0,1) N(0,1) N(0,1) ∞ after simple transformations, we ultimately obtain the thesis of 2 2 1 ∞ 2 ∞ ∞ ∞ σσ(19). Using (20), σ 2πσ 2πσ 2πσ 122 We−(y−µ) return (21) and (22), −z −z − µµ TσHEOREM ddii − ∞∞y2 f ˜ (y)dy ∞∞y2−(y−µ) −(y−µ) ∞11of 2σ 2to the formulation √ 2= 1∞ edy. Introducing, the same in the = proof Lemma 22σ substitutions, then y2 f ∞ 21 2 dy. ∞2the ∞2 2fas22dy. −(y−µ) −(y−µ) −(y−µ) 3. If the task completio Unauthenticated 2σ C 2lemma 2 1 √ √ ˜ (y)dy = (y)dy y y e e 1 1 iis they time √ √ e e ). ). (22) (22) dz dz = = 1 1 − − F F ( ( ˜ ˜ EMMA 3. If C (11) of task completion i ∈ J, L 2 2 2 2 2 2 2 2 N(0,1) N(0,1) C C ibe di y yy d will 2πe dee2σ 22σ2σy22dy. ddii−µ −µ= ee the the lemma, lemma, which which used used to calculate calculate we ultimately obtain the thesis of √√ √ √ (y)dy =to = = (y)dy y yyfC˜be ff(y)dy fi Cdy. e 2σ 2∞ transformations, dy. Download ∞ i iwill σσ random 7:48 PM di di we di 2π 2π 2π simple ˜˜i(y)dy ˜dy. get 2πy after |∞4/25/17 i di − variables normally distributed 2Date iµ σσ 2 22i..CCi= −z2 then σ di2d)di i 1 − d d2ifdd˜i i(y)dy d 2π 2π 2π 2π ˜ ˜ y (σ + µ F i i 2 2 e of of aa random random variable variable T T the lemma Ci same 2 dz Introducing, the in the Lemma substitutions, ii theas √ z ethen f3. (y)dy = task Introducing, T HEOREM If(20), the times 22 1ofN(0,1) σreturn 1, 2,completion .and .d.T ,times n), the value of C˜itask ntroducing, as in proof of∞proof Lemma 2 −(y−µ) substitutions, theWe same as in of Lemma 2 2ysubstitutions, the ∞the i i −µ T HEOREM If the completion HEOREM 3. expected Ifaretheindependent task comp ∞ dsame We to the theproof formulation formulation (19). (19). Using (20), (21) (21) (22), (22), d−z 2π and are independent i Using 2 2 1 3. dreturn − µ to σ
□
□
2 22 jba, jba,and and M. M.M. Wodecki Wodecki 22P. 2 −z2 = 2−e −zRajba, −z 222 dz Rajba, and Wodecki W.−z Bo ejko, M. Wodecki −z Using equality ze2and 2−z 2 2 , we get 2the 22 −z 2z˙−z the −z2 −z −z −z 2ze 2 −z −z ∞ Using the equality ze dz −e ,−z get 2 ,−z 22ze 22−e 2 = 2we Using the equality dz = ,2we we get −z −z −z2 Using equality dz = −e get −z −z −z2 −z −z 2 2 2 , we get 2 2 2 2 2 2 (y)dy 2ze ze 2dz 2,−e 2get Using the equality Using the equality dz = −e , we get ze dz = −e Using Using the the equality equality ze ze dz = = −e , , we we get ng the equality dz = −e we get ˜ −z −z −z −z 2 2 2 2 2 2 C 2i ) dz 2 dz = −e 2 , we get ing the equality ze dz = , we get 2−e 22 ,, we i∞= (1 − F (d )) (y − d f (y)dy Using the equality Using the equality ze dz = −e we get ze Using the equality ze = −e get −z −z −z −z 2 ˜ ˜ ∞ i 2 2 2 2 −z 2∞ C −(di−z −µ) ze −z 2C W.2the Bo z˙= ejko, P.222Rajba, iequality i2and the e22equality Using ze dzequality −e ,ze we get dzM. ∞we −z 22−z −z 12 1 2−µ) 1Wodecki fthe ∞ 2 get1 −(d −µ) Using equality ze = = −e ,−e ,2we we get get 22 = ,get −e 2 dz Using −(d Using 12 12 1= 2dz 2 √ 22−µ) 22 equality 2idz 2−(d ∞−z ∞ ∞ i −µ) (y − di )the (y)dy ∞ −z equality Using the ze −e ,−z∞−z we ze −e = −z 2 dz 2−(d) −µ) 12d−ziz22√ 1i= 2σ 2 , .we ˜1 −(d2i−(d −µ) −(d −µ)2 −z get ∞ −(d −µ) 1 1 21 222 ∞ 22√ e dz e (−e 2i1 22 1 2 C 222 i= ∞ 2zdy i 2 i 2 2 ∞ −z −z −z −z −z −z −z −z 1 1 1 1 1 1 1 1 2σ −z −z 2 −(d −µ) 1 1 1 2 √ √ √ 2 2d ∞ d −µ z e dz = ) = e . −(d −µ) (−e −(d −µ) −z −z 2σ 22 ee122 22 i1 . 1zget we get 22 1 2σ √ √ √2 e)2−z −z −z i2−z ze= dz = e222−(d .√21 e 2σi 21 . e−µ 2√ di −µ 2 2σ 2 .2−z √ √ √ 222σ ∞√ dz = = e (−e −z2 1∞2zeget 1 1 1 −(di −µ) 1∞ 2= 1dz 1 i,,zwe d2iz −µ 2(−e 22ii2.) 2−zdz 2−z √ 2σ 2σ 2π 2π 2π −(d −µ) −µ)= 2d2we ∞ 2√ 2 2 2d)i) ∞ ∞ √ √ √ √ √ √ √ √ dz = z dz = e −z −z (−e (−e z dz = = ) ) = = e e . . (−e (−e √ √ −µ 2 2 e ) = e . 
(−e i i d −µ .dz σ 2 dz = = −e −e , , we get −(d −(d −µ) −µ) 2 −z −z −z −z 2 1 1 1 1 1 1 i 2 2 2 2 di −µ −µ 2σ of 2π 2π 2π Stable scheduling single machine with probabilistic parameters i 2 2 2 2 d 2 2 √ √ √ 2σ 2σ i i 2σ σ 2 2 z e dz = ) = e . (−e d −µ d −µ ∞ ∞ d d −µ −µ i √ √ √ √ √ √ −z −z −z −z −µ √ √ √ z e dz = ) = e z . e dz = ) = e . (−e (−e 1 1 1 1 1 1 2π 2π 2π z e dz = ) = e . (−e f (y)dy . σ 2π 2π 2π i i 2 2 −(d −µ) −(d −µ) i i di2−µ −µ dd−µ − 2(−e 2= ddi −µ (y fi C)) (y)dy Using dz σ −µ 2 2π i = C 2σ 22π 2σ 2 σσ − iσ2π 2π 2π get 2π1 2π d2σii−µ −µ d2σ 2π 2π 2π d) −z −z −z2π Stable scheduling of single machine wi 22i e= √f= 1ez√ 1ze 1(−e 1diσ−e e˜d˜ii(d dz e2π dz = eiddσ−µ )= .2π .−z,2we (−e σ (y)dy (y)dy .12π 1√ 2f2C 2√ σ√ i −µ 22π 2 2 22π 2σ −µ ˜dz ∞ ∞ ∞ σC σ i2π √ √ √ √ √)d2π z˜zi2π ey√ dz = )2the )σ)ziσequality == ee√ (−e (21) 2π C˜σii2−µ ii−µ −(d −(d −µ)= 2π 2π 2π 2π 2 ..(21) 2 di −µ di√ −µ 2π di F 2π 2π i2i−µ) .2σi2σ− 2 2 i 2σ 2σ d −µ d −µ σ σ d d −µ −µ √ √ √ σ √ √ 1 1 z e dz = e dz = e . ) = e . (−e (−e σ 2 2 ∞ ∞ i i 2π 2π 2π 2π i i d −µ σ (21) di f 2π di 2πd −µ ddi i−µ −µ σσ (21) −(d −(d −µ) −µ) 2 2 2 2 tions we fiσ 2π 2π 2π 2π 2π 2π i i (y)dy − d (y)dy . 
(21) (21) 2σ −z i= i (21) di −µ 11 ..the third 1we 1˜fiσ integral ˜i 2σ i√ √ 2π = eeCFinally, C −(di −µ)2Stable scheduling of single 2π−z2σto 2π 2π −z2 ∞ diσ−µ (21) is (21)(21) i)) fiwe fi-−µσσ−z (21) (21) machine with 222πis σ to 1σσequal 1 2π 1 (21) 2σ 2 ))∞Finally, 2σ 2σ Finally, the third equal √ √ simple √mma ddii−µ σ d2π = = eintegral eFinally, ..the (−e (−e ansformations fi-the (21) (21) i we the third integral isthird equal to 2π Finally, third integral is equal to In If the times of the tasks are independent random 2 2 (21) (21) 2σ 2 execution 2 and performing transformations we fi√ √ √ Finally, the third integral is equal to integral is equal to Finally, Finally, the the third third integral integral is is equal equal to to z e dz = ) = e .(21)Stable (−e Finally, the third integral is equal to d d −µ −µ σσ Finally, Finally, the third integral is equal to If the times of the tasks execution are independent random ii the (21) 2π 2π 2π 2π fiFinally, the third integral is equal to d −µ we scheduling of single machine the third integral is equal to Finally, the third integral is equal to Finally, third integral is equal to y)dy − d f (y)dy . ∞ i −z2 di −µ i integral σ˜σi ∞ C If the times of the task 1 − µ d 2π 2π 2π y, the third Finally, is the equal third to integral is equal to i (21) variables with a normal distribution( p˜ip˜∼ N(pi , a · p), i 2 J J), we ∞ fingthesis simple 2 σ 2to i ), i ∈ ), ithe Finally, Finally, the the third2∞(21) integral integral is2is equal equal todz −zfi 12 we − µ− dF in theorem. Finally, 2 σ of dtransformations variables with a normal distribution i N(0,1) −z −z ∞ ∞third 12e1−µ 2− i » N( pi, a ¢pirandom 2 2√ of ∞∞ e∞1Finally, ).equal (22) − (µµ Inwth If the the tasks execution are(scheduling independent the is equal third integral is (21) (21) −zintegral −z −z −z 1N(0,1) − −times µto d= d(22) 1121√ − µ µ(µdµdi i− d1dthe −z 11∞ − µF dd1ito d−z ). 
(22) − F ( iµ i when ∞d1third ∞ id iµ ∞ 2 to calculate 22 = variables with a machine normal d Stable Stable Stable scheduling scheduling of of of single single single machine machine √ −z i−z22 eedz ). dz = 1 − F (21) √ ). (22) dz = − ( −z 1 1 − − µ d d − i W we then calculating the value of the function W σ i to to −µ 2 2 2 2 N(0,1) ate rem. 2π 2 ∞ ∞ i i i 1 2 N(0,1) 2√ei√ 2d √ √ e e ). (22) ). (22) dz = 1 − F ( dz = 1 − F ( then when calculating the value of the function W and W we e e ). ). (22) (22) dz dz = = 1 1 − − F F ( ( d −µ √ ). (22) dz = − F ( culate d −µ σ σ culate ∞ ∞ −z −z 1 2 2 1 1 − µ − µ d i 2 2 N(0,1) N(0,1) N(0,1) N(0,1) If the times of the tasks execution are independent random 2π N(0,1) i 2 2 variables with , a · p(22) J), we wi 21 − e2π (22) dz −z = F= (FFN(0,1) i). di √ −µ ddi −µ −µ √ √ aenormal al lsimple is isprove equal equal to to be used σ dto −µcalculate e−zFfi). (22) ). ithen dz = −= dzµ=distribution( 1 − FN(0,1) ( p˜i ∼ N(p transformations 1which 12π − − µµ( µ d(d().ithird i ), i ∈calculating 2π e√ 1di1i −µ − .σ(22) σ√ 2π N(0,1) ∞ ∞(22) iσ 2dz 2 (22) N(0,1) σ σ σ1we σ d ithe 2iddii−µ 2( σ σ when th 2π −µ d −µ 2π −z −z √ 1 − − d d 2π eculate will be used to calculate e e ). ). (22) dz = − dz 1 − F ewe iσ−µ lemma, Finally, the integral is equal to 2 2 will use Theorem 2 and 3. i i i Cas σ ((N(0,1) σ 2π σF σσ √ will use 4the 5.distribution( √√ ee e N(0,1) ).). (22) dz = == 11− − FN(0,1) σσ ). 2π 2πTheorem di −µ 2 . di −µ dd − 2π N(0,1) 2dz σ µ with aof normal p˜are ∼independent N(p 1, ai· W pirandom i ∈ J), w σ− F ddi i−µ −µ we then when calculating value of theσare function √ σσ return µbe e 2(22) (22) ).and (22) 1σ − FN(0,1) (σ dz =variables 1and (ofof σ (20), i independent m. 2),random to the (19). 
(21) (22), ii − 2π 2π 2 .formulation N(0,1) I3 If If If thethe the times times times the the the tasks tasks tasks execution execution execution are independent random ˜dz iF dWe dσ −µUsing σ (( will σ calculate 2π 2π a, te which used to willWiuse Theorem 2 and W We return to the formulation (19). Using (21) andand (22), Using ed value of a random variable T iii−µ i(20), − − µ µ d σσd ). ). (22) (22) F We return to the formulation (19). Using (20), (21) and (22), σ σ We return to the formulation (19). (20), (21) (22), ∞ 2π 2π i 2 N(0,1) N(0,1) etion i ∈ J, We return to the formulation We return (19). to Using the formulation (20), (21) and (19). (22), Using (20), (21) and (22), σ σ We We return return to to the the formulation formulation (19). (19). Using Using (20), (20), (21) (21) and and (22), (22), −z 1 − µ d We return to the formulation (19). Using (20), (21) and (22), i W we then when calculating the value of the function W will use Theorem 2 and 3. Case C i ). ). (22) (22) dz dz = = 1 1 − − F F ( ( 2 1ii), 2J), after simple transformations, we ultimately obtain the thesis of σ σ We return to the formulation (19). Using (20), (21) and (22), J, ˜ N(0,1) N(0,1) variables variables variables with with with a normal a a normal normal distribution( distribution( distribution( p ˜ p p ˜ ˜ ∼ ∼ ∼ N(p N(p N(p , a , , · a a p · · ), p p ), i ∈ i i ∈ ∈ J), J), we w w We return to the formulation (19). Using (20), (21) and We (22), return to the formulation (19). Using (20), (21) and (22), ˜ We return to the formulation (19). Using (20), (21) and (22), i ∈ J, 2 ˜ i i i i i i i . m variable T i ∈completion J,toafter simple transformations, we ultimately obtain the thesis We return to the formulation (19). 
Using (20, 21) and (22), Case B.((problem: 1jd , c¢d √(21) eand ).of (22) dz = 1of − F ∼ N(d , · d )| w T ) Case B (problem: 1| d i » N(d after simple transformations, we ultimately obtain the thesis of ic i)j∑w iT i )) itransformations, inde turn the formulation We return to (19). the Using formulation (20), (21) (19). and Using (22), (20), (21) and (22), sk i ∈ J, ∑ σ σ after simple transformations, we ultimately obtain the thesis of N(0,1) i i i i i d −µ ˜ We We return return to to the the formulation formulation (19). (19). Using Using (20), (20), (21) and (22), (22), which will be used to calculate after simple after simple we ultimately transformations, obtain the we thesis ultimately of obtain the thesis after after simple simple transformations, transformations, we we ultimately ultimately obtain obtain the the thesis thesis of of r simple transformations, we ultimately obtain the thesis of i will use Theorem 2 and 3. C J, A If C isreturn the time (11) of ultimately taskwe completion iσthe ∈ J, ,i 3. σ(20), iafter thetransformations, lemma 2π(21) We to the formulation (19). We Using return (20), toobtain the formulation and (22), (19). 
Substituting z = (y − μ)/σ in each integral over the density f_C̃ᵢ of C̃ᵢ ∼ N(μ, σ), we obtain

∫_{dᵢ}^∞ f_C̃ᵢ(y) dy = ∫_{(dᵢ−μ)/σ}^∞ (1/√(2π)) e^{−z²/2} dz = 1 − F_{N(0,1)}((dᵢ − μ)/σ),   (20)

∫_{dᵢ}^∞ y f_C̃ᵢ(y) dy = ∫_{(dᵢ−μ)/σ}^∞ (σz + μ)(1/√(2π)) e^{−z²/2} dz = (σ/√(2π)) e^{−(dᵢ−μ)²/(2σ²)} + μ[1 − F_{N(0,1)}((dᵢ − μ)/σ)],   (21)

∫_{dᵢ}^∞ y² f_C̃ᵢ(y) dy = (σ² + μ²)[1 − F_{N(0,1)}((dᵢ − μ)/σ)] + (σ(dᵢ + μ)/√(2π)) e^{−(dᵢ−μ)²/(2σ²)},   (22)

where (22) uses integration by parts, ∫ z² e^{−z²/2} dz = −z e^{−z²/2} + ∫ e^{−z²/2} dz, together with lim_{z→∞} z e^{−z²/2} = 0. Using (20), (21) and (22), after simple transformations we ultimately obtain the thesis of the lemma. □
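As a sanity check (not part of the original derivation), the closed forms (20)–(22) can be compared against direct numerical quadrature of the defining integrals. The parameter values μ = 10, σ = 2, dᵢ = 11 below are illustrative assumptions of ours, not data from the paper:

```python
import math

def phi(z):  # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):  # standard normal cdf F_N(0,1)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# illustrative (hypothetical) parameters of C~_i ~ N(mu, sigma) and due date d_i = d
mu, sigma, d = 10.0, 2.0, 11.0
a = (d - mu) / sigma

# closed forms (20)-(22)
I0 = 1 - Phi(a)
I1 = sigma * phi(a) + mu * (1 - Phi(a))
I2 = (sigma**2 + mu**2) * (1 - Phi(a)) + sigma * (d + mu) * phi(a)

def f(y):  # density of N(mu, sigma)
    return phi((y - mu) / sigma) / sigma

# midpoint-rule quadrature of the defining integrals on [d, mu + 12*sigma]
n, hi = 100_000, mu + 12 * sigma
h = (hi - d) / n
mid = [d + (k + 0.5) * h for k in range(n)]
J0 = sum(f(y) for y in mid) * h
J1 = sum(y * f(y) for y in mid) * h
J2 = sum(y * y * f(y) for y in mid) * h
```

The truncation point μ + 12σ makes the neglected tail mass negligible relative to the quadrature tolerance.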
Theorem 3. If the task completion times are independent random variables normally distributed p̃ₖ ∼ N(pₖ, a·pₖ) (k = 1, 2, …, n), then the expected value of the random variable T̃ᵢ² equals

E(T̃ᵢ²) = [1 − F_{N(0,1)}((dᵢ − μ)/σ)] · [(σ² + (dᵢ − μ)²)(1 − F_{N(0,1)}((dᵢ − μ)/σ)) + (σ(μ − dᵢ)/√(2π)) e^{−(dᵢ−μ)²/(2σ²)}],

i = 1, 2, …, n, where μ and σ denote the mean and the standard deviation of the completion time C̃ᵢ.
Proof. From the definition of the expected value

E(T̃ᵢ²) = ∫_{−∞}^∞ x² f_T̃ᵢ(x) dx = ∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx − F_C̃ᵢ(dᵢ) ∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx = (1 − F_C̃ᵢ(dᵢ)) ∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx.

It remains to calculate ∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx. Introducing the substitution y = dᵢ + x we get

∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx = ∫_{dᵢ}^∞ (y − dᵢ)² f_C̃ᵢ(y) dy = ∫_{dᵢ}^∞ y² f_C̃ᵢ(y) dy − 2dᵢ ∫_{dᵢ}^∞ y f_C̃ᵢ(y) dy + dᵢ² ∫_{dᵢ}^∞ f_C̃ᵢ(y) dy.

Using Lemma 3 we finally obtain the conclusion of the theorem. □
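The combination of the three integrals performed in this proof can also be checked numerically. The sketch below (with illustrative parameters of our own choosing) verifies that ∫₀^∞ x² f_C̃ᵢ(dᵢ + x) dx equals the closed form (σ² + (μ − dᵢ)²)[1 − F_{N(0,1)}((dᵢ − μ)/σ)] + (σ(μ − dᵢ)/√(2π)) e^{−(dᵢ−μ)²/(2σ²)} obtained by combining (20), (21) and (22):

```python
import math

def phi(z):  # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def Phi(z):  # standard normal cdf F_N(0,1)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# illustrative (hypothetical) parameters: C~_i ~ N(mu, sigma), due date d_i = d
mu, sigma, d = 10.0, 2.0, 11.0
a = (d - mu) / sigma

# closed form of int_0^inf x^2 f_C(d + x) dx from combining (20)-(22)
closed = (sigma**2 + (mu - d) ** 2) * (1 - Phi(a)) + sigma * (mu - d) * phi(a)

def f(y):  # density of N(mu, sigma)
    return phi((y - mu) / sigma) / sigma

# direct midpoint quadrature, truncated 12 standard deviations past the mean
n = 100_000
upper = mu + 12 * sigma - d
h = upper / n
quad = 0.0
for k in range(n):
    x = (k + 0.5) * h
    quad += x * x * f(d + x)
quad *= h
```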
Case B (problem 1|d̃ᵢ ∼ N(dᵢ, c·dᵢ)|∑wᵢTᵢ). We now assume that the deadlines of task completion are independent random variables normally distributed (d̃ᵢ ∼ N(dᵢ, c·dᵢ), i ∈ J). The tardiness of a task i ∈ J,

T̃ᵢ = { Cᵢ − d̃ᵢ, if Cᵢ > d̃ᵢ,
      { 0,        if Cᵢ ≤ d̃ᵢ,       (23)

is then also a random variable.
Lemma 4. The cumulative distribution function of the tardiness T̃ᵢ is

F_T̃ᵢ(x) = { 1 − F_d̃ᵢ(Cᵢ − x), if x > 0,
          { 0,                if x ≤ 0,       (24)

and its density function is

f_T̃ᵢ(x) = { f_d̃ᵢ(Cᵢ − x), if x > 0,
          { 0,             if x ≤ 0.       (25)

Proof. Analogous to the proof of Lemma 1, this time starting from (15) and (16). □

Similarly as in Case A, we will now prove two theorems allowing us to calculate the expected values of the random variables T̃ᵢ and T̃ᵢ².
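Formula (24) can be illustrated by comparing it with the empirical distribution of simulated tardiness values T̃ = max(0, Cᵢ − d̃ᵢ). The completion time and due-date parameters below are hypothetical, chosen only for the illustration:

```python
import math, random

def Phi(z):  # standard normal cdf F_N(0,1)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# illustrative (hypothetical) data: completion time C, due date d~ ~ N(mu, sigma)
C, mu, sigma = 12.0, 10.0, 2.0

def F_tardiness(x):
    # formula (24): P(T~ <= x) = 1 - F_d(C - x) for x > 0, and 0 for x <= 0
    return 1 - Phi((C - x - mu) / sigma) if x > 0 else 0.0

# simulate tardiness T~ = max(0, C - d~)
rng = random.Random(7)
n = 200_000
sample = [max(0.0, C - rng.gauss(mu, sigma)) for _ in range(n)]

def ecdf(x):  # empirical distribution function of the simulated tardiness
    return sum(t <= x for t in sample) / n

gaps = [abs(F_tardiness(x) - ecdf(x)) for x in (0.5, 1.0, 2.0, 4.0)]
```

Note that T̃ᵢ has an atom at zero of mass 1 − F_d̃ᵢ(Cᵢ), which the empirical distribution captures automatically; the check above therefore only probes (24) at points x > 0.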
Theorem 4. If the deadlines for the completion of tasks are independent random variables normally distributed d̃ᵢ ∼ N(dᵢ, c·dᵢ), then the expected value of the tardiness (23) equals

E(T̃ᵢ) = (Cᵢ − μ) F_{N(0,1)}((Cᵢ − μ)/σ) + (σ/√(2π)) e^{−(Cᵢ−μ)²/(2σ²)},

where μ and σ denote the mean and the standard deviation of d̃ᵢ.

Proof. Similar to the proof of Theorem 2, this time with the use of Lemma 2. □
2 ..i √−2d expected value of2 ya2πrandom 2σ i yyiffiC˜˜dd(y)dy i(d i i+ ∞∞C˜d ˜C (1−F fC+ 2σ ˜˜iii ∼ C˜ix C˜dom C˜− C˜+ C2i˜C 2−2d ˜i (y)dy ˜i (d −z dii (1−F −2d 2 variable dfiifdCdd2Ci− µd∞C∞˜idi )∞d−(d iy i fi ˜i i −2d i+ (y)dy (y)dy dϑidistributed −2d + ffi˜Ci2σ ˜i˜ (y)dy. dom dom variables variables variables normally distributed distributed d2˜ dare dT ∼ ∼ N(d N(d N(d cii,,·ccFd··i ), ddϑii2irandom ), ), then then then thethe theσ ϑ C di −µ) dC ∞ ∞ i di = ∞i )y fnormally ˜d˜iiii(y)dy ˜˜idiid(y)dy. i )) e iyiii 2σ iid−2d i(y)dy i(y)dy. C Ci σ 2 f .˜C(y)dy. iid d.i C (ϑ 2σ µi ,2π d−2d d+ Theorem tasks independent i σ i 2π f− (y)dy dfiiyifi2˜if˜fi−µ) fC˜+ (y)dy. (y)dy iiii i2d+ ifi C i+ id i ided i) √ 2y 2+ Proof. i i dd2 0(y)dy. dthe N(0,1) i 2π(1 .obtain dz = F ( )) .22σ − 2i diof C − 2 di 5. If i deadlines y y (y)dy (y)dy f (y)dy. −2d d˜iid d+ C Ci −(d ∞ ∞−2d d d ˜ ˜ N(0,1) i i i σ i i 2 2 2 2 i i the equality i i i i C C C C d − µ − µ √ √ √ ˜ ˜ ˜ . eµ the ϑexFTheorem. −variable 2 y2σσ (y)dy + yTheorem. fC˜i (y)dy + di (ϑ di Using −2d −2d Using Lemma 3 weget finally get the conclusion of ifid ifi C˜di(y)dy. 2π i(20) iLemma 2, c ¢d2), i )fnormally i 2e ˜idiifinally expected expected expected value value value ofof of a random aaϑσ random random variable variable Ti TT+ i ofof N(0,1) σdget ϑi2 d˜ » N(d 22. (20) ithe ∞ ∞+ C˜i (y)dy. 2C 20) Using Lemma we finally conclusion didi ie3 3 ddi i the we get conclusion Theorem. variables distributed then i i i 2σ σ i ϑ i Using Lemma 3 we finally get the conclusion of Theorem. i i P ( . )) (20) Using Lemma 3 we finally conclusion of Theorem. f f (y)dy. (y)dy. − 2 − 2 σ −+ 2π 2π 2πi √ ˜˜ii 22Lemma di finally dwe dTheorem. di Theorem. ∞get ∞of N(0,1) (20) Using 3 σfinally we finally Using get Lemma the conclusion 3conclusion finally of of Theorem. Using Using Lemma Lemma 333−(d we we finally get get the the conclusion conclusion of Theorem. i of i the conclusion C C Using Lemma 3 we get the conclusion Theorem. 
2 √ 2√get the ϑ e F (ϑ ) e − 2 ˜ (y)dy + + d d f f (y)dy. (y)dy. Using Lemma 3 we finally get the conclusion of Theorem. i i 2 σ N(0,1) (20) 2 Using Lemma 3 we finally get the conclusion of Theorem. Using Lemma 3 we finally conclusion of Theorem. ))y)dy Using Lemma we finally get the of Theorem. ˜ ˜ −µ) √ + µ F (ϑ ) + 2C e d d □ pected value of a random variable T i i 2 2 2 2 2 2 C C i i i i i 2 2 2 2 2 2 i N(0,1) i i d − µ d − µ ϑ ϑ ϑ ϑ ϑ ϑ fC˜i of (y)dy +d fC˜i (y)dy. σ σσ2π − −−i ii −2d (20) Lemma 3 we Using finally Lemma 3 we conclusion finally get of the Theorem. conclusion Theorem. i Using i i of y 2σ µ2π µµ σσ2πϑ 2 2σ2σ i ii 2 Pro Using 3e3get we finally finally get the the conclusion conclusion Theorem. Theorem. dLemma 2the PP iLemma i Lemma 2π 2σ ( integral. . get )) d+ 0) (20) Using 3we we finally get Using the conclusion Lemmaof Theorem. finally geti Fthe conclusion of ϑTheorem. σ√σ√ 22−− 2 22 + µ FN(0,1 di3ofwe dF ond 1) √√ √ √ √√ √ e−ee−− ϑ + Fi N(0,1) (ϑ(ϑ (ϑ e)iiee+2 2C − −+2i+ i )2ii) iϑ N(0,1) N(0,1) σ he he conclusion conclusion of of σTheorem. Theorem. √ + µ F)N(0,1) (ϑ e 2 i i 2 2π2π ϑ 2π 2π 2π 2π 2π 2π 2π σ (ϑ−i ) i , ally ally get get the the conclusion conclusion of of Theorem. Theorem. Bull.Tech. Pol. Ac.: Tech. XX(Y) 2016 −2Ci µF2 (ϑ(ϑ F2π i ) i+C Bull. Pol.Pol. Ac.: XX(Y) 2016 N(0,1) (20) Using Lemma 3Ac.: we finally get the conclusion of Theorem. +2016 µ N(0,1) FN(0,1) ) +i2C e 2 222 Bull. Pol. Ac.: Tech. XX(Y) 2016Tech. Bull. Tech. XX(Y) 2016 i√ Bull. Pol. Ac.: Tech. XX(Y) 2016 Bull. Pol. Ac.: XX(Y) Bull. Bull. Pol. Pol. Ac.: Ac.: Tech. Tech. XX(Y) XX(Y) 2016 2016 Bull. Pol. Ac.: Tech. 65(2) 2017 Bull. Pol. Ac.: Tech. XX(Y) 2016 ϑi ϑϑii 2016 225 −2Ci µFN(0,1 Bull. Pol.Bull. Ac.: Pol. Tech. XX(Y) 2π Bull. Pol. Ac.: Tech.2016 XX(Y) 2016 2016 Bull. Pol. Ac.: Tech. XX(Y) Ac.: Tech. XX(Y) σ σ σ 2 2 22 (ϑ ) +C F − − − Sinc Bull. Pol. Ac.:Bull. Tech. XX(Y) Bull. 2016 Ac.: Tech.2016 XX(Y) 2016 Ci −µ −2C µF (ϑ ) 2, 22 Bull. Pol. Pol. 
Ac.: Ac.:Pol. Tech. Tech. XX(Y) XX(Y) 2016 √ √ √ i i i + + + µ µ µ F F F (ϑ (ϑ (ϑ ) + ) ) + + 2C 2C 2C e e e N(0,1) N(0,1) i Unauthenticated i i i i i i where ϑ = . N(0,1) N(0,1) N(0,1) i Pol.σ Ac.: Tech. Bull. Pol. Ac.: Tech. XX(Y) 2016 Bull. XX(Y) 2016 2 2π2π C −µ 2π i −2Ci µFN(0,1) (ϑi ) +C (ϑ ,ϑi = Download Datei |F4/25/17 7:48 Bull. ull. Pol. Pol. Ac.: Ac.: Tech. Tech. XX(Y) XX(Y) 2016 2016 i ) PM N(0,1)where Ci −µ σ . Since Bull. Bull. Pol. Pol. Ac.: Ac.: Tech. Tech. XX(Y) XX(Y) 2016 2016 3). where ϑi = . Tech. Bull. Pol.σAc.: XX(Y) 2016 23 2(use 2 Proof. Similarly, as in the theorem of lemma S −2C −2C −2C µF µF µFN(0,1) (ϑ(ϑ (ϑ +C FFN(0,1) (ϑ(ϑ (ϑ i ) i+C i)) +C i ) ii)), ,, N(0,1) N(0,1) i FiiN(0,1) where ϑi = Ci −µi . ii N(0,1) Proof. Similarly, as in th
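The normal-deadline closed forms for E(T̃ᵢ) and E(T̃ᵢ²) stated above can be checked against direct numerical integration of E((Cᵢ − d̃ᵢ)ᵏ · 1{d̃ᵢ < Cᵢ}). A minimal Python sketch (function and variable names are ours, not from the paper; µ and σ denote the deadline's mean and standard deviation):

```python
import math

SQRT2PI = math.sqrt(2.0 * math.pi)

def std_norm_cdf(t):
    """F_{N(0,1)}(t) via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def std_norm_pdf(t):
    return math.exp(-0.5 * t * t) / SQRT2PI

def tardiness_moments(C, mu, sigma):
    """Closed forms for T = max(0, C - d), d ~ N(mu, sigma^2),
    with a deterministic completion time C (Theorems 4 and 5 as
    reconstructed above)."""
    v = (C - mu) / sigma                      # theta_i in the text
    Phi, phi = std_norm_cdf(v), std_norm_pdf(v)
    ET = (C - mu) * Phi + sigma * phi                              # E(T)
    ET2 = ((sigma**2 + mu**2) * Phi
           - (sigma**2 * v + 2.0 * sigma * mu - 2.0 * C * sigma) * phi
           - 2.0 * C * mu * Phi + C**2 * Phi)                      # E(T^2)
    return ET, ET2

def tardiness_moments_numeric(C, mu, sigma, n=200000):
    """Midpoint-rule check: E(T^k) = integral over d < C of (C-d)^k f(d) dd."""
    lo = mu - 12.0 * sigma                    # truncation; tail mass negligible
    h = (C - lo) / n
    m1 = m2 = 0.0
    for i in range(n):
        d = lo + (i + 0.5) * h
        w = std_norm_pdf((d - mu) / sigma) / sigma * h
        m1 += (C - d) * w
        m2 += (C - d) ** 2 * w
    return m1, m2
```

Both moments agree with the quadrature to within the discretization error, which is the kind of spot check used before plugging the formulas into W₁ and W₂.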
Case C (problem: p̃ᵢ ∼ E(αᵢ, λ)). We assume that the execution times of the tasks p̃ᵢ, i ∈ J, are independent random variables with the Erlang distribution, p̃ᵢ ∼ E(αᵢ, λ). Then the completion date of task i ∈ J is a random variable with the Erlang distribution (Corollary 1)

  C̃ᵢ ∼ ∑_{j=1}^{i} p̃ⱼ ∼ E(α, λ),  (24)

where α = α₁ + α₂ + … + αᵢ. From the definition of the Erlang distribution, the density function of the random variable C̃ᵢ ∼ E(α, λ) is

  f_{C̃ᵢ}(x) = (1/(α−1)!) λ^α x^{α−1} e^{−λx} if x > 0, and f_{C̃ᵢ}(x) = 0 if x ≤ 0.  (26)

In such a case the tardiness is

  T̃ᵢ = C̃ᵢ − dᵢ if C̃ᵢ > dᵢ, and T̃ᵢ = 0 if C̃ᵢ ≤ dᵢ.  (27)

LEMMA 5. The cumulative distribution function of the random variable T̃ᵢ (27) is

  F_{T̃ᵢ}(x) = F_{C̃ᵢ}(dᵢ) + (1 − F_{C̃ᵢ}(dᵢ)) F_{C̃ᵢ}(x + dᵢ) if x > 0, and F_{T̃ᵢ}(x) = 0 if x ≤ 0,  (28)

and the density function is

  f_{T̃ᵢ}(x) = (1 − F_{C̃ᵢ}(dᵢ)) f_{C̃ᵢ}(x + dᵢ) if x > 0, and f_{T̃ᵢ}(x) = 0 if x ≤ 0.  (29)

Proof. The proof is omitted because it is similar to the proof of Lemma 1. □
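The pair (28)–(29) describes a mixed distribution: a point mass at zero (the jump of (28) at x = 0) plus a continuous part. A small self-check that the total probability mass equals 1, sketched in Python for an integer shape α (helper names are ours, not from the paper):

```python
import math

def erlang_pdf(k, lam, x):
    """Density (26) of E(k, lam) for an integer shape k."""
    if x <= 0.0:
        return 0.0
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def erlang_cdf(k, lam, x):
    """F_{E(k,lam)}(x) = 1 - exp(-lam*x) * sum_{n<k} (lam*x)^n / n!."""
    if x <= 0.0:
        return 0.0
    return 1.0 - math.exp(-lam * x) * sum(
        (lam * x)**n / math.factorial(n) for n in range(k))

def tardiness_total_mass(alpha, lam, d, n=200000, upper=200.0):
    """Jump of the CDF (28) at x = 0 plus the mass of the density (29)."""
    Fd = erlang_cdf(alpha, lam, d)
    atom = Fd + (1.0 - Fd) * Fd          # lim_{x -> 0+} of (28)
    h = upper / n
    tail = sum(erlang_pdf(alpha, lam, (i + 0.5) * h + d)
               for i in range(n)) * h    # integral of f_C(x + d) over x > 0
    return atom + (1.0 - Fd) * tail
```

Analytically, the atom equals F(dᵢ) + (1 − F(dᵢ))F(dᵢ) and the continuous part contributes (1 − F(dᵢ))², which sum to 1.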
Now we prove an auxiliary lemma, which we use when calculating the expected values of the variables T̃ᵢ and T̃ᵢ².

LEMMA 6. If the deadlines for the completion of tasks have the Erlang distribution C̃ᵢ ∼ E(α, λ), then

  ∫_a^∞ x f_{C̃ᵢ}(x) dx = (α/λ) (1 − F_{E(α+1,λ)}(a)),  (30)

and

  ∫_a^∞ x² f_{C̃ᵢ}(x) dx = (α(α+1)/λ²) (1 − F_{E(α+2,λ)}(a)).  (31)

Proof. Using the density function (26) of the random variable C̃ᵢ,

  ∫_a^∞ x f_{C̃ᵢ}(x) dx = ∫_a^∞ x · (1/(α−1)!) λ^α x^{α−1} e^{−λx} dx
    = (α/λ) ∫_a^∞ (1/((α+1)−1)!) λ^{α+1} x^{(α+1)−1} e^{−λx} dx
    = (α/λ) (1 − F_{E(α+1,λ)}(a)),

because

  (1/((α+1)−1)!) λ^{α+1} x^{(α+1)−1} e^{−λx} = f_{E(α+1,λ)}(x).  (32)

Now we prove the second equality of the thesis of the lemma:

  ∫_a^∞ x² f_{C̃ᵢ}(x) dx = ∫_a^∞ x² · (1/(α−1)!) λ^α x^{α−1} e^{−λx} dx
    = (α(α+1)/λ²) ∫_a^∞ (1/((α+2)−1)!) λ^{α+2} x^{(α+2)−1} e^{−λx} dx
    = (α(α+1)/λ²) (1 − F_{E(α+2,λ)}(a)).

The last equality is obtained as in (32). □
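For an integer shape α the Erlang CDF has the closed form F_{E(k,λ)}(x) = 1 − e^{−λx} ∑_{n=0}^{k−1} (λx)ⁿ/n!, so the identities (30) and (31) are easy to verify numerically. A Python sketch (helper names are ours, not from the paper):

```python
import math

def erlang_pdf(k, lam, x):
    """Density (26) of E(k, lam) for an integer shape k."""
    if x <= 0.0:
        return 0.0
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def erlang_cdf(k, lam, x):
    """Closed-form Erlang CDF for an integer shape k."""
    if x <= 0.0:
        return 0.0
    return 1.0 - math.exp(-lam * x) * sum(
        (lam * x)**n / math.factorial(n) for n in range(k))

def lemma6_rhs(alpha, lam, a):
    """Right-hand sides of (30) and (31)."""
    m1 = (alpha / lam) * (1.0 - erlang_cdf(alpha + 1, lam, a))
    m2 = (alpha * (alpha + 1) / lam**2) * (1.0 - erlang_cdf(alpha + 2, lam, a))
    return m1, m2

def lemma6_lhs(alpha, lam, a, n=200000, upper=200.0):
    """Midpoint-rule values of the truncated first and second moments."""
    h = (upper - a) / n
    m1 = m2 = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        w = erlang_pdf(alpha, lam, x) * h
        m1 += x * w
        m2 += x * x * w
    return m1, m2
```

The raised shapes α+1 and α+2 in (30)–(31) are exactly what makes the expected-tardiness formulas below computable from Erlang CDF evaluations alone.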
i times 121HEOREM −−FFE(α+2,λ 1(d − 1W − T= T˜T == 0,Ci − ddistribution (27) (27) i )i )Fthe E(α+2,λ ) i λtasks λλE(α+2,λ λT= assu 25) 25) 5) 2 2We T= (27) EMMA 5.˜ iThe distribution function the rani cumulative ˜ifiC ˜C ˜last ˜ii≤ tasks are independent random λλ22 independent λ equality usedof(32). 25) domL variable We We assume assume tha th W ≤ ≤ d ≤ . d d d . . . 0, 0, 0, 0,0, ifThe C ifif C α(α α(α α(α α(α α(α α(α α(α + + + + + 1) + 1) + 1) 1) 1) 1) 1) i i i i i Ti˜ (27) ˜ random variables of the Erlang distribu Case Cas Cas Ca Ca C C if Ci ≤ di . task tasks are in = = = == = (d (d )(d .i).)equality ....E(β . tasks 1111− 1− 1− − 1− F − F− FE(α+2,λ F FE(α+2,λ F F The last equality used last used (32). The last equality used= (32). dom variable Ti (27) L EMMA 5. The cumulative distribution function of the ranid i)˜i)ii))i∼ E(α+2,λ E(α+2,λ E(α+2,λ E(α+2,λ E(α+2,λ ))equality )(d ))(d )(d )i(d , tasks µ), where βtask tion itasks i = 2λ 2(32). 2(32). 2∈ 22(32). 2J, thenThe are are indepen indepe The The last last equality equality The used last usedlast equality (32). (32). used The last used (32). Lemma 5. The cumulative distribution function of the random The equality used □ λ λ λ λ λ λ , λ ), i the expected value of tardiness E(α W W W W i ˜ −1 T HEOREM 6. If times of the tasks execution are ˜ L L L L EMMA EMMA EMMA EMMA 5. 5. 5. 5. The The The The cumulative cumulative cumulative cumulative distribution distribution distribution distribution function function function function of of of the of the the the ranranranrantion ∼ E tion d domvariable variable TiF(27) ias ˜i˜i∼∼E(β >the 0, ran(di )cumulative + (1 − FC˜i (d , 1}.tion Similarly, fo i ≤ n}) i ))FC˜i (x + d i ), if xof L EMMAT˜i 5. distribution function E(β , , µ µ tion d d tion C˜The (27) i i i tasks task task tas tas ta ta −1 ≤,t execution FT˜variable (x) = T T HEOREM 6. 
The tardiness $\tilde{T}_i$ has the cumulative distribution function

\[
F_{\tilde{T}_i}(x) =
\begin{cases}
0, & \text{if } x \le 0,\\
F_{\tilde{C}_i}(d_i) + \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\,F_{\tilde{C}_i}(x + d_i), & \text{if } x > 0,
\end{cases}
\tag{28}
\]

and the density function

\[
f_{\tilde{T}_i}(x) =
\begin{cases}
0, & \text{if } x \le 0,\\
\bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\, f_{\tilde{C}_i}(x + d_i), & \text{if } x > 0,
\end{cases}
\tag{29}
\]

where the date of tasks completion is $\tilde{C}_i = \sum_{j=1}^{i} \tilde{p}_j$.
Now we prove an auxiliary lemma, which we use when calculating the expected values of the variables $\tilde{T}_i$ and $\tilde{T}_i^2$.
LEMMA 6. If the completion times of tasks (27) have the Erlang distribution $\tilde{C}_i \sim E(\alpha, \lambda)$, then

\[
\int_a^{\infty} x\, f_{\tilde{C}_i}(x)\,dx = \frac{\alpha}{\lambda}\bigl(1 - F_{E(\alpha+1,\lambda)}(a)\bigr),
\tag{30}
\]

and

\[
\int_a^{\infty} x^2 f_{\tilde{C}_i}(x)\,dx = \frac{\alpha(\alpha+1)}{\lambda^2}\bigl(1 - F_{E(\alpha+2,\lambda)}(a)\bigr).
\tag{31}
\]

Proof. The proof is omitted because it is similar to the proof of Lemma 1. □
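As a quick numerical sanity check of the tail-moment identities (30) and (31) (not part of the paper; the helper names below are ours), one can compare a direct quadrature of the Erlang tail moments with the closed forms:

```python
import math

def erlang_pdf(x, alpha, lam):
    # Density of the Erlang distribution E(alpha, lam), integer shape alpha.
    return lam**alpha * x**(alpha - 1) * math.exp(-lam * x) / math.factorial(alpha - 1)

def erlang_cdf(x, alpha, lam):
    # Cumulative distribution function of E(alpha, lam).
    if x <= 0:
        return 0.0
    return 1.0 - sum(math.exp(-lam * x) * (lam * x)**k / math.factorial(k)
                     for k in range(alpha))

def tail_moment(a, alpha, lam, power, steps=200_000, upper=60.0):
    # Trapezoidal approximation of the integral of x^power * f(x) over [a, upper];
    # the tail beyond `upper` is negligible for these parameters.
    h = (upper - a) / steps
    total = 0.5 * (a**power * erlang_pdf(a, alpha, lam)
                   + upper**power * erlang_pdf(upper, alpha, lam))
    for k in range(1, steps):
        x = a + k * h
        total += x**power * erlang_pdf(x, alpha, lam)
    return total * h

alpha, lam, a = 3, 2.0, 1.5
lhs1 = tail_moment(a, alpha, lam, 1)
rhs1 = alpha / lam * (1 - erlang_cdf(a, alpha + 1, lam))                    # identity (30)
lhs2 = tail_moment(a, alpha, lam, 2)
rhs2 = alpha * (alpha + 1) / lam**2 * (1 - erlang_cdf(a, alpha + 2, lam))   # identity (31)
print(abs(lhs1 - rhs1) < 1e-5, abs(lhs2 - rhs2) < 1e-5)
```

Both identities follow from $x\,f_{E(\alpha,\lambda)}(x) = (\alpha/\lambda) f_{E(\alpha+1,\lambda)}(x)$, which the quadrature confirms numerically.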
THEOREM 6. If the times of the tasks execution are independent random variables with the Erlang distribution, so that $\tilde{C}_i \sim E(\alpha_i, \lambda)$, $i \in J$, then the expected value of the tasks tardiness is

\[
E(\tilde{T}_i) = \bigl(1 - F_{E(\alpha_i,\lambda)}(d_i)\bigr)
\left[\frac{\alpha_i}{\lambda}\bigl(1 - F_{E(\alpha_i+1,\lambda)}(d_i)\bigr) - d_i\bigl(1 - F_{E(\alpha_i,\lambda)}(d_i)\bigr)\right].
\]

Proof. The expected value

\[
E(\tilde{T}_i) = \int_0^{\infty} x\, f_{\tilde{T}_i}(x)\,dx
= \bigl(1 - F_{\tilde{C}_i}(d_i)\bigr)\int_0^{\infty} x\, f_{\tilde{C}_i}(x + d_i)\,dx.
\tag{33}
\]

We calculate the integral $\int_0^{\infty} x f_{\tilde{C}_i}(x + d_i)\,dx$. Using the substitution $u = x + d_i$ and complying with (30), we obtain

\[
\int_0^{\infty} x\, f_{\tilde{C}_i}(x + d_i)\,dx
= \int_{d_i}^{\infty} (u - d_i)\, f_{\tilde{C}_i}(u)\,du
= \int_{d_i}^{\infty} u\, f_{\tilde{C}_i}(u)\,du - d_i \int_{d_i}^{\infty} f_{\tilde{C}_i}(u)\,du
= \frac{\alpha_i}{\lambda}\bigl(1 - F_{E(\alpha_i+1,\lambda)}(d_i)\bigr) - d_i\bigl(1 - F_{E(\alpha_i,\lambda)}(d_i)\bigr).
\]

The last equality used (32). Substituting (33) into the above equality we obtain the assertion of the theorem. □

We calculate the expected value of the variable $\tilde{T}_i^2$ in the same way; Theorem 7 gives the corresponding formula for $E(\tilde{T}_i^2)$ when the task completion times are independent Erlang random variables.
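Theorem 6's closed form can be checked against simulation. The sketch below (our Python, not from the paper) draws Erlang completion times and, because density (29) carries the extra weight $1 - F_{\tilde{C}_i}(d_i)$, applies that same weight to the Monte Carlo mean of $\max\{0, \tilde{C}_i - d_i\}$ before comparing with the formula:

```python
import math
import random

def erlang_cdf(x, alpha, lam):
    # CDF of the Erlang distribution E(alpha, lam) with integer shape alpha.
    if x <= 0:
        return 0.0
    return 1.0 - sum(math.exp(-lam * x) * (lam * x)**k / math.factorial(k)
                     for k in range(alpha))

def expected_tardiness(alpha, lam, d):
    # Closed form of Theorem 6 for one task: completion time ~ E(alpha, lam),
    # deterministic due date d.
    tail = 1.0 - erlang_cdf(d, alpha, lam)
    return tail * (alpha / lam * (1.0 - erlang_cdf(d, alpha + 1, lam)) - d * tail)

random.seed(0)
alpha, lam, d, n = 4, 1.5, 2.0, 200_000
# Monte Carlo estimate; gammavariate(shape, scale) with scale = 1/lam is Erlang.
weight = 1.0 - erlang_cdf(d, alpha, lam)
mc = weight * sum(max(0.0, random.gammavariate(alpha, 1.0 / lam) - d)
                  for _ in range(n)) / n
exact = expected_tardiness(alpha, lam, d)
print(abs(mc - exact) < 0.02)
```

With $\alpha_i = 4$, $\lambda = 1.5$, $d_i = 2$ the simulated and analytic values agree to within sampling error, which is the kind of spot check one would run before plugging the formula into the tabu search objective.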
We assume that the required deadlines for the completion of tasks are independent random variables with the Erlang distribution $\tilde{d}_i \sim E(\beta_i, \mu)$, where $\beta_i = d_i\mu$ and $\mu = \max\{2(\min\{d_i : 1 \le i \le n\})^{-1}, 1\}$. Similarly as for case B, we determine the cumulative distribution function and the density of the tasks tardiness

\[
\tilde{T}_i =
\begin{cases}
C_i - \tilde{d}_i, & \text{if } C_i > \tilde{d}_i,\\
0, & \text{if } C_i \le \tilde{d}_i,
\end{cases}
\tag{35}
\]

where the date of tasks completion is $C_i = \sum_{j=1}^{i} p_j$.

LEMMA 7. The cumulative distribution function of the tasks tardiness (35) is

\[
F_{\tilde{T}_i}(x) =
\begin{cases}
1 - F_{\tilde{d}_i}(C_i - x)\,F_{\tilde{d}_i}(C_i), & \text{if } x > 0,\\
0, & \text{if } x \le 0,
\end{cases}
\tag{36}
\]

and the density function is

\[
f_{\tilde{T}_i}(x) =
\begin{cases}
F_{\tilde{d}_i}(C_i)\, f_{\tilde{d}_i}(C_i - x), & \text{if } x > 0,\\
0, & \text{if } x \le 0.
\end{cases}
\tag{37}
\]

Proof. We omit the proof because it is similar to the proof of Lemma 5. □
Then, we can proceed to calculate the expected values of the random variables $\tilde{T}_i$ and $\tilde{T}_i^2$.

THEOREM 8. If the requested deadlines for the completion of tasks are independent random variables with the Erlang distribution $\tilde{d}_i \sim E(\beta_i, \mu)$, $i = 1, 2, \ldots, n$, then the expected value
..E(β .proof then µ), the i(37) =expected 2,for .proof .the .the n, value then of the dfT bution TTheorem HEOREM 7. If− task completion times are independent iindependent (x) = f f )0(d = f xpected of variable T . i the iµ), HEOREM HEOREM 8. 8. If the the requested deadlines deadlines for completion completion T T (x) = (37) T HEOREM the task completion times are independent iHEOREM irequested 00− 07. ˜ ˜ 0λ ˜ 2 0 0 T HEOREM 7. If the task are independent d T T T T T HEOREM 7. If the task HEOREM completion If times task completion independent are independent tasks are random variables with Erlang distrii Proof. We omit the proof because it is similar to the of HEOREM 8. HEOREM If HEOREM requested 8. 8. 8. If If If If the the deadlines the requested requested requested requested for deadlines deadlines the deadlines deadlines completion for for for for the the the completion completion completion completion T T T T T d ˜ i tardiness (35) is i i Proof. We omit the because it is similar to the p i (d ) − d (d ) . = 1 − F 1 F i i i i ˜ 2˜distribution (35)□ alue of the variable TE(α+1,λ .the idistribution 222. ,random )random E(α,λ ) erlang ivariables variables calculate ,tasks λ5. ),Then, i are ∈ J, then with E(α tardiness (35) isi ,distriLemma 5. 0, 0, iftardiness xx0.> >the 0. 0. expected 0, if),xrandom 0. 
of tasks are independent variables with Erlang tardiness (35) istardiness ithe 0, ifvariables x if > αWe ∞ of of with λproceed i> ∈J, J,is then random with erlang E(α idistribution 2the 2T random variables with erlang E(α i 2 J then Lemma tardiness (35) isis distriWe We calculate expected expected value value of the the variable variable T ˜random ˜iT ˜,T ˜λ ˜2ii.E(β tardiness (35) is ), i ∈ J, then random variables with erlang distribution E(α i, λ), λ we can to calculate values ∞ ∞ ∞ ∞ ∞ tardiness (35) is (35) , λ ), i ∈ J, then random variables erlang distribution E(α ∞calculate ∞ of of tasks are independent independent random variables with with Erlang Erlang distrii ˜ calculate We the We We calculate calculate expected calculate the the value the the expected expected expected expected of the value variable value value value of of of T the the the . variable variable variable variable T T . . . . , λ ), i ∈ then variables with erlang distribution E(α i , λ ), i ∈ J, then random variables with erlang distribution E(α i , λ ), i ∈ J, then , λ ), i ∈ J, then ∼ , µ), i = 1, 2, . . . n, then the expected value of random variables with random erlang variables distribution with E(α erlang distribution E(α bution d Lemma 5. (d ) − d (d ) . = 1 − F 1 − F of tasks are of independent of of of tasks tasks tasks tasks are are are are independent random independent independent independent variables random random random random with variables variables variables variables Erlang with distriwith with with Erlang Erlang Erlang Erlang distridistridistridistrii i i i i .task i we Lemma 5.i ∈ iisobtain i i 2 J i Then, i tardiness ibution αd))(u)du E(α+1,λ E(α,λ )(u)du we proceed to calculate theexpected expected values of β ˜ifthe he(u task completion independent (33) into the above equality asserthe of tardiness (27) i Then, ∈ Jrandom isdwe 2 . 
the the expected value (27) of task can proceed to calculate values ofvalue E(β iasser= 1,ithe 2, .tardiness n, bution then the value of value of (27) of task Jis is (u (u − )are f1of fCabove d(27) di)i (d fC∼ = −expected d(d )Substituting (u)du − d− ftardiness (u)du ˜= i ,− (u dtimes )of − dequality fvalue ˜˜id˜˜expected ˜can λ ˜C ˜iof ˜C ˜i i(u)du ˜(u)du iiiof i= ifd )C˜= −(33) dare )completion .− 1into − F the value tardiness (27) of task iµ), ∈F Jvalue is ˜tardiness ˜d ithe i −of and T.)˜2, variables mpletion times independent the expected value task i 1∈ Jthe isE(α,λ )(u)du − dJexpected ..∼ = − Fexpected Ci)(u)du βexpected ifHEOREM i− i(27) ∼ ∼ E(β E(β µ), ,µ), µ), = 1, 1, 2, ....n, ..n, ..n, .n,then n, then then the the expected expected value of of bution C C ii= E(α+1,λ )T E(α,λ β the expected of (27) of task ∈ Jn, ˜tardiness ˜omit Substituting the we obtain iiof i Jis ii)d ˜ i expected i(d i˜= ii,,iproof iµ), E(α+1,λ ) (d 2ithen i.the the value (27) of task i ∈ is 2iT ˜ the expected value of tardiness expected value task of tardiness i ∈ (27) of task ∈ J is tardiness (35) is β = F (C )(C F (C )of − E( T E(β , µ), ∼ i ∼ ∼ ∼ E(β E(β 1, E(β E(β 2, , µ), . , , . µ), . i i = i = = 1, 1, 1, 1, 2, the 2, 2, 2, . . . . expected n, then then then the the value the the expected expected expected of value value value value of of bution bution bution bution bution d d d d T HEOREM 7. 7. If If the the task task completion times times are are independent independent ˜ ˜ β β ) = F )(C E( T ˜ ˜ Then, we can proceed to calculate the expected values of i i i i(C i i i i i i i i i d d d d d d E(β ,µ) E(β ,µ) Proof. Proof. We We omit the the proof because because it it is is similar similar to to the the proof proof of of Proof. We omit the proof because it is similar to proof of ˜ d d and T . random variables T iof Proof.E( We omit the proof because iti F is similar to proof of,µ) ,the λ task ),task i∈ J, then thi erlang7.Ttion distribution E(α itask i 7. 
iithe λ E(β T T THEOREM EOREM HEOREM If HEOREM HEOREM 7.i7. 7. completion IfIfIf Ifthe the task task times completion completion completion completion arei i independent times times times times are are are areindependent independent independent independent random variables TF TT .) = ˜i )iF(C (C − (C theorem. ithe i i(C = (C (C )(C −Fi )E(β Fthe (C )), (38) TE( ˜E( tardiness (35) is i )F= i )(C i )), i and iiii)(C = F (C )(C E( T ˜ ˜ E(β ,µ) E(β ,µ) E(β +1,µ) i )FT iF iof i,µ) E(β ,µ) E(β ,µ) E(β +1,µ) ) = )(C F (C ) − (C )), (38) E( T µ(i i ) = F (C )(C F F (C ) − (C )(C F F ) − F )), (38) T E( E(β i i i i , λ ), ∈ J, then istribution E(α Then, we can proceed to calculate the expected values ubstituting (33) into the above equality we obtain the asserE(β ,µ) E(β ,µ) +1,µ) i i i i i i i i α(α + 1) tardiness tardiness (35) (35) is is E(β ,µ) E(β ,µ) E(β ,µ) E(β ,µ) E(β +1,µ) E(β +1 2 Then, we proceed to calculate expected va i with tionrandom theorem. α(α + 1) µ µ the 1tardiness distribution ˜i can ˜(d 2distribution Lemma 1) tardiness (35) tardiness tardiness tardiness isvariables (35) (35) (35) (35) is is is is , λ , λ ), ), i ∈ i ∈ J, J, then then random variables variables with erlang erlang distribution E(α E(α α(α + 2 and T . random T α(α + 1) µ ˜ i i Lemma 5. 5. Lemma 5. µ µ HEOREM 8. If the requested deadlines for the complet T α(α + 1) i Lemma 5. 
f tardiness (27) of task i ∈ J is α α ˜ 2 , λ ), i ∈ J, then , , λ , λ , λ ), λ ), ), ), i i ∈ i i ∈ ∈ ∈ J, J, J, J, then then then then m variables random random random random with variables variables variables variables erlang distribution with with with with erlang erlang erlang erlang E(α distribution distribution distribution E(α E(α E(α E(α α(α + 1) E( T ) = (d ) ) 1 − F − F α 2 β nto the above equality we obtain the asserα(α + 1) α(α + 1) i i i i i E( T2i− )1= = (d F2El(α,λ 1variables −Frequested FE(α+2,λ i variables into the equality we obtain the asserEl(α,λ ) Ifi1the ˜1i(d i(d 2= i )T˜ 2 .βE(α+2,λ i )T˜ 2 . ˜(d 2F(d )i(d )(d )J− = 1E( − F˜(33) )(dT − 2i)T TE( )T˜∈ = 1F (d )− (27) task i1Substituting is ˜E( for random on ˜∈ 28.− E( T˜E(α+2,λ 1).F1(d −− − ˜the El(α,λ E(α+2,λ i1 (d )E(α+2,λ .F = 1above 1˜− −2of F F 1calculate − FF and random T˜i)i )), )F − d− (d )i− .F 1El(α,λ − − Ftheorem. ii )(C E( T2the )expected = − (d )tardiness (d )(d 1− − (d = − λ ii)FHEOREM i )= (C F (C (38) Ti))∈ E( Tof T )ithe = − F (d ))tardiness = −i1d)(27) (d )F˜Fii,µ) (d 1= F)E( F 2)− i i))i)d)1 i(27) i2i)J= El(α,λ )and E(α+2,λ i1 iF idF i ,µ) We the expected value of variable . 1(C where βdeadlines +the . . .βcompletion + E(α+1,λ E(α+1,λ E(α,λ E(α,λ )i(d λ E(α+1,λ )expected E(α,λ )− i(d i))tardiness i− ii))∈ i∈ the expected expected value value tardiness of task task i1F i∈ J.J)F is is El(α,λ )of E(α+2,λ E(α+1,λ E(α,λ ithe iT i )2i )If Theorem 8. the requested deadlines the E(β E(β +1,µ) =ββnβ.1of +ββ2++.dis .. .. . 1+ El(α,λ )of El(α,λ )task E(α+2,λ )F β β2completion iof λ of tasks are independent random variables with Erlang 2the λ 2E( 2)E(β where β = β + β + . . . + β .iβfor pected value the expected expected of tardiness value value value value (27) of of of tardiness tardiness of task i (27) (27) ∈ (27) (27) J of is of of of task task task i ∈ J J J is is is is ˜ λ where β = β + β + . . . + β . 
) F (C )(C (C ) − (C )), (38) T β β− β.where ββ βwhere λ λ n(C 1 2 β= + tion theorem. λ λ n 1 2 µ i i i i i Weithe the expected value of variable T . λ λ ˜ ˜ E(β ,µ) E(β ,µ) E(β +1,µ) where β = β + β + . . . + β . calculate 1(C 2 (38) where β = β where β = β + β + . . . + . + β + . . + β . HEOREM 8. If the requested deadlines for the completion T α(α + 1) n 1 2 ) ) = = F F (C (C )(C )(C F F (C ) − ) F F (C )), (38) E( E( T T of tasks are independent random variables with Erlang distrin n 1 2 1 2 i ˜ ˜ ˜ ˜ ˜ i i i i i i i i iiof i )), E(β E(β ,µ) ,µ) E(β E(β ,µ) ,µ) E(β E(β +1,µ) +1,µ) µ Then, Then, we we can can proceed proceed to to calculate calculate the the expected expected values values of of Then, we can proceed to calculate the expected values of F F F F F ) = F (C ) ) ) = ) )(C = = = F F F F (C (C (C (C )(C )(C )(C )(C ) − F F F F (C (C (C (C ) ) − ) ) − − − (C )), (38) (C (C (C (C )), )), )), )), (38) (38) (38) (38) E( T E( E( E( E( T T T T tasks are independent random variables with Erlang distribuThen, we can proceed to calculate the expected values ˜ i i i i i i i i i i i i i i i i i i i i i i α(α + 1) α 2 E(β ,µ) E(β E(β E(β E(β ,µ) ,µ) ,µ) ,µ) E(β E(β E(β E(β ,µ) E(β ,µ) ,µ) ,µ) +1,µ) E(β E(β E(β E(β +1,µ) +1,µ) +1,µ) +1,µ) ∼ E(β , µ), i = 1, 2, . . . n, then the expected value bution d (dcalculate (d )we − FIfwe ˜ithe µdeadlines 8.(34) the requested for the obtain 17. )asseri i 8. 2 If 2α obtain i ) the(33) ithe the requested for the T.HEOREM α(α asser+ ˜+ α,λ )into E(α+2,λ We the the variable .αα 2deadlines α(α + 1) 1) If αF HEOREM the completion times independent µ.completion µ µµ µµProof. 
Substituting (33) into the above equality we we the the 2task =iare (33) 3)Substituting above equality obtain asserexpected into of (d 1of − )equality − −TT into the above obtain asserequality HEOREM 2+ 2− 21) 2.i2, 2d tasks independent random variables with Erlang distri2˜α 22− − 2d 1+ F (d + 1)T (d )value ˜α d.E(α+1,λ 2− ˜d 2= α(α + α(α α(α α(α + + + 1) 1) 1) ,and µ), 1, .function . n, then the expected value of bution α(d Proof. Using density function ofcom the ˜T(d ˜i2idensity ˜T(34) ˜ii = 1, 2, …, n, ˜∼ 2d (d )dare (34) 1, μ), 1i)β)− − i2 α(α i2.)iE(β iabove 2 ˜iiis ˜− E(α+1,λ )β ).ivariables E(α+2,λ )the Using therandom densit i− tion » E(β then theof expected value of ˜1) iβiT where + + .(34) .)FT .˜E(α,λ iE(α+2,λ iβ 2bstituting 22iλ and T .)Erlang random random variables TF T(d random variables 21 )(d E(α,λ )(d 2d d+ (d )F 1(d F 1FEl(α,λ − F2d T . random variables 21 E( E( T T ) = = (d ) ) (d (d ) ) 1 1 − − F F 1 1 − − F F T HEOREM 7. If the task completion times are independent ni+ 1(34) 2˜)i.(34) iT − 2d (d ) d (d 1 − F expected value of the variable T . Proof. Using the function of the tardiness of rand i and i+and ˜ ˜ ˜ ˜ i(d i )1F i+ Proof. Using the density the tardiness of i We calculate the expected value of the variable . − 2d ) d ) . (34) 1 F F i i E(α+1,λ ) E(α,λ i i i El(α,λ ) ) E(α+2,λ ) ) Proof. Using the density tardiness (35) E(α+1,λ E(α,λ i i − 2d (d ) + d (d ) . (34) 1 − F 1 − F i λ λ i i i E( E( E( E( T T T T ) = − F ) ) ) = ) = = = 1 1 1 − 1 ) − − − F F F (d (d (d (d ) ) ) ) (d ) (d (d (d (d ) ) ) 1 − 1 1 1 − 1 − − − F F F F of tasks are independent random variables with distrii − 2d − (d ) + d (d ) ) + d . (d ) . (34) − F 1 − F 1 − F 1 − F E(α+1,λ E(α,λ ) Proof. Using the density function of the tardiness of random i 2 2 where β = β + β + . . . β . λ of tasks are independent random variables with Erlang i i i i Proof. Using the density Proof. 
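The tail-integral identities used in these derivations, ∫_{d}^∞ u f_{E(α,λ)}(u) du = (α/λ)(1 − F_{E(α+1,λ)}(d)) and ∫_{d}^∞ u² f_{E(α,λ)}(u) du = (α(α + 1)/λ²)(1 − F_{E(α+2,λ)}(d)), can be checked numerically. The sketch below is not part of the paper: it implements the standard Erlang density and distribution function (integer shape assumed) and compares a midpoint-rule quadrature of each tail integral with the closed form, for arbitrary illustrative values α = 4, λ = 0.5, d = 6.

```python
import math

def erlang_pdf(x, k, lam):
    # density of E(k, lam): lam^k x^(k-1) e^(-lam x) / (k-1)!
    if x < 0:
        return 0.0
    return lam**k * x**(k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def erlang_cdf(x, k, lam):
    # F_{E(k,lam)}(x) = 1 - e^(-lam x) * sum_{j=0}^{k-1} (lam x)^j / j!
    if x <= 0:
        return 0.0
    s = sum((lam * x)**j / math.factorial(j) for j in range(k))
    return 1.0 - math.exp(-lam * x) * s

def tail_moment(d, k, lam, power, steps=100000):
    # midpoint-rule value of  ∫_d^∞ u^power f_{E(k,lam)}(u) du
    hi = d + 60.0 / lam                    # the tail beyond this point is negligible
    h = (hi - d) / steps
    return sum((d + (j + 0.5) * h)**power
               * erlang_pdf(d + (j + 0.5) * h, k, lam) * h
               for j in range(steps))

alpha, lam, d = 4, 0.5, 6.0                # illustrative values, not from the paper

# identity behind the first-moment formula
lhs1 = tail_moment(d, alpha, lam, 1)
rhs1 = alpha / lam * (1 - erlang_cdf(d, alpha + 1, lam))

# identity behind (34)
lhs2 = tail_moment(d, alpha, lam, 2)
rhs2 = alpha * (alpha + 1) / lam**2 * (1 - erlang_cdf(d, alpha + 2, lam))

print(abs(lhs1 - rhs1) < 1e-4, abs(lhs2 - rhs2) < 1e-4)
```

Applying these identities term by term to the expanded integrand (u − dᵢ)² is exactly what produces the three bracketed terms of (34).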
LEMMA 7. If the requested deadline d̃ᵢ of task i ∈ J is a random variable, then the tardiness T̃ᵢ (35) has the density function

f_{T̃ᵢ}(x) = { F_{d̃ᵢ}(Cᵢ) f_{d̃ᵢ}(Cᵢ − x),  if x > 0,
            { 0,                            if x ≤ 0,   (36)

and the random variable T̃ᵢ² has the density function

f_{T̃ᵢ²}(x) = { (1/(2√x)) F_{d̃ᵢ}(Cᵢ) f_{d̃ᵢ}(Cᵢ − √x),  if x > 0,
             { 0,                                       if x ≤ 0.   (37)

Proof. We omit the proof because it is similar to the proof of Lemma 5. □

Then we can proceed to calculate the expected values of the random variables T̃ᵢ and T̃ᵢ².

THEOREM 8. If the requested deadlines for the completion of tasks are independent random variables with Erlang distribution d̃ᵢ ∼ E(βᵢ, µ), i = 1, 2, …, n, then the expected value of the tardiness (35) is

E(T̃ᵢ) = F_{E(β,µ)}(Cᵢ) (Cᵢ F_{E(β,µ)}(Cᵢ) − (β/µ) F_{E(β+1,µ)}(Cᵢ)),   (38)

where β = β₁ + β₂ + … + βₙ.

Proof. Using the density function (36) of the tardiness T̃ᵢ (Lemma 7) and the definition of the expected value, we obtain

E(T̃ᵢ) = ∫₀^∞ x F_{d̃ᵢ}(Cᵢ) f_{d̃ᵢ}(Cᵢ − x) dx = F_{d̃ᵢ}(Cᵢ) ∫₀^∞ x f_{d̃ᵢ}(Cᵢ − x) dx.

The next part is similar to the proof of Theorem 4. □

THEOREM 9. If the requested deadlines for the completion of tasks are independent random variables with Erlang distribution d̃ᵢ ∼ E(βᵢ, µ), i = 1, 2, …, n, then the expected value of the squared tardiness is

E(T̃ᵢ²) = F_{E(β,µ)}(Cᵢ) ( (β(β + 1)/µ²) F_{E(β+2,µ)}(Cᵢ) − 2Cᵢ(β/µ) F_{E(β+1,µ)}(Cᵢ) + Cᵢ² F_{E(β,µ)}(Cᵢ) ).   (39)

Proof. Using the density function (37) of the random variable T̃ᵢ² (Lemma 7) and the definition of the expected value, we obtain

E(T̃ᵢ²) = ∫₀^∞ x f_{T̃ᵢ²}(x) dx = F_{d̃ᵢ}(Cᵢ) ∫₀^{Cᵢ} (Cᵢ − u)² f_{d̃ᵢ}(u) du.

The next part is similar to the proof of Theorem 4. □
i i ˜i)i− ˜ i i i i i i i i i i λ i i i i i i i i i i i i i C i E(α,λ ) E(α,λ E(α,λ E(α,λ E(α,λ ) ) ) ) E(α+2,λ ) E(α+2,λ E(α+2,λ E(α+2,λ E(α+2,λ ) ) ) ) 2 E(α+1,λ ) E(α,λ ) C C i ˜ ˜ i i i i i ˜ i i i i i 0 0 0 0 ˜ ˜ 0 0 i ˜ ˜ i i i i − 2d u f (u)du + d f (u)du . i i E(α+1,λ ) E(α,λ E(α+1,λ ) ) E(α,λ ) C C C C C i ˜ C C 0 0 2 ˜ i i Similarly, as in the proof of Theorem 6 by performing substi− 2d 2d u f (u)du + d f u f (u)du (u)du . + d f (u)du . i i d d HEOREM 9. If the expected deadlines for the completion T λ λ λ i ˜ ˜ 2 2 2 2 2 ˜ i C C i i i i i i E( E( T T (x)dx (x)dx = = x x (P( (P( C C > > d d ) ) f f (x (x + + d d ))dx ))dx ) ) = = x x f f λ (x)dx = x (P( C > d f (x + d ))dx x f ˜ ˜ ˜ ˜ of tasks are independent random variables with Erlang distrii i d d dii i i di˜ λ λi E( Ti Td˜)i i )= (x)dx = didiCi diidii)iCdxiCiperforming fiiC˜fiiidiC(x + did))dx λ tardin ˜i i + ˜i>i C i ii ix ˜(P( idiby C Cα (x + > xasxfin C λµ), λλλ value of the square ii T˜dTproof di P( di iid)> )dx. C˜C ˜f λ iSimilarly, dTheorem T˜the diiiCi0of i˜d i i )dx. ˜ii (x i i C d d i CC i∞ 6 substi ∼ E(β , the expected bution d d i i i HEOREM 9. If the expected deadlines for the completion T d d d ∞0tution i i i i 2 HEOREM 9. If thevalue expected T −∞ 00 0u = ix + ii 0 i −∞ −∞ 0 obtain ˜to for the com α d tasks random variables with Erlang distri−2C )independent +C Fsimilar (C )theproof .proof (39) Fof The The part is is similar similar to to proof of 44of The next part isE(α+1,λ similar the proof of E(β µ), expected thedeadlines square bution d(C i∞we next part is similar to proof of of 4of 2next tardiness i The ipart ithe )are E(α,λ i∼ ∞ ∞∞∞ ∞∞ ii ,) +xin ∞ ∞.∞∞ calculated performing utution fC˜i2(u)du d+∞i2two fC˜obtain (u)du αthe α)4the Thenext next part is to the 4 □ Erlang are imilarly, as the proof of Theorem 6 by substi222 2 ∞ ∞∞ ∞ ∞−2C (C ) +C F (C . 
(39) F u = d we α α α α α is i The first integrals using the equality (30) λ i i i of tasks are independent random variables with Erlang distrii E(α+1,λ ) E(α,λ ) ∞ 2 2 i + d f (u)du . of tasks are independent random variables with 2 2 2 ˜ ˜ −2C −2C (C (C ) +C ) +C F F (C (C ) ) . XX(Y) (39) (39) F F 8 Bull. Po i of 8 Ci Similarly, 2˜fthe 22i 2Theorem 2 8f ∞˜f (u)du i +C i.. ... Tech. E(α+1,λ E(α+1,λ ))(C )(C E(α,λ E(α,λ )(39) )i )i )iitardiness µ), expected value square bution di i∼ E(β i2d iithe iF proof Theorem 6C by performing substi... − 2d uin uu)f)dx. (u)du + d+ d+ (u)du ...performing λ by 22f + −2C −2C −2C −2C (C )F +C FE(α,λ (C (C )i(C ))+C +C )α(α F .FE(α,λ F (C (C (C (C ))XX(Y) (39) (39) (39) (39) FE(α+1,λ F F 2d− is substias proof of 6 i ,−2C 2f(u)du ˜˜(u)du ˜(u)du Bull. Pol. XX(Y) i2d i> ithe ii iλ iiiF i )i)iBull. i+C iof iAc.: 8 Bull. Pol. Ac.: Tech. 20162 ˜ ˜ i ) E(α+1,λ E(α+1,λ E(α+1,λ E(α+1,λ ) ) ) E(α,λ E(α,λ E(α,λ ) ) ) ) C C C C i i i i 2d − − − − 2d 2d 2d u f (u)du + u d u f u f f (u)du (u)du (u)du + + + d d d d (u)du . f f f f (u)du (u)du (u)du ˜ 8 + 1) d i i i i ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ ˜ λ ution u = x + d we obtain i i i i i > d d ) x x f f (x (x + d d )dx. )dx. P( P( C Bull. Pol. Ac.: Tech. XX(Y) 2016 > d ) x f (x + d P( ˜ i > d ) x f (x + d )dx. P( C 8 8C˜− 8 Pol. Ac.: Tech. Bull. 2016 Pol. Ac.: i i i i i ˜ ˜ ˜ Cii C˜i i i idafter C C C C C C C C C i Then, 2 i i i i iand (31). i i 2 2 ˜ i simple transformations, we obtain the ∞ ∼ E(β , µ), the expected value of the square tardiness bution d i i i i i i i i i λ λ λ λ λ C C d d d ˜ ∼ E(β , µ), the expected value of the square ta bution d ˜ C ˜ i i i i HEOREM HEOREM 9. 
9.E( IfIfTiexpected the the expected deadlines deadlines for for the the completion completion TTthe 9.T HEOREM IfTheorem expected deadlines for the i Ci > di ) didi d T HEOREM is iexpected i deadlines α(α ) =expected F )+ 1) FE(α+2,λ 9.2 If the the completion E(idxTdi id+ di0tution d˜ii )d= dP( icompletion e).obtain ii0 E(α,λ ) (C ) (Ci ) obtain 2 idiii u fC˜i (u)du If deadlines forfor 2 completion i00we )independent =the FPol. Fthe (Ci ) of E( T˜9. C di T˜i2 )u==P( λwith i )with di is of tasks are independent i> i ) ∞ u fC˜i (u)du E(α,λ ) (C E(α+2,λ )Erlang assertionE( (34). iis 2 α(α + 1) Bull. Ac.: Tech. XX(Y) 2016 of of tasks tasks are are independent random random variables variables with Erlang distridistrirandom variables Erlang distri of tasks are independent random variables with Erlang distri 2 λ 2 the 266substi ∞the substifirst two integrals are calculated using equality are random variables with Erlang (C distribution ∞ ˜iindependent ˜in Bull. Pol. Similarly, Similarly, as asthe the proof proof of ofdperforming Theorem by performing substisubstithe of Theorem 6˜i by milarly, asThe in of Theorem 6duiby (30) tasks )Ac.: = FTech. ) 2016 F ) Tech.XX(Y) E( T = 2P( C > )Theorem fCperforming (u)du E( Tin α(α +E(α,λ 1) XX(Y) ˜by performing iTech. α(α + 1) ) (C2iα E(α+2,λ )Ac.: i )proof 8proof Bull. Bull. Pol. Pol. Ac.: Ac.: XX(Y) 2016 2016 ˜E(β ˜iexpected 2value ∞˜ i2simple transformations, ∞ bution i ˜ithe 2we ˜i)d ˜88i88> ˜di˜2i )∼=E(β 8 and Bull. Pol. Ac.: Bull. Tech. Bull. Bull. Bull. Pol. XX(Y) Pol. Pol. Pol. Ac.: Ac.: Ac.: 2016 Tech. Tech. Tech. XX(Y) XX(Y) XX(Y) XX(Y) 2016 2016 2016 2016 ∼ ∼ E(β E(β , , µ), µ), the the expected expected value of of the the square square tardiness tardiness bution bution d d , µ), value of the square tardiness ˜ ˜ Tech. ∼ , µ), the expected value of the square tardiness bution λ 2 F (C ) F (C ) E( T (31). 
Then, after obtain the d » E(β , μ), the expected value of the square tardiness i i i i = P( C d ) u f (u)du d i ) = F (C ) Fis (Ci )(3 E( T ) = P( C > d ) u f (u)du E( T ˜ i i i i i E(α,λ E(α+2,λ ) 2 ˜ i i i C E(α,λ ) E(α+2,λ ) ∞ ∞ tution ux= =xdx+ dddii we we obtain obtain dion weu obtain i α i −2C (C ) +C F (C ) . F 2 C =uWhen + we obtain i tasks completion times are random variables with itution 2 i i i i i+ E(α+1,λ ) E(α,λ ) 2 i − 2di u fC˜i (u)du +d λ λi ) . dfiC˜isi.(u)du . is −2C (39) assertion (34). □areisis λ ) (C i2∞ fC˜ i(u)du i ) +C αi λ FE(α+1,λ i FE(α,λ ) (C −distribution, 2di i d∞ + ∞ u∞fC˜i (u)du di di ∞∞ function and W Erlang values of the W i 21) 1 2 α(α α(α + + 1) 1) 2 α(α + 1) α(α + d d −2C (C ) +C F (C ) . (39) F When tasks completion times are random variables with ∞˜ 2 ∞ α i i 2 2 2 2 2 2 2 2 ∞ ∞ i i i α 2 2 2 E(α+1,λ ) E(α,λ ) i − 2d u f (u)du + d f (u)du . ˜ ˜ ˜T ˜i > ˜ii )P( ˜˜P( ˜)E( ˜E( 2i)))(C )F )λ= =i2F FE(α,λ (C ) .F 2) (C22iF FFE(α+2,λ (Cii)) (Ci ) . E( =E(α+1,λ FE(α,λ (C)T )˜ii+C F(C )(39) E(T˜i ) F )= =CC P( C C > dd.ii))i 6uand uC˜u˜ii(u)du ffC˜C˜i i(u)du (u)du E( T)˜T= P(CE( di 2)iusing f (u)du Tiu )f =calculated T = (C ) (C E( > d ) f T ˜ii> iii)) i iiu i 2 E(α,λ ) ) E(α+2,λ E(α+2,λ i −2C (C ) F (C i the Theorem 7. E(α,λ ) E(α+2,λ ) C i −2C (C ) +C F i C i 2 ) E(α,λ ) (u)du + d f (u)du i i di dC˜ − 2d i E(α,λ ) values function f d˜the + di W1 and fC˜iW (u)du i (u)du λλ ) Bull. Pol. λ i 2 are calC˜i duof λ. i Cdidi i λ λE(α+1,λ 8Erlangi distribution, Ac.: Tech. XX(Y) 2 (39) di i i di d i 8 Bull. Pol. Ac.: Tech. culated using the Theorem 6 and 7. XX(Y) 2016 ∞ ∞∞ ∞ ∞ ∞∞ ∞ α α α α ˜i ∼ E(β 2 Ac.: Tech. XX(Y) 2016 2 2 . 2Pol. Ti ) Case D− (problem: d(u)du 2+did,2.2χ)| ∑ffwi(u)du (39) +C FFE(α,λ (C.ii))of..single FFE(α+1,λ −2Ci FE(α+1,λ )iiE(α+1,λ +C FE(α,λ (C (39) −2C−2C +C FE(α,λ (39)(39) i i))Bull. E(α,λ )−2C i(CiF i)))(C i)))(C i ischeduling − 2d ff1| iE(α+1,λ di u fC− +ii udi2fC˜iuu(u)du ) (CTech. ) (C i+C 2d + d+ . .. 
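As a sanity check, the closed form (34) can be evaluated directly and compared against a Monte Carlo estimate of E(T̃ᵢ²) for an Erlang-distributed completion time; a minimal sketch in Python (the function names are ours, not the paper's):

```python
import math
import random

def erlang_cdf(k, lam, x):
    """CDF of the Erlang distribution E(k, lam) for integer shape k."""
    if x <= 0:
        return 0.0
    s = sum((lam * x) ** j / math.factorial(j) for j in range(k))
    return 1.0 - math.exp(-lam * x) * s

def expected_sq_tardiness(alpha, lam, d):
    """Closed form (34): E(T^2) for completion time C ~ E(alpha, lam), deadline d."""
    return (alpha * (alpha + 1) / lam ** 2 * (1 - erlang_cdf(alpha + 2, lam, d))
            - 2 * d * alpha / lam * (1 - erlang_cdf(alpha + 1, lam, d))
            + d ** 2 * (1 - erlang_cdf(alpha, lam, d)))

# Monte Carlo estimate: an Erlang(alpha, lam) variable is a sum of
# alpha independent Exp(lam) variables.
rng = random.Random(1)
alpha, lam, d = 5, 1.0, 4.0
n = 200_000
mc = sum(max(sum(rng.expovariate(lam) for _ in range(alpha)) - d, 0.0) ** 2
         for _ in range(n)) / n
print(expected_sq_tardiness(alpha, lam, d), mc)  # the two values should agree closely
```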
Case D (problem 1|d̃ᵢ ∼ χ|∑wᵢTᵢ). We assume that the required deadlines for the completion of tasks are independent random variables with the Erlang distribution d̃ᵢ ∼ E(βᵢ, µ), where βᵢ = dᵢµ and µ = max{2(min{dᵢ : 1 ≤ i ≤ n})⁻¹, 1}. Similarly as for case B, we determine the cumulative distribution function and the density of the tasks' tardiness

T̃ᵢ = Cᵢ − d̃ᵢ, if Cᵢ > d̃ᵢ,  and  T̃ᵢ = 0, if Cᵢ ≤ d̃ᵢ,  (35)

where the date of the task's completion is Cᵢ = ∑_{j=1}^{i} pⱼ.

LEMMA 7. The cumulative distribution function of the tasks' tardiness (35) is

F_{T̃ᵢ}(x) = 1 − F_{d̃ᵢ}(Cᵢ − x) F_{d̃ᵢ}(Cᵢ), if x > 0,  and  F_{T̃ᵢ}(x) = 0, if x ≤ 0,  (36)

and the density function is f_{T̃ᵢ}(x) = F_{d̃ᵢ}(Cᵢ) f_{d̃ᵢ}(Cᵢ − x) for x > 0.
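For deterministic deadlines, definition (35) reduces to the classic tardiness Tᵢ = max(Cᵢ − dᵢ, 0), and the TWT objective weights it by wᵢ; a minimal sketch in Python (function and variable names are ours):

```python
def weighted_tardiness(jobs):
    """Total weighted tardiness of jobs processed in the given order.

    jobs: list of (p_i, w_i, d_i) tuples; C_i = p_1 + ... + p_i as in (35).
    """
    total, c = 0, 0
    for p, w, d in jobs:
        c += p                       # completion time C_i
        total += w * max(c - d, 0)   # w_i * T_i with T_i = max(C_i - d_i, 0)
    return total

# Example: the second job finishes at time 5 against deadline 4 -> tardiness 1, weight 2.
print(weighted_tardiness([(3, 1, 4), (2, 2, 4), (4, 1, 10)]))  # prints 2
```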
THEOREM 8. If the required deadlines for the completion of tasks are independent random variables with the Erlang distribution d̃ᵢ ∼ E(βᵢ, µ), the expected value of the tardiness is

E(T̃ᵢ) = F_{E(α,λ)}(Cᵢ) (Cᵢ F_{E(α,λ)}(Cᵢ) − (α/λ) F_{E(α+1,λ)}(Cᵢ)).  (38)

THEOREM 9. If the required deadlines for the completion of tasks are independent random variables with the Erlang distribution d̃ᵢ ∼ E(βᵢ, µ), the expected value of the square tardiness is

E(T̃ᵢ²) = F_{E(α,λ)}(Cᵢ) (α(α + 1)/λ² F_{E(α+2,λ)}(Cᵢ) − 2Cᵢ (α/λ) F_{E(α+1,λ)}(Cᵢ) + Cᵢ² F_{E(α,λ)}(Cᵢ)).  (39)

Proof. Using the density function of the tardiness of the random variable T̃ᵢ (Lemma 7), the expected value is

E(T̃ᵢ²) = ∫₀^∞ x² F_{d̃ᵢ}(Cᵢ) f_{d̃ᵢ}(Cᵢ − x) dx = F_{d̃ᵢ}(Cᵢ) ∫₀^∞ x² f_{d̃ᵢ}(Cᵢ − x) dx.  (40)

Carrying out the substitution y = Cᵢ − x, as in the proof of Theorem 4, we obtain

E(T̃ᵢ²) = F_{d̃ᵢ}(Cᵢ) ∫_{−∞}^{Cᵢ} (y² − 2Cᵢy + Cᵢ²) f_{d̃ᵢ}(y) dy = F_{d̃ᵢ}(Cᵢ) (∫_{−∞}^{Cᵢ} y² f_{d̃ᵢ}(y) dy − 2Cᵢ ∫_{−∞}^{Cᵢ} y f_{d̃ᵢ}(y) dy + Cᵢ² ∫_{−∞}^{Cᵢ} f_{d̃ᵢ}(y) dy).  (41)

Using Lemma 6 (equalities (30) and (31)) we finally obtain the assertion (39). In this way we have proved the thesis statement.

The proved equalities (Theorems 8 and 9) will be used when calculating the values of the functions W₁ and W₂.
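The closed form (39) can be cross-checked against a direct numerical evaluation of the integral in (41); a small Python sketch (our function names, plain trapezoidal integration over the Erlang support):

```python
import math

def erlang_pdf(k, lam, x):
    """Density of the Erlang distribution E(k, lam) for integer shape k."""
    if x < 0:
        return 0.0
    return lam ** k * x ** (k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

def erlang_cdf(k, lam, x):
    if x <= 0:
        return 0.0
    return 1.0 - math.exp(-lam * x) * sum((lam * x) ** j / math.factorial(j)
                                          for j in range(k))

def sq_tardiness_closed(a, lam, c):
    """Closed form (39): deadline d ~ E(a, lam), deterministic completion time c."""
    return erlang_cdf(a, lam, c) * (
        a * (a + 1) / lam ** 2 * erlang_cdf(a + 2, lam, c)
        - 2 * c * a / lam * erlang_cdf(a + 1, lam, c)
        + c ** 2 * erlang_cdf(a, lam, c))

def sq_tardiness_numeric(a, lam, c, steps=20000):
    """Trapezoidal evaluation of (41): F(c) * integral_0^c (c - y)^2 f(y) dy."""
    h = c / steps
    vals = [(c - i * h) ** 2 * erlang_pdf(a, lam, i * h) for i in range(steps + 1)]
    integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
    return erlang_cdf(a, lam, c) * integral

print(sq_tardiness_closed(3, 1.0, 5.0), sq_tardiness_numeric(3, 1.0, 5.0))
```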
5. The algorithms' stability

In this section we shall introduce a certain measure which lets us examine the influence of a change of the jobs' parameters on the goal function value (1), i.e. the solution stability. Let δ = ((p₁, u₁, w₁, e₁, d₁), …, (pₙ, uₙ, wₙ, eₙ, dₙ)) be an example of deterministic data for the TWT problem. By D(δ) we denote a set of data generated from δ by a disturbance of the jobs' parameters. A disturbance consists in changing these times to randomly determined values. Let A = {AD, AP}, where AD and AP are the deterministic and the probabilistic algorithm, respectively (i.e. solving examples with deterministic or random times of the jobs' performance) for the TWT problem. By π_δ we denote a solution (a permutation) determined by the algorithm A for the data δ.

In short, the deterministic algorithm will be denoted by AD and the stochastic one by AP, whereas by AP₁ we denote the stochastic algorithm with selection criterion W₁ (6), and by AP₂ the one with criterion W₂ (7). The algorithms have been tested on two classes of instances of various size and level of difficulty:
(i) 375 benchmark instances of three different sizes, with 40, 50 and 100 jobs, from the OR-Library [20];
(ii) test problems generated as follows: for each job i, an integer processing time pᵢ was generated from the uniform distribution [1, 100] and an integer weight wᵢ from the uniform distribution [1, 10]. Let P = ∑_{i=1}^{n} pᵢ. The distributions of the deadlines dᵢ depend on P and on two additional parameters L and R, which take values from 0.2 to 1.0 in increments of 0.2. An integer deadline dᵢ was generated from the uniform distribution [P(L − R/2), P(L + R/2)]. Five problems were generated for each of the 25 pairs of values of R and L, yielding 125 problems for each value of n = 200, 300, 400, 500.

The set of all 975 examples of deterministic data is denoted by Ω. For each example of deterministic data there was an example of probabilistic data designated with the following distributions of random variables (the other values are deterministic):
(a) processing times: p̃ᵢ ∼ N(pᵢ, α·pᵢ), where α = 0.2;
(b) latest due dates: d̃ᵢ ∼ N(dᵢ, c·dᵢ), c = 0.1;
(c) processing times: p̃ᵢ ∼ E(αᵢ, λ_α), αᵢ = pᵢ·λ_α, λ_α = 1;
(d) latest due dates: d̃ᵢ ∼ E(βᵢ, µ_β), βᵢ = dᵢ·µ_β, µ_β = 1.
The parameters' values α = 0.2 (case (a)), c = 0.1 (case (b)) and λ_α = µ_β = 1 (cases (c) and (d)) were established on the basis of statistical data relating to road construction and on the computational experiments carried out.
values α= 0.2 (case (a)), Parameters’ c= 0.1 (case values (b)) and αfor =0 thethe probabilistic algorithm, respectively (i.e. solving examples tion) determined byalgorithm the algorithm Athe for a data δexamples .(A, π Then, let λParameters’ tion) determined by the algorithm Arespectively for a= data δ .11(c) Then, let For simplicity, each of the these data sets (respectively probabilistic algorithm, respectively (i.e. probabilistic solving algorithm, (i.e. solving examples determined by the A for a data δ. Then, let f , φ) = μ = 1 (case i (d)) were established on the basis of λ λ = µ µ = = (case (case (c) (c) i i (d)) (d)) were were established established on on the the basis basis oo δ α β , ϕ) be the cost of jobs’ execution (1) for the example f (A, π α α β β λ µβFor 1 for (case i (d)) established the of(c) δ deterministic with deterministic or or random random times times of ofperformance) jobs’ jobs’ performance) for for simplicity, each ofwere the these data sets (respectively For simplicity, of the thes αλ= λand = µ= = 1jobs’ (case (c) i (d)) were established µcarried =the 1basis (case basis offor i( with deterministic or random times of jobs’ ˜on α α = , ϕ) be the cost of jobs’ execution (1) for example fwith (A, πthe ,jobs’ ϕ) be the performance) costthe ofinfor jobs’ execution the(c) example f (A, πof β(1) βoneach the case(a), (b), (c) and (d)) is denoted by Ω. with deterministic or random times with deterministic performance) or for random times of performance) for δ be cost of jobs’ execution (1) for the example φ a sestatistical data relating to road construction on out δ statistical statistical data data relating relating to to road road construction construction and and on on carried carried ou ou ϕ in a TWT sequence determined bywe a solution permutation) πδstatistical ˜ on(c) data relating road construction and carried outout the thein TWT problem. problem. 
By By ππ aa(asolution solution (a (a permutapermutathe case(a), (b), (c)toand (d)) is denoted by Ω. the case(a), (b), and (d)) is den δδϕwe statistical relating to construction and on data carried relating thethe TWT problem. ByBy πδby denote asolution solution permutaϕ aproblem. sequence determined by adenote (a(a permutation) in (a permutation) adenote sequence determined by (adata permutation) quence determined π(a TWT πwe TWT problem. Byπaδπsolution we denote athe solution permutaweInitial denote aexperiments. solution (aπroad permutation. The quality ofstatistical solutions calculated byto δpermutaδ determined δa solution δcomputational computational computational experiments. experiments. determined by the algorithm A for data δ . Then, computational experiments. tion) tion) determined determined by by the the algorithm algorithm A A for for a a data data δ δ . . Then, Then, let let Initial permutation. The quality of solutions calculated by Initial permutation. The qualit computational experiments. computational experiment tion) determined by thethe algorithm for a data δ . δalgorithm Then, let by the algorithm A for data determined δ.AA Then, For simplicity, these data sets (respectively for point. determined byby algorithm for . determined Then, the Athe foralgorithm data δ .For Then, tion) determined algorithm A data for tion) aδby data . Then, bylet Asimplicity, foralgorithm aeach dataof δthe .strongly Then, tabu search depend on sets the starting For simplicity, each each of of the thelet these these data data sets (respectively (respectively fo fo For simplicity, each of the these data sets (respectively for , , ϕ) ϕ) be be the the cost cost of of jobs’ jobs’ execution execution (1) (1) for for the the example example f f (A, (A, π π ˜ tabu search algorithm strongly depend on the starting point. 
tabu search algorithm strongly de δδ the cost of 1 For simplicity, each of these by data For (respectively simplicity, each for of the case(a), (b), (c)(1) and (d)) isthe denoted Ω. sets be jobs’ execution for example f (A, πδ π, ϕ) f (A, πδf(1) ,(A, ϕ) −δfor fthe (AD, π ϕ , ϕ) ˜ ˜ , ϕ) be the cost of jobs’ execution (1) , ϕ) the be example the cost of jobs’ execution for the example f (A, π Below we present the constructive heuristic algorithm which δ the the case(a), case(a), (b), (b), (c) (c) and and (d)) (d)) is is denoted denoted by by Ω. Ω. ˜ present πδ ,permutation) ϕ) −(a (AD, πBelow , ϕ) −(b), f(b), (AD, πϕand , ϕ) 1 a∑solution D(δ )) = determined , fthe 1 πϕπ, ϕ) case(a), (c) and (d)) is denoted byheuristic Ω. ϕsequence ϕ in inδaa,sequence sequence determined by byf a(A, a solution solution (af permutation) permutation) π(A, πδδthe ˜case(a), we present the constructive algorithm δcase(a), Below we the by constructiv Initial permutation. The of solutions calculated (c) (d)) is denoted by the Ω. (c)which and (d ϕϕ in in a ∆(A, determined δπ δ , D(δ )) = ∆(A, δϕ,f(a D(δ , quality |D(δ )|byϕ∈D(δ (AD, π= a ∆(A, sequence determined by a∑ solution in(aa)) permutation) sequence determined by aInitial solution (a permutation) πquality computes these solutions. It’s based the idea of(b), NEH algoϕ , ϕ) ∑ δ,, δ onof Initial permutation. permutation. The The quality of solutions solutions calculated calculated bb ) determined determined by by the the algorithm algorithm A A for for data data δ δ . . Then, Then, Initial permutation. The quality of solutions calculated by tabu search algorithm strongly depend on the starting point. |D(δ )| f (AD, π , ϕ) |D(δ )| , ϕ) f (AD, π computes these solutions. It’s based on the idea of NEH algocomputes these solutions. It’s ϕ ϕ Initial permutation. The quality of solutions Initial calculated permutation. bybase T determined byby thethe algorithm A for data Then, )δ . δ ) determined algorithm Aϕ∈D(δ for data determined . 
Then, by the ϕ∈D(δ algorithm A for data δ . Then, rithm [19] and creates n elements’ permutation π ∈ Φ. tabu tabu search search algorithm algorithm strongly strongly depend depend on on the the starting starting poin poin Below we present the constructive heuristic algorithm which tabu search algorithm strongly depend on the starting point. rithm [19] and creates n elements’ permutation π ∈ Φ. rithm [19] and creates n elements’ tabuFor search algorithm tabu on the search starting algorithm example determined we call the solution stability f,fan (A, (A, ππexample ϕ)− −δπδ) ff)ϕ(AD, (AD, ππϕϕ,,ϕ) ϕ) computes the job i ∈strongly anddepend the number xalgo≥ point. 0, whic letstr 11 f π(A, δ π(of δexample δ,f,ϕ) Below Below we we present present the theJπJconstructive constructive heuristic heuristic algorithm algorithm whic is called the solution stability (of an determined these solutions. It’s based on the idea ofjob NEH πδcall ϕ) − (AD, ,)ϕ) 1= δ(of an δ determined we call the solution stability π Below we present the constructive heuristic algorithm which (of an example δ ) determined we the solution stability π f (A, π , ϕ) − f (AD, π , ϕ) f (A, π , ϕ) − f (AD, , ϕ) 1 1 For the job i ∈ and the number x ≥ 0, let For the i ∈ J and t ∆(A, ∆(A, δ δ , , D(δ D(δ )) )) = , , δ δ ϕ ϕ δ δ Below we present the constructive heuristic Below algorithm we present which the con ∑ ∑ ∆(A, δ , δD(δ ))algorithm = =A on , , by the algorithm the set of of disturbed D(δ ).ϕ) = max{0, x −solutions. dielements’ }. f∑ ∑ ∆(A, D(δ )) ∆(A, δA ,D(δ). D(δ )) = , on i (x) |D(δ |D(δ )|)| fdata fϕ(AD, (AD, π π the set disturbed data rithm [19] andD(δ creates nsolutions. permutation computes computes these these It’s It’s based based on onπ 2 Φ. the the idea idea of ofalgoNEH NEH algo algo ∑ ϕϕ,,ϕ) |D(δ )|on , ϕ) f (AD, π byby the,the algorithm AA on the set of disturbed data D(δ ). computes these solutions. It’s based the idea of NEH by the algorithm on the set of disturbed data ). 
ϕ∈D(δ ϕ∈D(δ ) ) (x) = max{0, x − d }. f (x) = max{0, x − d }. f |D(δ )| |D(δ )| , ϕ) , ϕ) f (AD, π f (AD, π i i i i these solutions. It’s based on computes the ofthese NEH algo) ) examples ϕ the ϕ the number Let ΩLet be Ωa be seta set of deterministic ϕ∈D(δ ϕ∈D(δ )the [19] ofϕ∈D(δ deterministic examplesfor for the problem problem ofof computes For job i 2 J and x ¸ 0, let fidea 0,solutions. rithm rithm [19] and and creates creates nnelements’ elements’ permutation permutation ππ{∈∈ Φ. Φ. i (x) = max rithm [19] and creates n elements’ permutation π[19] ∈ Φ. Let Ω be a set of deterministic examples for the problem of Let Ω be a set of deterministic examples for the problem of rithm [19] and creates n elements’ permutation rithm π ∈ and Φ. creates ne jobs’ arrangement. stability rate the jobs’ arrangement. TheThe stability ofof the algorithm Adetermined onthe the x ¡ d Algorithm CA {Constructive Algorithm} (of (of an an example example δδ))A on determined we we call call the the solution solution stability stability πan πrate i}. For For the the job job i)ithe ∈ ∈ JJ Algorithm and and the the CA number number xx ≥ ≥let0, 0,Alg lel δδ rate example δalgorithm )the determined wewe call the solution stability πstability For the job i ∈ J and the number x ≥ 0, jobs’ arrangement. The of the algorithm A on the δ π(of jobs’ arrangement. The stability rate of the algorithm A on Algorithm CA {Constructive Algorithm} {Constructive (of an example δ ) determined (of an example δ determined call the solution stability we call solution stability π For the job i ∈ J and the number For the x ≥ job 0, i let ∈ δ δ Ω is defined in the following setby Ωset isthe defined in the the following way: Enumerate jobs xsuch as p1 /w1 ≥ p2 /w2 ≥, ... , ≥ pn /wn ; J by the algorithm algorithm AAon onthe thedisturbed set setof ofway: disturbed disturbed data data D(δ D(δ).). (x) (x) = = max{0, max{0, x − − d d }. }. f f i i i i byby the algorithm A on set of data D(δ ). = max{0, −xdata d−i }. 
fway: setalgorithm Ω is defined in the set following Ωway: is defined in the ≥,max{0, ...such , ≥ pas /w ; 1≥ Enumerate jobs such).asAlgorithm} p1 /wEnumerate i (x) jobs pd1n/w the A on ofset disturbed by data the algorithm D(δ ). following A on the set of= disturbed max{0, diD(δ }. (x)2 = xn− f (x) 1 ≥ p2f i/w Algorithm. CAx {Constructive i }. Let LetaΩ Ω be be aadeterministic set set of of deterministic deterministic examples examples for for the the problem problem of ofi l := 1; LetLet ΩΩ be set of examples for the problem of 1 l := 1; jobs lof:=2/w 1; 2 ¸, …, ¸ pn/wn; Enumerate such as p1/w1 ¸ p aS(A, set Ω) of deterministic examples Let for the be problem a.set deterministic of examples for the problem 1∑ = The ∆(A, δrate , D(δ ·Ω 100%. (42) (42) be 1 ofAA for i := 2 to CA nCA do{Constructive jobs’ jobs’ arrangement. arrangement. The stability stability of of))the the algorithm algorithm on on the theδ , D(δ Algorithm Algorithm {Constructive Algorithm} Algorithm} jobs’ arrangement. The stability rate of rate the algorithm A on the Algorithm CA {Constructive Algorithm} l := 1; S(A, Ω) = ∆(A, δ , D(δ )) · 100%. (42) S(A, Ω) = ∆(A, )) · 100%. (42) |Ω| for i := 2 to n do for i := 2 to n do CA {Construc jobs’ arrangement. The stability rate of jobs’ the algorithm arrangement. A on The the stability rate of the algorithm A on the Algorithm CA {Constructive Algorithm} Algorithm ∑ ∑ δfollowing ∈Ω begin |Ω| δ ∈Ω set Ω Ωisisdefined defined in inthe the|Ω| following way: way: /w ≥ p≥, p22/w /w ≥,p... ... ,,≥ ≥ ppnn/w /w ;; Enumerate Enumerate jobs jobs such such as as1 p≥ p11/w 11 ≥ for i := 2 to n do setset ΩΩ isset defined in the following way: δ ∈Ω /w p /w ...22... ,≥, ≥ Enumerate jobs such as p n /w n ; nsuch 1 2 2 begin begin is defined in the following way: set Ω is defined in the following way: jobs such as p1 /w1 ≥ p2 /w2Enumerate ≥, ,,≥ pjobs ; nnas Enumerate n /w Insert a job i on one of positions 1, 2, ... 
l l l := := 1; 1; the following section we willpresent present numerical numerical experIn theInfollowing section will exper- l := 1;begin 11 wewe Insert a jobexperi on one of positions 2, Insert a job l := 1;sonumerical l1,:= 1;... ,ilon one of pos 1 1= In the that following section will present numerical experIn thethe following section we (42) will present that the 1 S(A, S(A, Ω) Ω) = ∆(A, ∆(A, δ δ , , D(δ D(δ )) )) · · 100%. 100%. (42) iments allow comparisons of deterministic stability Insert ado insum one of positions 1, 2, …, l for for ii:= := 22job to to non do do ∑ ∑ iments that allow comparisons of the deterministic stability S(A, Ω)Ω) = = ∑|Ω| ∆(A, δ , D(δ )) · 100%. (42) for i := 2 to n so that the sum so that the S(A, S(A, Ω) = ∆(A, δ , D(δ )) · 100%. (42) ∆(A, δ , D(δ )) · 100%. (42) |Ω|δthe fordeterministic i := 2lthat to n do for i :=sum 2 to n do ∑ iments that allow comparisons of that the deterministic stability iments allow comparisons the stability |Ω||Ω| ∈Ωprobabilistic coefficient S(AD, Ω) with stability coefficient so (C ∑ begin begin |Ω|ofδ∑ π( j)sum π( j) ) was minimal, l j=1 l fthe δ ∈Ω coefficient S(AD, Ω) with theδ∈Ω probabilistic stability coefficient δ ∈Ω ∈Ωbegin l∑ j=1 f π( j) (Cπ( j) ) was minimal, f (C ) was mini ∑ begin begin π( j) j=1 coefficient S(AD, Ω) with the probabilistic stability coefficient coefficient S(AD, Ω) with the probabilistic ∑ stability jminimal, fπ( j)(Ccoefficient ) was j =1Insert Insert aon aj)j)job job iion on one of of; positions positions 1, 1, 2, 2, ... ...π(,,lj)l j where Ciπ(π( = pπ(s) S(AP,S(AP, Ω). Ω). j one Insert a job one of positions 1, 2, ... , l j∑ In In the the following following section section we we will will present present numerical numerical experexpers=1 Insert a job i on one of positions 1, 2, ... Insert , l a i on on In In the following section wewe will present numerical experwhere pπ(s);experwhere pπ(s) ; where Cπ( j) = ∑job S(AP, Ω). S(AP, Ω). 
In π(C j) = the following section will present numerical the following expersection we will present numerical ssum π( j)∑= =1∑s=1 s=1 pπ(s) ; so soC that that the the sum so that the sum iments iments that that allow allow comparisons comparisons of of the the deterministic deterministic stability stability l := l + 1 so thatldeterministic so that the sum iments that allow comparisons of of thethe deterministic l the sum iments that allow comparisons iments deterministic that stability allow stability comparisons of l lthe fπ( (C )stability )was wasminimal, minimal, ∑ ∑j)j=1 π( j)j) π( π(j)j) Bull. Pol. Ac.: Tech. XX(Y) 2016 j=1 coefficient coefficient S(AD, S(AD, Ω) Ω) with with the theprobabilistic probabilistic stability stability coefficient coefficient fπ( (Cfπ( )(C was minimal, ∑end { for} 6. Computational experiments j) j=1 f (C ) was minimal, coefficient S(AD, Ω) with the probabilistic stability coefficient ∑ ∑lj=1 fπ( j) (Cπ( j)99) w π( j) coefficient j=1 π( j)stability Bull. Pol.S(AD, Ac.: Tech. 2016 Bull. Pol. Ac.: Tech. XX(Y) 2016 coefficient Ω) XX(Y) with the probabilistic coefficient stability S(AD, coefficient Ω) with the probabilistic jj j where where=CC∑ S(AP, S(AP,Ω). Ω). ∑s=1 j= j π( π(j)j) = π(s);; where C p∑ ; p;pπ(s) S(AP, Ω). π( π(s) ps=1 where Cj) j) = where Cπ( j) = ∑s= S(AP, Ω). S(AP, Ω). s=1 ∑s=1 Presented in Chapter 3.2 tabu search algorithm was adopted The CA algorithmπ(requires O(n2)π(s) time.
to solve the deterministic TWT problem (i.e. 1k∑wi Ti ) and its
Deterministic algorithms CA and AD on the examples from
Bull. Bull. Pol. Pol. Ac.: Ac.: Tech. Tech.2016 XX(Y) XX(Y)2016 2016 Bull. Pol. Ac.: Tech. XX(Y) Bull. Pol. Ac.: Tech. XX(Y) 2016parameters. InBull. Pol.the Ac.:deterministic Tech. XX(Y) 2016 variants with random short, the set Ω, and probabilistic AP1, AP2 on the examples from the
algorithm will be denoted by AD, and the stochastic one by AP,
228
˜. setc Ω
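As an illustration of the CA construction described above, here is a hedged Python sketch. The function names are ours; for simplicity the insertion step scans every position of the current partial permutation and re-evaluates the goal function from scratch, which costs O(n³) work instead of the O(n²) achievable with incremental cost updates.

```python
# Illustrative NEH-style sketch of the constructive algorithm CA.
# Jobs are indexed 0..n-1 with processing times p, weights w, due dates d.

def weighted_tardiness(perm, p, w, d):
    """Goal function (1): sum of w_i * max(0, C_i - d_i) along the permutation."""
    total, c = 0, 0
    for i in perm:
        c += p[i]                          # completion time C_i
        total += w[i] * max(0, c - d[i])   # weighted tardiness of job i
    return total

def ca(p, w, d):
    n = len(p)
    # Enumerate the jobs so that p1/w1 >= p2/w2 >= ... >= pn/wn.
    order = sorted(range(n), key=lambda i: p[i] / w[i], reverse=True)
    perm = [order[0]]
    for i in order[1:]:
        # Insert job i on the position that minimizes the partial cost.
        perm = min(
            (perm[:k] + [i] + perm[k:] for k in range(len(perm) + 1)),
            key=lambda q: weighted_tardiness(q, p, w, d),
        )
    return perm
```

For an instance with all due dates equal to zero this reduces to Smith's WSPT rule; e.g. `ca([1, 2, 3], [3, 2, 1], [0, 0, 0])` returns `[0, 1, 2]`.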
Bull. Pol. Ac.: Tech. 65(2) 2017 Unauthenticated Download Date | 4/25/17 7:48 PM
Parameter selection. To determine the parameters' values, preliminary experiments were conducted on randomly generated instances (5 per each number of jobs n = 40, 50, 100). After an analysis of the obtained results we assumed the length of the tabu list lTS(iter) = ⌈n⌉ and the number of iterations of the algorithm Maxiter = n².
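With these settings, the surrounding tabu search loop (its full design is given in Chapter 3.2, outside this excerpt) can be sketched as follows. The adjacent-swap neighbourhood, the move attributes stored on the tabu list and the aspiration rule are our illustrative assumptions, not necessarily the paper's exact design.

```python
# Hedged tabu search sketch for the TWT problem; parameter choices
# (tabu list of length n, Maxiter = n**2) follow the paper.
from collections import deque

def twt(perm, p, w, d):
    """Goal function (1): total weighted tardiness of a permutation."""
    total, c = 0, 0
    for i in perm:
        c += p[i]
        total += w[i] * max(0, c - d[i])
    return total

def tabu_search(p, w, d, start):
    n = len(p)
    tabu = deque(maxlen=n)              # FIFO tabu list of length l_TS
    cur = list(start)
    best, best_cost = cur[:], twt(cur, p, w, d)
    for _ in range(n * n):              # Maxiter = n^2 iterations
        cand, cand_cost, cand_move = None, float("inf"), None
        for a in range(n - 1):          # adjacent-swap neighbourhood
            move = (min(cur[a], cur[a + 1]), max(cur[a], cur[a + 1]))
            q = cur[:]
            q[a], q[a + 1] = q[a + 1], q[a]
            q_cost = twt(q, p, w, d)
            # Aspiration: a tabu move is allowed if it improves the best cost.
            if (move not in tabu or q_cost < best_cost) and q_cost < cand_cost:
                cand, cand_cost, cand_move = q, q_cost, move
        if cand is None:
            break                       # every move is tabu and non-improving
        cur = cand
        tabu.append(cand_move)
        if cand_cost < best_cost:
            best, best_cost = cand[:], cand_cost
    return best, best_cost
```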
6.1. Comparative study. First, the quality of the solutions designated by the individual algorithms from (i) was examined. For the probabilistic algorithms it was assumed that the execution times of the tasks are random variables with the normal distribution (see case (A)). We compared the results of the deterministic (CA, AD) and probabilistic (AP1, AP2) algorithms with the results taken from the OR-Library page [20]. For each test instance we computed the values: F^A – the goal function value found by the algorithm A ∈ {CA, AD, AP1, AP2}, and (F^A − UB)/UB · 100% – the relative percentage deviation of F^A from the best known upper bound UB (taken from [20]). For each size n (group of test instances) we calculated the following values: δaprd – the average value (over 125 instances) of the relative percentage deviation of the cost function value F^A (found by the algorithm A) from the best known upper bound, and δmrpd – the maximal relative percentage deviation from the upper bound value.

The results obtained for the test problems of class (i) are presented in Table 2. A comparison of the average relative errors shows that, regardless of the number of tasks, the deterministic algorithm AD finds significantly better solutions than both probabilistic algorithms. The average relative error of the deterministic algorithm is 0.51% and is considerably smaller (by about 30%) than the errors of the probabilistic algorithms; this is especially clear for examples with a greater number of tasks. Both probabilistic algorithms AP1 and AP2 find similar solutions, since their errors differ little and are respectively 0.70% and 0.75%. As predicted, the worst algorithm turned out to be the constructive algorithm CA: its average relative error was 6.06%, almost 12 times greater than the error of the algorithm AD. The average maximal errors (δmrpd) show similar proportions; by far the best was the algorithm AD. The worst solutions found by this algorithm differ on average by 0.67% from the best currently known values given in [20]. The computation time of one algorithm for all the 375 examples did not exceed 5 seconds.

Table 2
Comparison of results of the algorithms with results taken from the OR-Library [20] (class (i))

   n  | Algorithm CA | Algorithm AD | Algorithm AP1 | Algorithm AP2
      | δaprd  δmrpd | δaprd  δmrpd | δaprd  δmrpd  | δaprd  δmrpd
  40  | 4.21   5.23  | 0.11   0.16  | 0.19   0.26   | 0.17   0.21
  50  | 5.33   7.57  | 0.24   0.32  | 0.28   0.39   | 0.31   0.44
 100  | 8.64  10.14  | 1.17   1.54  | 1.63   2.42   | 1.76   2.16
 all  | 6.06   7.65  | 0.51   0.67  | 0.70   1.02   | 0.75   0.94

Since for examples with the number of tasks greater than 100 there is no reference data in the literature, we compared the results of the algorithms AD, AP1 and AP2 with the results of the constructive algorithm CA. The calculations were performed on the test instances of class (ii); the obtained results are presented in Table 3. They are similar to those listed in Table 2: the highest average (relative) improvement on the solutions found by the CA algorithm, –12.40%, was obtained by the AD algorithm. Both stochastic algorithms find similar solutions; their average improvements differ only slightly and are respectively –8.96% and –8.59%. Similar proportions were observed for the maximal improvements. The computation time of one algorithm for all 600 examples did not exceed 3 minutes. Taking into consideration the simplicity of the algorithms and the small number of iterations, it must be emphasized that the solutions found are fully satisfactory.

Table 3
Comparison of the results of the AD, AP1 and AP2 algorithms with the results of the constructive algorithm CA (for data of the class (ii))

   n  | Algorithm AD   | Algorithm AP1  | Algorithm AP2
      | δaprd   δmrpd  | δaprd   δmrpd  | δaprd   δmrpd
 200  |  –8.74  –11.27 |  –5.14   –8.44 |  –5.93   –7.93
 300  | –10.87  –14.55 |  –9.38  –12.37 |  –8.17  –12.11
 400  | –13.41  –17.84 |  –9.98  –13.53 |  –9.81  –14.72
 500  | –16.58  –19.26 | –11.37  –14.81 | –10.47  –13.52
 all  | –12.40  –15.73 |  –8.96  –12.28 |  –8.59  –12.07

6.2. Stability of algorithms. In order to investigate the stability of the algorithms, disturbed data sets were generated. The basis was constituted by the 975 examples of deterministic data from the set Ω described in (i) and (ii). For each example of deterministic data δ = (pi, wi, di), i = 1, 2, …, n, δ ∈ Ω, 100 examples of disturbed data D(δ) were generated according to the following distributions:
a) problem 1| p̃i ∼ N(pi, α·pi) |ΣwiTi: p̃i ∼ N(pi, 0.2·pi),
b) problem 1| d̃i ∼ N(di, c·di) |ΣwiTi: d̃i ∼ N(di, 0.1·di),
c) problem 1| p̃i ∼ E(αi, λ) |ΣwiTi: p̃i ∼ E(pi, 1),
d) problem 1| d̃i ∼ E(βi, μ) |ΣwiTi: d̃i ∼ E(di, 1).
In total, for each variant a), b), c) and d) of the probabilistic problem, 97 500 examples of disturbed data were generated. They were then solved by the algorithm AD, and these solutions constituted the basis for determining the stability coefficient of the examined algorithms. Table 4 shows the results for the problems with random parameters with the normal distribution. Both probabilistic algorithms have a stability coefficient significantly smaller than the deterministic algorithm.
For random times of the tasks' execution, the probabilistic algorithm AP1 has a stability coefficient of 3.91%, which is more than twice smaller than 8.25%, the coefficient of the deterministic algorithm AD. The coefficient 3.91% of the AP1 algorithm shows that a random disturbance of the tasks' execution times worsens the value of the goal function (in relation to the solutions determined by the algorithm AD) on average by 3.91%. In the case of random deadlines for the completion of the tasks, the smallest stability coefficient is that of the AP2 algorithm. It amounts to
Table 4
Stability coefficients (42) for examples with random parameters with the normal distribution (cases A and B)

      | Case A (p̃i ∼ N(pi, α·pi)) | Case B (d̃i ∼ N(di, c·di))
   n  |   AD     AP1     AP2      |   AD     AP1     AP2
  40  |  4.75    2.16    1.89     |  4.88    4.11    3.72
  50  |  4.99    2.08    2.25     |  6.03    3.87    3.54
 100  |  6.07    3.75    3.61     |  8.48    6.28    6.21
 200  |  6.83    3.63    4.38     | 13.27    7.55    7.66
 300  |  8.66    4.11    5.35     | 15.17    9.13    7.72
 400  | 11.07    4.82    5.89     | 17.75    8.96    9.74
 500  | 15.36    6.86    7.06     | 22.46   11.04   11.57
 Avg  |  8.25    3.91    4.35     | 12.58    7.23    7.16
7.16%. The AP1 algorithm has a slightly larger coefficient, 7.23%, whereas the deterministic algorithm AD has the largest one, 12.58%. The computational results for the examples with processing times or tasks' due dates with the Erlang distribution are given in Table 5. For the cases where the task parameters are random variables with the Erlang distribution, the stability of the solutions (presented in Table 5) is similar to that for the normal distribution (Table 4): both probabilistic algorithms are much more stable than the deterministic algorithm. In summary, on the basis of the given results it can be concluded that the probabilistic algorithms are significantly more stable; the solutions determined by them are much less sensitive to random changes in the parameters of the problem. For random processing times of the tasks the algorithm AP1 is more stable, whereas for random tasks' due dates slightly better is the algorithm AP2.

Table 5
Stability coefficients (42) for examples with random parameters with the Erlang distribution (cases C and D)

      | Case C (p̃i ∼ E(αi, λ))  | Case D (d̃i ∼ E(βi, μ))
   n  |   AD     AP1     AP2    |   AD     AP1     AP2
  40  |  2.56    1.83    2.01   |  4.89    3.34    2.86
  50  |  3.16    1.97    1.62   |  5.11    3.82    3.44
 100  |  4.27    2.64    3.07   |  7.93    5.56    4.73
 200  |  5.08    3.37    3.12   |  9.36    7.28    8.12
 300  |  7.11    4.27    4.68   | 13.17    9.88    7.87
 400  |  9.48    5.13    5.26   | 16.58    9.66   10.08
 500  | 13.88    5.76    6.24   | 21.43   10.08   11.17
 Avg  |  6.51    3.56    3.71   | 11.21    7.08    6.89
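The stability pipeline behind the coefficients (42) reported in Tables 4 and 5 can be sketched as follows. Here `solve_ad` is a deliberately simple WSPT stand-in for the paper's tabu search algorithm AD, only disturbance case (a) is shown, and the interpretation of the second parameter of N(·, ·) as a standard deviation is our assumption.

```python
# Hedged sketch of the stability measures Delta and S from (42).
import random

def twt_cost(perm, data):
    """Goal function (1) for data = (p, w, d)."""
    p, w, d = data
    total, c = 0, 0
    for i in perm:
        c += p[i]
        total += w[i] * max(0, c - d[i])
    return total

def solve_ad(data):
    """Deterministic stand-in solver (WSPT order) -- a placeholder for AD."""
    p, w, d = data
    return sorted(range(len(p)), key=lambda i: p[i] / w[i])

def disturb(data, alpha=0.2, rng=random):
    """Case (a) disturbance: random processing times around p_i."""
    p, w, d = data
    return ([max(1e-9, rng.gauss(pi, alpha * pi)) for pi in p], w, d)

def delta(pi_delta, disturbed):
    """Delta(A, delta, D(delta)): mean relative cost gap against re-solving."""
    vals = []
    for phi in disturbed:
        ref = twt_cost(solve_ad(phi), phi)          # f(AD, pi_phi, phi)
        vals.append((twt_cost(pi_delta, phi) - ref) / ref)
    return sum(vals) / len(vals)

def stability(examples, n_disturbed=100, rng=random):
    """S(A, Omega) from (42), in percent."""
    total = 0.0
    for data in examples:
        pi_delta = solve_ad(data)                   # solution for undisturbed data
        D = [disturb(data, rng=rng) for _ in range(n_disturbed)]
        total += delta(pi_delta, D)
    return 100.0 * total / len(examples)
```

On undisturbed data the gap is zero by construction, since the frozen solution coincides with the one obtained by re-solving.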
7. Remarks and conclusions
In this work we considered methods of modeling uncertain data with the use of random variables with the normal or Erlang distribution, which describe well the 'natural' randomness most often met in management practice. The paper presents the design of an algorithm based on the tabu search method for a single machine job scheduling problem. Computational experiments were conducted to investigate the stability of the algorithms, that is, the impact of disturbances of the problem parameters on changes in the value of the optimized criterion. The obtained results clearly indicate that much more stable are the probabilistic algorithms, that is, the algorithms in which a function of the central moments of the random goal function was adopted as the comparative criterion. Possible implementations of the proposed methods include the optimization of a single production nest of a manufacturing system (a bottleneck) and generalizations to multi-machine problems with uncertain parameters (e.g. dependent on the weather), which can be modeled by probabilistic processing times, for instance in the construction industry.
References
[1] B. Alidee, I. Dragon, "A note on minimizing the weighted sum of tardy and early completion penalties in a single machine", Journal of Operational Research 96 (3), 559–563 (1997).
[2] W. Bożejko, J. Grabowski, M. Wodecki, "Block approach – tabu search algorithm for single machine total weighted tardiness problem", Computers & Industrial Engineering 50, 1–14 (2006).
[3] W. Bożejko, M. Wodecki, "On the theoretical properties of swap multimoves", Operations Research Letters 35 (2), 227–231 (2007).
[4] W. Bożejko, "Parallel path relinking method for the single machine total weighted tardiness problem with sequence-dependent setups", Journal of Intelligent Manufacturing 21 (6), 777–785 (2010).
[5] X. Cai, X. Zhou, "Single-machine scheduling with exponential processing times and general stochastic cost functions", Journal of Global Optimization 31, 317–332 (2005).
[6] X. Cai, X. Wu, X. Zhou, Optimal Stochastic Scheduling, International Series in Operations Research & Management Science 207, Springer (2014).
[7] H.A.J. Crauwels, C.N. Potts, L.N. Van Wassenhove, "Local search heuristics for the single machine total weighted tardiness scheduling problem", INFORMS Journal on Computing 10 (3), 341–350 (1998).
[8] B.C. Dean, "Approximation algorithms for stochastic scheduling problems", PhD thesis, Massachusetts Institute of Technology (2005).
[9] M. den Besten, T. Stützle, M. Dorigo, "Design of iterated local search algorithms. An example application to the single machine total weighted tardiness problem", in J.W. Boers et al. (eds.), EvoWorkshops 2001, Lecture Notes in Computer Science 2037, 441–451 (2001).
[10] A. Elyasi, N. Salmasi, "Stochastic scheduling with minimizing the number of tardy jobs using chance constrained programming", Mathematical and Computer Modelling 57, 1154–1164 (2013).
[11] K. Hu, X. Zhang, M. Gen, J. Jo, "A new model for single machine scheduling with uncertain processing time", Journal of Intelligent Manufacturing, DOI 10.1007/s10845-015-1033-9, 1–9 (2015).
[12] W. Jang, "Dynamic scheduling of stochastic jobs on a single machine", European Journal of Operational Research 138, 518–503 (2002).
[13] G. Kirlik, C. Oguz, "A variable neighbourhood search for minimizing total weighted tardiness with sequence dependent setup times on a single machine", Computers & Operations Research 39, 1506–1520 (2012).
[14] A. Lann, G. Mosheiov, "Single machine scheduling to minimize the number of early and tardy jobs", Computers and Operations Research 23 (8), 769–781 (1996).
[15] J.K. Lenstra, A.H.G. Rinnooy Kan, P. Brucker, "Complexity of machine scheduling problems", Annals of Discrete Mathematics 1, 343–362 (1977).
[16] W. Li, W.J. Braun, Y.Q. Zhao, "Stochastic scheduling on a repairable machine with Erlang uptime distribution", Advances in Applied Probability 30 (4), 1073–1088 (1998).
[17] Y. Li, R. Chen, "Stochastic single machine scheduling to minimize the weighted number of tardy jobs", in B.-Y. Cao et al. (eds.), Fuzzy Information and Engineering, AISC, 363–368 (1998).
[18] M.M. Mazdeh, H. Hadad, P. Ghanbari, "Solving a single machine stochastic scheduling problem using a branch and bound algorithm and simulated annealing", International Journal of Management Science and Engineering Management 7 (2), 110–118 (2012).
[19] M. Nawaz, E.E. Enscore, I. Ham, "A heuristic algorithm for the m-machine, n-job flow-shop sequencing problem", OMEGA 11 (1), 91–95 (1983).
[20] OR-Library, https://files.nyu.edu/jeb21/public/jeb/info.html
[21] M.L. Pinedo, Scheduling: Theory, Algorithms, and Systems, fourth edition, Springer (2010).
[22] C.N. Potts, L.N. Van Wassenhove, "A branch and bound algorithm for the total weighted tardiness problem", Operations Research 33, 177–181 (1985).
[23] C.N. Potts, L.N. Van Wassenhove, "Single machine tardiness sequencing heuristics", IIE Transactions 23, 346–354 (1991).
[24] W.E. Smith, "Various optimizers for single-stage production", Naval Research Logistics Quarterly 3, 59–66 (1956).
[25] H.M. Soroush, "Single machine scheduling with stochastic processing times or stochastic due-dates to minimize the number of early and tardy jobs", International Journal of Operations Research 3 (2), 90–108 (2006).
[26] H.M. Soroush, "Minimizing the weighted number of early and tardy jobs in a stochastic single machine scheduling problem", European Journal of Operational Research 181, 266–287 (2007).
[27] J.M. Van den Akker, H. Hoogeveen, "Minimizing the number of late jobs in a stochastic setting using a chance constraint", Journal of Scheduling 11, 59–69 (2008).
[28] J. Vondrák, "Probabilistic methods in combinatorial and stochastic optimization", PhD thesis, MIT (2005).
[29] M. Wodecki, "A branch-and-bound parallel algorithm for single-machine total weighted tardiness problem", International Journal of Advanced Manufacturing Technology 37, 996–1004 (2008).
[30] M. Wodecki, "A block approach to earliness-tardiness scheduling problems", International Journal of Advanced Manufacturing Technology 40, 797–807 (2009).