ISSN 1750-9653, England, UK International Journal of Management Science and Engineering Management, 6(2): 84-93, 2011 http://www.ijmsem.org/
A two-stage PSO algorithm for job shop scheduling problem∗
Thongchai Pratchayaborirak, Voratas Kachitvichyanukul†
School of Engineering and Technology, Asian Institute of Technology, P. O. Box 4, Klong Luang, Pathumthani 12120, Thailand
(Received 22 November 2009, Revised 12 April 2010, Accepted 19 June 2010)
Abstract. This paper presents a two-stage particle swarm optimization algorithm for job shop scheduling problems. The algorithm is named 2S-PSO to reflect that two stages with multiple swarms are used. The random key representation is used, and schedules are constructed using a permutation with m-repetitions of job numbers. The performance of the algorithm is tested on benchmark instances for both the single-objective and multi-objective cases. For the single-objective cases, makespan and weighted tardiness are considered. For the multi-objective cases, three criteria are considered: makespan, weighted earliness, and weighted tardiness. The solutions are found using the weighted aggregate approach to allow comparison with published work. In general, the solutions obtained via 2S-PSO dominate the published results in that they provide equal or better solution quality in much shorter computational time. Keywords: two-stage PSO, particle swarm optimization, job shop scheduling problem, multi-objective, metaheuristic
1 Introduction
The classical job shop scheduling problem (JSP) is to schedule a group of jobs on a group of machines to optimize a certain criterion, subject to the constraints that the operations of each job follow a fixed precedence order and have fixed processing times known in advance. Various solution methods developed for job shop scheduling problems include exact mathematical programming techniques, priority dispatching rules, simulation-based methods, artificial intelligence-based methods, and heuristics. Many job shop scheduling problems are known to be NP-hard (Garey et al., 1976 [11]), i.e., the time required to reach an optimal solution may grow prohibitively as the problem size increases. This is the main motivation for many researchers to search for alternatives by developing heuristic techniques that achieve near-optimal solutions within acceptable response times. Furthermore, almost all real-world scheduling problems are multi-objective in nature. Because the objectives conflict, no single solution can simultaneously satisfy every element of a manufacturing strategy; even simple objectives conflict, such as minimizing tardiness, which reflects maximization of service level, and minimizing earliness, which reflects minimization of inventory. As a consequence, preferences must be set by the decision maker. Solving job shop scheduling problems with exact mathematical programming techniques is ineffective for large problems due to the long computation time. Over the years, many researchers have developed heuristics to search for near-optimal solutions instead of searching for the optimum. Yet heuristics are problem specific and may not be applicable to all situations. Metaheuristic approaches are the latest advancement in approximate search methods for solving intricate optimization problems. Previous applications of metaheuristic techniques to the JSP include Tabu Search (Nowicki and Smutnicki, 1996 [17]), Simulated Annealing (Matsuo et al., 1988 [16]; Van Laarhoven et al., 1992 [27]), Genetic Algorithms (Goncalves et al., 2005 [12]; Kachitvichyanukul and Sitthitham, 2009 [13]; Yamada and Nakano, 1995 [28]), Ant Colony Optimization (Udomsakdigool and Kachitvichyanukul, 2006 [24]; Udomsakdigool and Kachitvichyanukul, 2008 [25]) and Particle Swarm Optimization (Pongchairerks and Kachitvichyanukul, 2009 [19]; Pratchayaborirak and Kachitvichyanukul, 2007 [20]; Rookapibal and Kachitvichyanukul, 2006 [21]; Tasgetiren et al., 2004 [23]).
This research presents a two-stage particle swarm optimization algorithm for job shop scheduling problems. The algorithm uses the random key representation (Bean, 1994 [4]), and each particle is decoded into a permutation with m-repetitions of job numbers (Bierwirth, 1995 [5]). The evolutionary process is carried out in two stages and employs multiple populations that evolve independently. Four swarms are used in the first stage, each evolved using a combined aggregate objective function. Except for the first swarm, each successive swarm is composed of 80% newly generated particles and 20% particles randomly selected from the previous swarm. In stage 2, equal numbers of particles are randomly selected from all
∗ The joint activities received funding from the National Research Council of Thailand, the Engineering Management Program of Kasetsart University, and the Asian Institute of Technology. The research is a part of the research program of the High Performance Computing Group at Asian Institute of Technology (AITHPC) which received support from the Thai GRID project. The authors wish to thank the AITHPC and the Thai GRID Center for the support. †
Correspondence to: [email protected]
©International Society of Management Science And Engineering Management®
Published by World Academic Press, World Academic Union
four previous swarms for the final evolution process, using the same aggregate objective function. The algorithm is evaluated using the benchmark problems provided in the OR-Library and compared with the best known results from the published literature.
2 Job shop scheduling problem
The classical JSP is described as a situation in which n different jobs are to be scheduled on m different machines, subject to two main sets of constraints: the precedence constraints and the conflict constraints. Each job consists of sequential operations, and each operation has a deterministic processing time which is known in advance. The precedence constraints ensure that the operations of each job are processed sequentially, and the conflict constraints guarantee that each machine processes only one job at a time. The objective of the JSP is to sequence the operations on the machines and to specify the start and end times of each operation so as to minimize certain performance measures subject to the constraints. In addition, pre-emption is not considered. For the single-objective cases, makespan and weighted tardiness are considered. For the multi-objective cases, the solutions are found via a weighted aggregate function of three criteria: makespan minimization, total weighted earliness minimization and total weighted tardiness minimization.

Let W1, W2 and W3 be the weights of the three objective functions, i.e., makespan minimization, total weighted earliness minimization and total weighted tardiness minimization, respectively. In addition, let w_j be the earliness or tardiness weight for job j. The variables used in the model are described as follows:

p_{j,k}: the processing time of job j on machine k;
x_{j,k}: the start time of job j on machine k;
M: a large positive number;
r_j: the ready time of job j;
d_j: the due date of job j;
y_{j,j',k}: a binary variable equal to 1 if job j is before job j' on machine k, and 0 otherwise.

The mathematical model of the problem can be formulated as follows. Minimization of the aggregate function f_0:

min f_0 = W1 f_1 + W2 f_2 + W3 f_3.   (1)

Minimization of makespan f_1:

min f_1 = max_{j,k} { x_{j,k} + p_{j,k} }.   (2)

Minimization of total weighted earliness f_2:

min f_2 = Σ_{j,k} w_j × max{0, d_j − (x_{j,k} + p_{j,k})}.   (3)

subject to:

x_{j,k} + p_{j,k} ≤ x_{j,k'},  ∀ j, k, k' (machine k' follows machine k in the sequence of job j),   (5)
x_{j,k} + p_{j,k} ≤ x_{j',k} + M(1 − y_{j,j',k}),  ∀ j, j', k,   (6)
x_{j',k} + p_{j',k} ≤ x_{j,k} + M y_{j,j',k},  ∀ j, j', k,   (7)
x_{j,k} ≥ r_j ≥ 0,  ∀ j, k,   (8)
y_{j,j',k} binary,  ∀ j, j', k,   (9)

where j, j' ∈ {1, · · · , n} and k ∈ {1, · · · , m}. Eq. (1) is used for the minimization of multiple objectives, Eq. (2) for the objective of makespan minimization, Eq. (3) for the minimization of total weighted earliness and Eq. (4) for the minimization of total weighted tardiness. Eq. (5) represents the precedence constraints, and Eqs. (6) and (7) are the conflict constraints. Eq. (8) ensures that no job starts before its ready time.
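As a concrete illustration of the objectives in Eqs. (1)-(4), the sketch below evaluates the aggregate function from per-job completion times (each job's x_{j,k} + p_{j,k} on its last machine). The completion times, due dates, job weights, and W values are made-up illustrative numbers, not data from the paper.

```python
# Sketch of f0 = W1*f1 + W2*f2 + W3*f3, assuming the completion time C[j]
# of each job, its due date d[j], and its weight w[j] are already known.
def aggregate_objective(C, d, w, W1, W2, W3):
    f1 = max(C.values())                              # makespan
    f2 = sum(w[j] * max(0, d[j] - C[j]) for j in C)   # total weighted earliness
    f3 = sum(w[j] * max(0, C[j] - d[j]) for j in C)   # total weighted tardiness
    return W1 * f1 + W2 * f2 + W3 * f3

C = {'A': 18, 'B': 14, 'C': 12, 'D': 13}   # illustrative completion times
d = {'A': 15, 'B': 15, 'C': 15, 'D': 15}   # illustrative common due date
w = {'A': 4, 'B': 2, 'C': 2, 'D': 1}       # illustrative job weights
print(aggregate_objective(C, d, w, 0.4, 0.3, 0.3))  # → 13.8 (0.4*18 + 0.3*10 + 0.3*12)
```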
3 Representation of solutions
In PSO, the individual is a particle, and a fixed number of particles form a swarm. From iteration to iteration, PSO searches by moving the particles. The representation of a particle serves the same role as that of a chromosome in a genetic algorithm. The solution representation is discussed here via a simple illustrative example: a job shop problem with 4 jobs and 3 machines, given in Tab. 1.

Table 1 The 4 × 3 example of JSP
Job | Machine sequence | Processing times
A   | M1  M2  M3       | 4  9  5
B   | M3  M1  M2       | 5  5  4
C   | M3  M2  M1       | 6  3  3
D   | M1  M3  M2       | 3  4  6
A solution of the problem can be represented using a vector with dimension equal to the number of jobs times the number of machines. For the example problem, the vector holds 12 values, as shown in Tab. 2. Each value in the vector is initialized with a uniform random number in the interval [0, 1]. These numbers are sorted and decoded into a permutation with m-repetitions of the n job numbers (Bierwirth, 1995 [5]); here, a permutation with 3 repetitions of the 4 job numbers. The sample particle shown in Tab. 2 is decoded into the sequence shown in Tab. 3. The advantage of this approach is that any permutation of this representation always leads to a feasible schedule. The sequence of operations in Tab. 3 is then transformed into a schedule by taking the operations from the list in order and allocating each to its required machine at the earliest available processing time that does not delay any already scheduled operation. This procedure yields an active schedule, as shown in Fig. 1.
Minimization of total weighted tardiness f_3:

min f_3 = Σ_{j,k} w_j × max{0, (x_{j,k} + p_{j,k}) − d_j}.   (4)

Table 2 An encoded particle for a 4 × 3 job shop
Task     |  1  2  3  4  5  6  7  8  9 10 11 12
Particle | .3 .5 .6 .8 .1 .2 .4 .8 .9 .2 .9 .2
IJMSEM email for subscription:
[email protected]
Table 3 The operation-based representation of a particle
Particle  | .3 .5 .6 .8 .1 .2 .4 .8 .9 .2 .9 .2
Operation |  B  C  C  C  A  A  B  D  D  A  D  B
Fig. 1 The schedule after decoding
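The mechanics of turning the decoded sequence into a schedule can be sketched as below for the 4 × 3 example: each listed operation starts at the earliest time allowed by its job's previous operation and its machine's availability. Note this simplified sketch is only semi-active (it does not fill earlier idle gaps on a machine, as the paper's active-schedule generator does), so its makespan is an upper bound on the active schedule's.

```python
# Build a semi-active schedule from the decoded operation sequence,
# using the machine routes and processing times of Tab. 1.
routes = {
    'A': [('M1', 4), ('M2', 9), ('M3', 5)],
    'B': [('M3', 5), ('M1', 5), ('M2', 4)],
    'C': [('M3', 6), ('M2', 3), ('M1', 3)],
    'D': [('M1', 3), ('M3', 4), ('M2', 6)],
}
sequence = ['B', 'C', 'C', 'C', 'A', 'A', 'B', 'D', 'D', 'A', 'D', 'B']

next_op = {j: 0 for j in routes}     # next operation index per job
job_ready = {j: 0 for j in routes}   # completion time of each job's last op
mach_ready = {}                      # completion time of each machine's last op
for job in sequence:
    machine, p = routes[job][next_op[job]]
    start = max(job_ready[job], mach_ready.get(machine, 0))
    job_ready[job] = mach_ready[machine] = start + p
    next_op[job] += 1

print(max(job_ready.values()))  # makespan of this semi-active schedule
```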
4 Algorithm 2S-PSO for JSP
This section briefly discusses the 2S-PSO algorithm. The overall framework of the 2S-PSO is illustrated in Fig. 2. The 2S-PSO algorithm works in two stages. Stage 1 includes four swarms, which are executed in sequence using the same objective function. When a swarm comes to an end, a fixed percentage of particles, set in advance, is randomly selected to migrate to the next swarm and join the newly initialized particles. By using this shared information from the previous swarm, the convergence of the solution is hastened. The first stage ends when the fourth swarm terminates.
The migration helps the convergence rate by utilizing the search experience of the previous swarm. In Stage 1, 20% of the particles from the previous swarm migrate to the next swarm. In Stage 2, equal numbers of particles are randomly selected from the four previous swarms to form a single swarm, and the PSO algorithm is repeated until the stopping condition is met. The best result obtained at the end of the second stage is used as the best answer found. The components of the 2S-PSO, including the local search, re-initialization and migration strategies, are discussed in the next subsections.
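The two-stage flow described above can be sketched as follows, with a toy one-dimensional objective standing in for the JSP fitness and a placeholder `evolve` step; all function names and the toy objective are illustrative assumptions, not the paper's implementation.

```python
import random

def evolve(swarm, iters=200):
    # Stand-in for one PSO run: random perturbation with greedy acceptance,
    # minimizing |x - 3| as a toy objective.
    for _ in range(iters):
        for i, x in enumerate(swarm):
            cand = x + random.uniform(-0.1, 0.1)
            if abs(cand - 3.0) < abs(x - 3.0):
                swarm[i] = cand
    return swarm

def new_swarm(n):
    return [random.uniform(-10, 10) for _ in range(n)]

random.seed(0)
n = 100
# Stage 1: four swarms in sequence; each later swarm takes 20% migrants
# from the previous swarm plus 80% newly initialized particles.
prev = evolve(new_swarm(n))
stage1 = [prev]
for _ in range(3):
    migrants = random.sample(prev, int(0.2 * n))
    prev = evolve(new_swarm(n - len(migrants)) + migrants)
    stage1.append(prev)
# Stage 2: equal shares (25%) from each Stage-1 swarm form the last swarm.
last = [x for s in stage1 for x in random.sample(s, n // 4)]
best = min(evolve(last, iters=400), key=lambda x: abs(x - 3.0))
print(round(best, 2))
```

The 20% migration fraction and the equal 25% shares mirror the framework of Fig. 2.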
Table 4 The parameters used in 2S-PSO
Makespan weight: 0.4; Earliness weight: 0.4; Tardiness weight: 0.3

Parameter            | Phase I | Phase II
Cp and Cg            | 1.6     | 1.6
wmax                 | 0.9     | 0.9
wmin                 | 0.6     | 0.6
Vmax                 | 8       | 8
Rratio               | 0.8     | 0.8
Rstart^th            | 150     | 150
Riter                | 100     | 100
Sstart^th            | 160     | 160
Siter                | 30      | 30
Maximum iteration    | 500     | 1000
Number of particles  | 100     | 100
Migration fraction   | 0.2     | 0.2
Number of swarms     | 4       | 1
4.1 Parameters of PSO
The particles' movement in 2S-PSO is defined using the standard PSO algorithm of Kennedy and Eberhart (1995) [14], which includes the inertia term, the cognitive learning term, and the social learning term. The parameter set for both stages follows the values used in Pratchayaborirak and Kachitvichyanukul (2007) [20], i.e., {Cp, Cg, wmax, wmin, vmax, Rratio, Rstart^th, Riter, Sstart^th, Siter} = {1.6, 1.6, 0.9, 0.6, 8, 0.8, 150, 100, 160, 30}. The number of particles is set to 100, and the maximum number of iterations for stage I and stage II is set to 500 and 1000, respectively. The parameters used in 2S-PSO are summarized in Tab. 4.
4.2 Local search, re-initialization and migration strategies
Theoretically, local search attempts to improve solution quality by searching for better solutions around its neighbors. The 2S-PSO adopts the CB neighborhood (Yamada and Nakano, 1995 [28]). Since local search is usually very time consuming, it is activated in 2S-PSO only on the best particle, once a certain iteration number is reached, and the procedure is repeated every fixed number of iterations. The re-initialization strategy is applied to diversify the particles over the search space from time to time, to avoid the propensity of being trapped in local optima; it likewise starts when a certain iteration number is reached and is repeated every fixed number of iterations.
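A minimal sketch of a single particle update under these parameters, using the standard inertia-weight PSO formulation with Cp = Cg = 1.6, velocity clamped at Vmax = 8, and w decreased linearly from 0.9 to 0.6 (a common schedule; the paper states only wmax and wmin, so the linear decrease is an assumption):

```python
import random

def pso_step(x, v, pbest, gbest, it, max_it,
             cp=1.6, cg=1.6, w_max=0.9, w_min=0.6, v_max=8.0):
    w = w_max - (w_max - w_min) * it / max_it  # linearly decreasing inertia
    new_v, new_x = [], []
    for d in range(len(x)):
        vd = (w * v[d]
              + cp * random.random() * (pbest[d] - x[d])   # cognitive term
              + cg * random.random() * (gbest[d] - x[d]))  # social term
        vd = max(-v_max, min(v_max, vd))                    # velocity clamping
        new_v.append(vd)
        new_x.append(x[d] + vd)
    return new_x, new_v

random.seed(1)
x, v = [0.3, 0.5, 0.6], [0.0, 0.0, 0.0]
x, v = pso_step(x, v, pbest=[0.2, 0.6, 0.5], gbest=[0.1, 0.7, 0.4],
                it=10, max_it=500)
print(x)
```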
IJMSEM email for contribution:
[email protected]
5 Experimental results
The experiments are implemented in the C# language with Microsoft Visual Studio 6.0. For each experiment, the parameters are input to the program as discussed above. The experiments are performed on a desktop computer with a Pentium 4, 2.8 GHz processor and 512 MB RAM. The objective of the experiments is to find solutions for the combined three objectives: minimize makespan, minimize total weighted earliness, and minimize total weighted tardiness. Nevertheless, the 2S-PSO algorithm can also be used for one or two objectives with only a modification of the fitness function. The benchmark instances are investigated to evaluate the effectiveness and efficiency of the proposed algorithm in terms of both solution quality and computation time.
[Fig. 2 flowchart. Stage 1: swarms evolve independently — Start → Swarm 1 (100% initialization) → Swarms 2, 3 and 4 (each 80% newly initialized particles plus 20% migrated from the previous swarm). Stage 2: the last swarm is initialized by migrating equal shares (25%) of the evolved particles from all four Stage-1 swarms → End.]
Fig. 2 The overall framework of 2S-PSO

5.1 Single objective cases

5.1.1 JSP with the minimization of makespan
The results are taken from 43 benchmark problems from two types of standard JSP: Fisher and Thompson (1963) [10] and Lawrence (1984) [15]. The PSO parameter setting is shown in Tab. 4. Each instance is solved 10 times by the 2S-PSO algorithm. To evaluate the performance of the 2S-PSO algorithm, the results from the following heuristic approaches are selected for comparison of solution quality:
• Genetic algorithm (GA) by Croce et al. (1995) [7].
• Genetic algorithm (P-GA) by Dorndorf and Pesch (1995) [8].
• Greedy randomized adaptive search procedures (GRASP) by Binato et al. (2002) [6].
• Two-stage genetic algorithm (2ST-GA) by Kachitvichyanukul and Sitthitham (2009) [13].

Tab. 5 shows the comparison of the quality of the solutions. Column 3 presents the best known solution (BKS). For each algorithm, the results show the best solution found and the error as a percentage of the best known solution. Column 4 presents the results from GA by Croce et al. (1995) [7], column 5 the results from P-GA by Dorndorf and Pesch (1995) [8], column 6 the results from Binato et al. (2002) [6], and column 7 the results from 2ST-GA by Kachitvichyanukul and Sitthitham (2009) [13]. Finally, the last column shows the results from 2S-PSO. Comparing the 2S-PSO with the aforementioned algorithms, based on the best solution among 10 replications:
• The 2S-PSO yields the best solutions in 21 of 43 benchmark problems.
• The 2S-PSO outperforms GA by Croce et al. (1995) [7] in 5 of 8 instances.
• The 2S-PSO performs better than P-GA by Dorndorf and Pesch (1995) [8] in 25 of 26 instances.
• The 2S-PSO obtains better solutions than GRASP by Binato et al. (2002) [6] in 13 of 19 instances.
• The 2S-PSO outperforms 2ST-GA by Kachitvichyanukul and Sitthitham (2009) [13] in all 14 instances.

5.1.2 JSP with total weighted tardiness
Because the benchmark problems from the OR-Library were designed for the makespan minimization cases, a due date and a priority for each job must be added. In practice, it is frequently observed that 20% of the jobs are essential, 60% are common, and the remaining 20% are not very important. Thus, supposing there are 10 jobs in a shop, the weights are specified as 4 for jobs 1 and 2 (important), 2 for jobs 3 to 8 (regular), and 1 for jobs 9 and 10 (not important). All release dates are set to 0. The due date of job j is set equal to the release date plus the sum of the processing times of its operations multiplied by a due date tightness factor t (Eilon and Chowdhury, 1976 [9]):

d_j = r_j + t × Σ_{i=1}^{10} p_ij,   (10)

where d_j is the due date of job j, r_j is the ready time of job j, t is a tightness factor, e.g. 1.5 or 1.6, and p_ij is the processing time of operation i of job j.

The results are obtained over 24 benchmark instances from two classes of standard JSP: Adams, Balas and Zawack (1988) [1] and Lawrence (1984) [15]. The parameter setting follows Tab. 4. Each instance is solved 10 times by the 2S-PSO algorithm. To compare the performance of the 2S-PSO algorithm with other existing heuristic approaches, the results from the 2S-PSO are compared with the results from Pinedo and Singer (1999) [18] and Asano and Ohta (2002) [3] for the 10 × 10 benchmark instances.

Tab. 6 illustrates the experimental results for these benchmark instances. There are 12 instances with t = 1.5 and 12 instances with t = 1.6. Generally, the t = 1.5 instances are more difficult than the t = 1.6 instances because their due dates are tighter. The last 5 jobs and 5 machines of instances LA21-LA24 were eliminated to reduce them to the smaller 10 × 10 size. The best known solutions in column 3 were obtained by the branch and bound method (Singer and Pinedo, 1998 [22]). Column 4 shows the results from Asano and Ohta (2002) [3]. Columns 5 and 6 show the results from Pinedo and Singer (1999) [18] and Kachitvichyanukul and Sitthitham (2009) [13], respec-
Table 5 Makespan case: comparison of 2S-PSO with GA, P-GA, GRASP and 2ST-GA
(columns: Instance | Size | BKS | GA, Croce et al. (1995) [7] best (%error) | P-GA, Dorndorf and Pesch (1995) [8] best (%error) | GRASP, Binato et al. (2002) [6] best (%error) | 2ST-GA [13] best (%error) | 2S-PSO best (%error))

FT06 | 06 × 06 | 55   | n/a          | n/a           | *            | *             | *
FT10 | 10 × 10 | 930  | 946 (1.72)   | 960 (3.23)    | 938 (0.86)   | 952 (2.37)    | 937^abcd (0.75)
FT20 | 20 × 05 | 1165 | 1178 (1.12)  | 1249 (7.21)   | 1169 (0.34)  | 1245 (6.87)   | 1192^bd (2.32)
LA01 | 10 × 5  | 666  | *            | *             | *            | *             | *
LA02 | 10 × 5  | 655  | 666 (1.68)   | 681 (3.97)    | *            | *             | *^ab
LA03 | 10 × 5  | 597  | 666 (11.56)  | 620 (3.85)    | 604 (1.17)   | 603 (1.01)    | *^abcd
LA04 | 10 × 5  | 590  | n/a          | 620 (5.08)    | *            | *             | *^b
LA05 | 10 × 5  | 593  | n/a          | *             | *            | *             | *
LA06 | 15 × 5  | 926  | *            | *             | *            | *             | *
LA07 | 15 × 5  | 890  | n/a          | *             | *            | *             | *
LA08 | 15 × 5  | 863  | n/a          | *             | *            | *             | *
LA09 | 15 × 5  | 951  | n/a          | *             | *            | *             | *
LA10 | 15 × 5  | 958  | n/a          | *             | *            | *             | *
LA11 | 20 × 5  | 1222 | *            | *             | *            | *             | *
LA12 | 20 × 5  | 1039 | n/a          | *             | *            | *             | *
LA13 | 20 × 5  | 1150 | n/a          | *             | *            | *             | *
LA14 | 20 × 5  | 1292 | n/a          | *             | *            | *             | *
LA15 | 20 × 5  | 1207 | n/a          | 1237 (2.49)   | *            | *             | *^b
LA16 | 10 × 10 | 945  | 979 (3.60)   | 1008 (6.67)   | 946 (0.11)   | 955.3 (1.09)  | *^abcd
LA17 | 10 × 10 | 784  | n/a          | 809 (3.19)    | *            | 786 (0.26)    | *^bd
LA18 | 10 × 10 | 848  | n/a          | 916 (8.02)    | *            | 850.3 (0.27)  | 850^bd (0.24)
LA19 | 10 × 10 | 842  | n/a          | 880 (4.51)    | *            | 853 (1.31)    | 851^bd (1.07)
LA20 | 10 × 10 | 902  | n/a          | 928 (2.88)    | 907 (0.55)   | 907 (0.55)    | 907^bcd (0.55)
LA21 | 15 × 10 | 1046 | 1097 (4.88)  | 1139 (8.89)   | 1091 (4.30)  | n/a           | 1092^ab (4.40)
LA22 | 15 × 10 | 927  | n/a          | 998 (7.66)    | 960 (3.56)   | 983.2 (6.06)  | 957^bcd (3.24)
LA23 | 15 × 10 | 1032 | n/a          | 1072 (3.88)   | *            | 1034.3 (0.22) | *^bd
LA24 | 15 × 10 | 935  | n/a          | 1014 (8.45)   | 978 (4.60)   | n/a           | 967^bc (3.42)
LA25 | 15 × 10 | 977  | n/a          | 1014 (3.79)   | 1028 (5.22)  | n/a           | 1015^c (3.89)
LA26 | 20 × 10 | 1218 | 1231 (1.07)  | 1278 (4.93)   | 1271 (4.35)  | 1288.6 (5.80) | 1266^bcd (3.94)
LA27 | 20 × 10 | 1235 | n/a          | 1378 (11.58)  | 1320 (6.88)  | n/a           | 1308^bc (5.91)
LA28 | 20 × 10 | 1216 | n/a          | 1327 (9.13)   | 1293 (6.33)  | n/a           | 1256^bc (3.29)
LA29 | 20 × 10 | 1152 | n/a          | 1336 (15.97)  | 1293 (12.24) | n/a           | 1251^bc (8.59)
LA30 | 20 × 10 | 1355 | n/a          | 1411 (4.13)   | 1368 (0.96)  | 1439.6 (6.24) | 1387^bcd (2.36)
LA31 | 30 × 10 | 1784 | *            | n/a           | *            | n/a           | *
LA32 | 30 × 10 | 1850 | n/a          | n/a           | *            | n/a           | 1857 (0.38)
LA33 | 30 × 10 | 1719 | n/a          | n/a           | *            | n/a           | 1719 (0.00)
LA34 | 30 × 10 | 1721 | n/a          | n/a           | 1753 (1.86)  | n/a           | 1766 (2.61)
LA35 | 30 × 10 | 1888 | n/a          | n/a           | *            | n/a           | *
LA36 | 15 × 15 | 1268 | 1305 (2.92)  | 1373 (8.28)   | 1334 (5.21)  | 1355.6 (6.91) | 1311^bcd (3.39)
LA37 | 15 × 15 | 1397 | n/a          | 1498 (7.23)   | 1457 (4.29)  | 1486.8 (6.43) | 1473^bd (5.44)
LA38 | 15 × 15 | 1196 | n/a          | 1296 (8.36)   | 1267 (5.94)  | n/a           | 1282^b (7.19)
LA39 | 15 × 15 | 1233 | n/a          | 1351 (9.57)   | 1290 (4.62)  | n/a           | 1283^bc (4.06)
LA40 | 15 × 15 | 1222 | n/a          | 1352 (10.64)  | 1259 (3.03)  | n/a           | 1290^b (5.56)

* solution found by 2S-PSO is better than or equal to the best known solution. (a) solution found by 2S-PSO is better than or equal to GA by Croce et al. (1995) [7]; (b) better than or equal to P-GA by Dorndorf and Pesch (1995) [8]; (c) better than or equal to GRASP by Binato et al. (2002) [6]; (d) better than or equal to 2ST-GA by Kachitvichyanukul and Sitthitham (2009) [13].
Table 6 Total weighted tardiness for the 10 × 10 cases: comparison of 2S-PSO with Asano and Ohta (2002) [3], Pinedo and Singer (1999) [18] and 2ST-GA (Kachitvichyanukul and Sitthitham, 2009 [13])
(columns: Instance | Size | BKS | Asano and Ohta [3] best (error) | Pinedo and Singer [18] best (error) | 2ST-GA [13] best (error), time [sec] | 2S-PSO best (error), time [sec])

t = 1.5
ABZ5 | 10 × 10 | 69  | 736 (667)   | 109 (40) | * (*), 4711   | 75^ab (6.0), 31
ABZ6 | 10 × 10 | 0   | * (*)       | * (*)    | * (*), 4585   | * (*), 30
FT10 | 10 × 10 | 394 | 1024 (630)  | * (*)    | 637 (243), 4656  | 566.5^ac (172.5), 32
LA16 | 10 × 10 | 166 | * (*)       | 178 (12) | 212 (46), 4599   | 203.5^c (37.5), 32
LA17 | 10 × 10 | 260 | 573 (313)   | * (*)    | 428 (168), 4532  | 260^ac (0.0), 30
LA18 | 10 × 10 | 34  | 255 (221)   | 83 (49)  | 134 (100), 4523  | 78.5^abc (44.5), 29
LA19 | 10 × 10 | 21  | 494 (473)   | 76 (55)  | 47 (26), 4779    | 63^ab (42.0), 29
LA20 | 10 × 10 | 0   | 1246 (1246) | * (*)    | 54 (54), 4656    | 28.5^ac (28.5), 31
LA21 | 15 × 10 | 0   | 77 (77)     | 16 (16)  | 41 (41), 4710    | 4.5^abc (4.5), 27
LA22 | 15 × 10 | 196 | 537 (341)   | * (*)    | 312 (116), 4702  | 194.5*^abc (−1.5), 29
LA23 | 15 × 10 | 2   | 466 (464)   | * (*)    | * (*), 4746      | *^a (*), 30
LA24 | 15 × 10 | 82  | 465 (383)   | * (*)    | 106 (24), 4665   | 127^a (45.0), 28
Sum error |    |     | 4815        | 172      | 818              | 379

t = 1.6
ABZ5 | 10 × 10 | 0   | * (*)     | * (*)    | * (*), 3826    | * (*), 30
ABZ6 | 10 × 10 | 0   | * (*)     | * (*)    | * (*), 3763    | * (*), 31
FT10 | 10 × 10 | 141 | 538 (397) | 184 (43) | 305 (164), 4843 | 193.8^ac (52.8), 32
LA16 | 10 × 10 | 0   | 20 (20)   | 14 (14)  | 42 (42), 3819   | 6.8^abc (6.8), 30
LA17 | 10 × 10 | 65  | 129 (64)  | 81 (16)  | 97 (32), 3680   | 63.8*^abc (−1.2), 30
LA18 | 10 × 10 | 0   | 35 (35)   | * (*)    | * (*), 3696     | *^a (*), 30
LA19 | 10 × 10 | 0   | * (*)     | * (*)    | * (*), 3754     | * (*), 28
LA20 | 10 × 10 | 0   | 89 (89)   | * (*)    | * (*), 3734     | *^a (*), 30
LA21 | 15 × 10 | 0   | * (*)     | * (*)    | * (*), 4711     | * (*), 29
LA22 | 15 × 10 | 0   | 260 (260) | * (*)    | 13 (13), 4770   | *^ac (*), 29
LA23 | 15 × 10 | 0   | 96 (96)   | * (*)    | * (*), 4770     | *^a (*), 28
LA24 | 15 × 10 | 0   | 124 (124) | * (*)    | * (*), 4663     | *^a (*), 28
Sum error |    |     | 1085      | 73       | 251             | 58.4

* solution is equal to the best known solution. (a) solution found by 2S-PSO is better than or equal to Asano and Ohta (2002) [3]; (b) better than or equal to Pinedo and Singer (1999) [18]; (c) better than or equal to 2ST-GA by Kachitvichyanukul and Sitthitham (2009) [13].
tively. The last column shows the solutions achieved by the 2S-PSO. Over all 24 instances, the 2S-PSO yielded new best known solutions for the LA22 instance with t = 1.5 and the LA17 instance with t = 1.6. Comparing the 2S-PSO with Asano and Ohta (2002) [3]:
• The 2S-PSO provided better solutions for 18 of 19 instances based on the best solution among 10 replications.
• The 2S-PSO solved 13 instances to optimality, including the better-than-best known solutions, while the algorithm by Asano and Ohta (2002) [3] solved 6 instances to optimality.
• The deviation of the results from the 2S-PSO is significantly lower compared with the others.

Comparing the 2S-PSO with the results from Pinedo and Singer (1999) [18]:
• The 2S-PSO yielded better solutions in 7 of 13 instances relative to the best known solution.
• The 2S-PSO solved 13 instances to optimality, including the better-than-best known solutions, while the algorithm by Pinedo and Singer solved 16 instances to optimality.
• The sum of error for the algorithm by Pinedo and Singer is lower in the t = 1.5 cases but not in the t = 1.6 cases.

Comparing the 2S-PSO with the 2ST-GA:
• The 2S-PSO solved 13 instances to optimality, including the better-than-best known solutions, while the 2ST-GA solved 11 instances to optimality.
• The deviation of the results from the 2S-PSO is considerably lower compared with that from the 2ST-GA.
Moreover, the 15 × 15 benchmark instances solved by Asano and Ohta (2002) [3] and by the 2S-PSO algorithm are compared: five instances with t = 1.5 and five with t = 1.6. Because Asano and Ohta (2002) [3] solved these instances using an approximation method, the best known solutions they found are not guaranteed to be optimal. Tab. 7 shows the computational results for instances LA36-LA40 achieved by 2S-PSO compared with the results found by Asano and Ohta (2002) [3]. Comparing the 2S-PSO with Asano and Ohta (2002) [3]:
• The 2S-PSO found new best known solutions in all five t = 1.5 instances and in 2 of 5 instances for t = 1.6, based on the best solution among 10 replications.
• The sum of error for the 2S-PSO is negative because its solutions are better than the previous best known solutions.

Table 7 Total weighted tardiness for the 15 × 15 cases: comparison of 2S-PSO with Asano and Ohta (2002) [3]
(columns: Instance | Size | BKS | Asano and Ohta [3] best (error) | 2S-PSO best (error))

t = 1.5
LA36 | 15 × 15 | 2928 | 2928 (0) | 1506 (−1422)
LA37 | 15 × 15 | 2761 | 2761 (0) | 1142 (−1619)
LA38 | 15 × 15 | 2236 | 2236 (0) | 1258 (−978)
LA39 | 15 × 15 | 966  | 966 (0)  | 407 (−599)
LA40 | 15 × 15 | 684  | 684 (0)  | 635 (−49)
Sum error |    |      | 0        | −4627

t = 1.6
LA36 | 15 × 15 | 1038 | 1038 (0) | 419 (−619)
LA37 | 15 × 15 | 448  | 448 (0)  | 270 (−178)
LA38 | 15 × 15 | 404  | 404 (0)  | 443 (39)
LA39 | 15 × 15 | 0    | 0 (0)    | 8 (8)
LA40 | 15 × 15 | 92   | 92 (0)   | 116 (24)
Sum error |    |      | 0        | −726

* solution found by 2S-PSO is better than the previous best known solution.

5.2 Multi-objective cases
The algorithm is evaluated on 92 benchmark instances from four classes of standard JSP: Fisher and Thompson (1963) [10], Lawrence (1984) [15], Adams et al. (1988) [1] and Applegate and Cook (1991) [2]. The parameter setting follows Tab. 4. Each instance is solved by 2S-PSO with 10 runs. To compare the performance of the 2S-PSO algorithm with other existing heuristic approaches, the results from 2S-PSO are compared with the results from the multi-stage genetic algorithm (MSGA) by Lam et al. (2005) [26] and the two-stage genetic algorithm (2ST-GA) by Kachitvichyanukul and Sitthitham (2009) [13].
Since the multiple objectives used in this research are makespan minimization, total weighted earliness minimization and total weighted tardiness minimization, the weights of the objectives are set equal to 0.3,
IJMSEM email for contribution:
[email protected]
0.3 and 0.4 for makespan, earliness and tardiness, respectively. Tab. 8 shows the comparison of results obtained from MSGA and 2ST-GA algorithms. By comparing the results obtained from 2S-PSO and MSGA; • The results obtained by 2S-PSO are better than the results from MSGA in 5 of 6 instances. Therefore, the 2S-PSO is capable in exploration of the new best known solutions for these instances. By comparing the results obtained from 2S-PSO and 2ST-GA; • The results from 2S-PSO are better than those of 2ST-GA in 3 of 6 instances; however, the 2S-PSO is more efficient in term of computational time. 5.2.1 Multi-objective for medium size cases For other medium size of instance, there are 23 instances with t = 1.5 and also 23 instances with t = 1.6. Theoretically, the instances with t = 1.5 is more difficult to solve because due dates of all jobs are tighter than the instances with t = 1.6. The results are solved by the 2S-PSO with the parameter setting as presented in Tab. 4. The computational results are shown in Tab. 9 and Tab. 10 for t = 1.5 case and t = 1.6 case, respectively. Table 9 The multi-objective results for medium instance compare to 2ST-GA (t = 1.5) Instance
Size
2ST-GA [13] Time 2ST-PSO Time Best
[Sec.]
Best
[Sec.]
t=1.5 FT06
6×6
37.4
833
38.4
9
FT10 10 × 10
680.5
3654
706.2
32
FT20 20 × 05
5177.7
4249
4220.1∗
59
LA01
10 × 5
821.9
1273
831.1
15
LA02
10 × 5
620.1
1252
625.3
17
LA03
10 × 5
719.8
1253
738.0
16
LA04
10 × 5
719.7
1221
755.3
14
LA05
10 × 5
680.7
1272
697.8
15
LA06
15 × 5
2018.6
2480
1936.6∗
34
LA07
15 × 5
2010.4
2438
2063.3
35
LA08
15 × 5
1941.3
2503
1931.9∗
32
LA09
15 × 5
2043.3
2554
2048.1
35
LA10
15 × 5
2173.6
2522
2169.5∗
33
LA11
20 × 5
3698.3
4420
4254.4
59
LA12
20 × 5
2917.0
4571
3687.8
56
LA13
20 × 5
3439.0
4435
4262.6
55
LA14
20 × 5
3914.1
4525
4502.1
59
LA15
20 × 5
3460.3
4403
4311.9
59 31
LA16 10 × 10
568.0
4649
567.9∗
LA17 10 × 10
575.4
4540
577.2
31
LA18 10 × 10
505.6
4539
516.5
31
LA19 10 × 10
482.1
4661
460.6∗
30
LA20 10 × 10
536.4
4653
553.6
31
* Solution found by 2S-PSO is better than 2ST-GA.
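The weighted aggregate objective described above can be sketched as follows. The function name and the example numbers are ours; the weights (0.3, 0.3, 0.4) are the ones stated in the text.

```python
# A minimal sketch of the weighted aggregate objective used in the
# multi-objective experiments: a single scalar to be minimized, formed from
# makespan, total weighted earliness, and total weighted tardiness.
def aggregate_fitness(makespan, weighted_earliness, weighted_tardiness,
                      weights=(0.3, 0.3, 0.4)):
    """Combine the three minimization criteria into one scalar."""
    w_ms, w_e, w_t = weights
    return w_ms * makespan + w_e * weighted_earliness + w_t * weighted_tardiness

# Example: makespan 100, total weighted earliness 40, total weighted tardiness 25
print(aggregate_fitness(100, 40, 25))  # 0.3*100 + 0.3*40 + 0.4*25, i.e. about 52.0
```

Because all three criteria are minimized, a smaller aggregate value is always preferred, which lets a single-objective PSO drive the multi-objective search.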
To evaluate the performance of the 2S-PSO algorithm, its results for the multi-objective JSP are compared with those of 2ST-GA.
Table 8 The multi-objective case: comparison of 2S-PSO with MSGA and 2ST-GA

Instance   Size      MSGA (Lam et al.,   2ST-GA [13]            2S-PSO
                     2005 [26]) Best     Best     Time [Sec.]   Best        Time [Sec.]
ABZ05      10 × 10   525.3               436.9    2903          519.5 a     32
ABZ06      20 × 15   383.1               321.0    29037         422.3       81
LA21       15 × 10   503.2               412.4    12901         368.5 ab    29
LA27       20 × 10   777.4               595.3    17483         553.7 ab    82
LA36       15 × 15   1060.4              619.1    22223         555.4 ab    96
ORB01      10 × 10   590.9               423.4    2920          458.3 a     32
(a) solution found by 2S-PSO is better than MSGA; (b) solution found by 2S-PSO is better than 2ST-GA.
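The (a)/(b) superscripts in Tab. 8 follow a simple comparison rule; a hypothetical reconstruction (the helper name is ours) is:

```python
# Hypothetical reconstruction of the (a)/(b) marks in Tab. 8: "a" means the
# 2S-PSO best aggregate fitness beats MSGA, "b" that it beats 2ST-GA.
def dominance_marks(pso_best, msga_best, ga2st_best):
    tag = ""
    if pso_best < msga_best:
        tag += "a"
    if pso_best < ga2st_best:
        tag += "b"
    return tag

print(dominance_marks(519.5, 525.3, 436.9))  # "a"  (ABZ05: beats MSGA only)
print(dominance_marks(368.5, 503.2, 412.4))  # "ab" (LA21: beats both)
```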
Comparing 2S-PSO and 2ST-GA (t = 1.5 cases):
• The results achieved by 2S-PSO are better than the results from 2ST-GA in 6 of 23 instances.
• In terms of computational time, 2S-PSO is much faster than the 2ST-GA algorithm.
Comparing 2S-PSO and 2ST-GA (t = 1.6 cases):
• The results achieved by 2S-PSO are better than the results from 2ST-GA in 1 of 23 instances.
• In terms of computational time, 2S-PSO is faster than the 2ST-GA algorithm.

5.2.2 Multi-objective for large size cases

For the large size instances, there are 20 benchmarks with t = 1.5 and 20 instances with t = 1.6. The instances are solved by 2S-PSO with the parameters shown in Tab. 4. The computational results are shown in Tabs. 11 and 12 for the t = 1.5 case and the t = 1.6 case, respectively. Comparing 2S-PSO and 2ST-GA (t = 1.5 cases):
• All of the results achieved by 2S-PSO are better than the results from 2ST-GA, in terms of both solution quality and computational time.
Comparing 2S-PSO and 2ST-GA (t = 1.6 cases):
• All of the results achieved by 2S-PSO are better than the results from 2ST-GA, in terms of both solution quality and computational time.

Table 10 The multi-objective results for medium instances compared to 2ST-GA (t = 1.6)

Instance   Size      2ST-GA [13]            2S-PSO
                     Best      Time [Sec.]  Best       Time [Sec.]
FT06       6 × 6     36.9      834          37.4       8
FT10       10 × 10   582.5     4537         606.6      31
FT20       20 × 5    3194.6    4564         3947.1     59
LA01       10 × 5    721.7     1614         741.4      15
LA02       10 × 5    552.7     1529         570.5      17
LA03       10 × 5    651.4     1604         660.4      16
LA04       10 × 5    668.2     1526         672.2      15
LA05       10 × 5    582.9     1595         585.5      15
LA06       15 × 5    1843.2    3242         1872.7     33
LA07       15 × 5    1859.9    3141         1936.1     35
LA08       15 × 5    1731.0    3179         1788.9     32
LA09       15 × 5    1786.5    3185         1896.6     33
LA10       15 × 5    1981.3    3318         1978.3*    33
LA11       20 × 5    3212.2    5559         4027.4     57
LA12       20 × 5    2842.6    5511         3510.9     56
LA13       20 × 5    3100.2    5487         3813.4     56
LA14       20 × 5    3638.5    5463         4349.0     58
LA15       20 × 5    3336.6    5291         4124.5     61
LA16       10 × 10   541.8     8272         558.1      31
LA17       10 × 10   479.7     4587         489.5      31
LA18       10 × 10   478.5     4598         517.8      31
LA19       10 × 10   442.1     4729         446.3      31
LA20       10 × 10   526.9     4537         534.8      30
* Solution found by 2S-PSO is better than 2ST-GA.

Table 11 The multi-objective results for large instances compared to 2ST-GA (t = 1.5)

Instance   Size      2ST-GA [13]             2S-PSO
                     Best       Time [Sec.]  Best        Time [Sec.]
LA21       15 × 10   1357.47    8081         505.8*      28
LA22       15 × 10   1485.18    8060         557.9*      30
LA23       15 × 10   1241.46    8120         446.5*      31
LA24       15 × 10   1112.58    8105         515.6*      29
LA25       15 × 10   1342.80    8095         574.5*      28
LA26       20 × 10   4905.54    17345        3746.2*     111
LA27       20 × 10   4747.86    16880        3715.3*     112
LA28       20 × 10   4527.63    17199        3626.9*     110
LA29       20 × 10   5277.69    17295        3764.7*     113
LA30       20 × 10   5541.21    16994        4071.1*     111
LA31       30 × 10   14406.56   30454        12604.1*    282
LA32       30 × 10   15757.28   30732        13932.7*    279
LA33       30 × 10   14139.20   31098        12048.9*    273
LA34       30 × 10   15237.20   30526        13063.2*    285
LA35       30 × 10   15877.20   29743        12051.3*    301
LA36       15 × 15   1844.91    21374        985.4*      94
LA37       15 × 15   2130.84    21836        1116.1*     96
LA38       15 × 15   1872.81    21559        1082.7*     91
LA39       15 × 15   1727.10    21677        1007.3*     87
LA40       15 × 15   1820.52    22016        945.3*      91
* Solution found by 2S-PSO is better than 2ST-GA.
6 Conclusions

A two-stage particle swarm optimization algorithm (2S-PSO) is developed to find optimal or near-optimal schedules for the job shop scheduling problem. In the first stage, the proposed algorithm runs a series of swarms with particle migration, which accelerates convergence by reusing information from the previous swarm. The randomly migrated particles are then collected from all swarms of the first stage and used as the initial population for the second stage. The iterative evolution of the second stage is based on an aggregate objective function that weights the three objectives and combines them into a single aggregate function. The 2S-PSO is evaluated using the benchmark problems provided by the OR-Library and compared with the best known results from published works for both the single and multi-objective cases. Based on the methodology and the experimental results, the following conclusions can be drawn. The two-stage particle swarm algorithm can efficiently achieve good solutions for both single and multi-objective job shop scheduling problems. Moreover, the 2S-PSO algorithm discovers 10 new best known solutions for the single-objective cases with the weighted tardiness objective. For the multi-criteria cases, the experimental results illustrate that the 2S-PSO algorithm is efficient in terms of both computational time and solution quality. For the medium size problems, the proposed algorithm was compared with 2ST-GA, and the solutions obtained for these cases are similar to the solutions achieved by 2ST-GA and MSGA.

Table 12 The multi-objective results for large instances compared to 2ST-GA (t = 1.6)

Instance   Size      2ST-GA [13]             2S-PSO
                     Best       Time [Sec.]  Best        Time [Sec.]
LA21       15 × 10   1115.4     8113         539.1*      29
LA22       15 × 10   1157.8     8205         512.7*      30
LA23       15 × 10   1029.9     8074         449.4*      32
LA24       15 × 10   855.8      8223         490.5*      29
LA25       15 × 10   1085.7     8084         553.8*      30
LA26       20 × 10   4096.0     17415        3380.8*     111
LA27       20 × 10   4615.2     17494        3223.8*     113
LA28       20 × 10   3976.6     17772        2844.6*     109
LA29       20 × 10   4718.3     17385        3164.0*     106
LA30       20 × 10   4887.4     17543        3834.7*     110
LA31       30 × 10   14265.1    30623        11205.6*    281
LA32       30 × 10   14300.2    30738        12882.3*    282
LA33       30 × 10   13705.0    30573        11045.1*    258
LA34       30 × 10   14579.2    30848        11959.9*    280
LA35       30 × 10   14583.2    30659        11754.0*    299
LA36       15 × 15   1620.9     21574        936.7*      91
LA37       15 × 15   1539.9     21374        915.1*      93
LA38       15 × 15   1731.6     21505        975.4*      91
LA39       15 × 15   1314.2     21440        879.7*      89
LA40       15 × 15   1334.6     21401        812.1*      90
* Solution found by 2S-PSO is better than 2ST-GA.
In addition, for the large size problems of the multi-criteria cases, the proposed algorithm performs best in terms of both computational time and solution quality compared to the solutions discovered by 2ST-GA and MSGA.
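The two-stage flow summarized in the conclusions can be sketched in Python. This is an illustrative skeleton under our own assumptions, not the authors' implementation: the PSO update is stubbed as random sampling of random-key vectors, and all names (`run_swarm`, `two_stage_pso`, the toy objective) are ours.

```python
import random

# Illustrative skeleton of the two-stage scheme: stage 1 runs several swarms
# serially, passing migrated particles forward; stage 2 re-optimizes a swarm
# seeded from the collected pool against the single aggregate objective.
def run_swarm(seed_particles, objective, dim, size=30):
    """Stand-in for one PSO run; here it only samples random-key vectors."""
    swarm = list(seed_particles) + [
        [random.random() for _ in range(dim)]
        for _ in range(max(0, size - len(seed_particles)))
    ]
    best = min(swarm, key=objective)                      # swarm's best particle
    migrants = random.sample(swarm, k=min(5, len(swarm)))  # particles passed on
    return best, migrants

def two_stage_pso(objective, dim, num_swarms=4):
    pool, seed = [], []
    for _ in range(num_swarms):          # stage 1: serial swarms with migration
        best, migrants = run_swarm(seed, objective, dim)
        seed = migrants                  # next swarm reuses migrated particles
        pool.extend(migrants + [best])
    final_best, _ = run_swarm(pool, objective, dim)  # stage 2: seeded swarm
    return final_best

# Toy objective over a random-key vector (a real JSP decoder would go here)
best = two_stage_pso(lambda x: sum(x), dim=10)
print(len(best))  # 10
```

In the actual algorithm the random-key vector would be decoded into a permutation with m-repetitions of job numbers and evaluated by the aggregate objective, as described in the paper.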
References

[1] Adams, J., Balas, E., and Zawack, D. (1988). The shifting bottleneck procedure for job shop scheduling. Management Science, 34(3):391–401.
[2] Applegate, D. and Cook, W. (1991). A computational study of the job-shop scheduling problem. ORSA Journal on Computing, 3(2):149–156.
[3] Asano, M. and Ohta, H. (2002). A heuristic for job shop scheduling to minimize total weighted tardiness. Computers and Industrial Engineering, 42(2-4):137–147.
[4] Bean, J. (1994). Genetic algorithms and random keys for sequencing and optimization. ORSA Journal on Computing, 6(2):154–160.
[5] Bierwirth, C. (1995). A generalized permutation approach to job shop scheduling with genetic algorithms. OR-Spektrum, 17(2):87–92. Special issue: Applied Local Search.
[6] Binato, S., Hery, J., et al. (2002). A GRASP for job shop scheduling. In Ribeiro, C. C., et al., editors, Essays and Surveys in Metaheuristics. Kluwer Academic Publishers.
[7] Croce, F., Tadei, R., and Volta, G. (1995). A genetic algorithm for the job shop problem. Computers and Operations Research, 22(1):15–24.
[8] Dorndorf, U. and Pesch, E. (1995). Evolution based learning in a job shop environment. Computers and Operations Research, 22(1):25–40.
[9] Eilon, S. and Chowdhury, G. (1976). Due dates in job shop scheduling. International Journal of Production Research, 14(2):233.
[10] Fisher, H. and Thompson, G. (1963). Probabilistic learning combinations of local job shop scheduling rules. In Muth, J. and Thompson, G., editors, Industrial Scheduling, pages 225–251. Prentice-Hall, Englewood Cliffs, NJ.
[11] Garey, M., Johnson, D., and Sethi, R. (1976). The complexity of flowshop and jobshop scheduling. Mathematics of Operations Research, 1:117–129.
[12] Gonçalves, J. F., Mendes, J. J. M., and Resende, M. G. C. (2005). A hybrid genetic algorithm for the job shop scheduling problem. European Journal of Operational Research, 167:77–95.
[13] Kachitvichyanukul, V. and Sitthitham, S. (2009). A two-stage multi-objective genetic algorithm for job shop scheduling problems. Journal of Intelligent Manufacturing. DOI: 10.1007/s10845-009-0294-6.
[14] Kennedy, J. and Eberhart, R. (1995). Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, pages 1942–1948, Piscataway, NJ.
[15] Lawrence, S. (1984). Supplement to "Resource constrained project scheduling: an experimental investigation of heuristic scheduling techniques." Technical Report, GSIA, Carnegie Mellon University.
[16] Matsuo, H., Suh, J., and Sullivan, S. (1988). A controlled search simulated annealing method for the general job-shop scheduling problem. Graduate School of Business, The University of Texas at Austin, Austin, Texas, USA.
[17] Nowicki, E. and Smutnicki, C. (1996). A fast tabu search algorithm for the job shop problem. Management Science, 42(6):797–813.
[18] Pinedo, M. and Singer, M. (1999). A shifting bottleneck heuristic for minimizing the total weighted tardiness in a job shop. Naval Research Logistics, 46(1):1–17.
[19] Pongchairerks, P. and Kachitvichyanukul, V. (2009). A two-level particle swarm optimization algorithm on job-shop scheduling problems. International Journal of Operational Research, 4(4):390–411.
[20] Pratchayaborirak, T. and Kachitvichyanukul, V. (2007). A two-stage particle swarm optimization for multi-objective job shop scheduling problems. In Proceedings of the 8th Asia Pacific Industrial Engineering and Management Society Conference, pages 9–13, Kaohsiung, Taiwan.
[21] Rookapibal, L. and Kachitvichyanukul, V. (2006). Particle swarm optimization for job shop scheduling. In Proceedings of the International Computers and Industrial Engineering Conference, Taipei, Taiwan.
[22] Singer, M. and Pinedo, M. (1998). A computational study of branch and bound techniques for minimizing the total weighted tardiness in job shops. IIE Transactions, 30(2):109–118.
[23] Tasgetiren, F., Sevkli, M., et al. (2005). Particle swarm optimization and differential evolution algorithms for job shop scheduling. Submitted to Journal of Heuristics.
[24] Udomsakdigool, A. and Kachitvichyanukul, V. (2006). Two-way scheduling approach in ant algorithm for solving job shop problems. International Journal of Industrial Engineering and Management Systems, 5(2):68–75.
[25] Udomsakdigool, A. and Kachitvichyanukul, V. (2008). Multiple colony ant algorithm for job-shop scheduling problem. International Journal of Production Research, 46(15):4155–4175.
[26] Lam, N., Kachitvichyanukul, V., and Luong, T. (2005). A multi-stage parallel genetic algorithm for multi-objective job shop scheduling. In Proceedings of the APIEMS 2005 Conference, Manila, Philippines.
[27] Van Laarhoven, P., Aarts, E., and Lenstra, J. (1992). Job shop scheduling by simulated annealing. Operations Research, 40(1):113–125.
[28] Yamada, T. and Nakano, R. (1995). A genetic algorithm with multi-step crossover for job-shop scheduling problems. In Proceedings of the IEE/IEEE International Conference on Genetic Algorithms in Engineering Systems: Innovations and Applications, pages 146–151.