A GENETIC SIMPLIFIED SWARM ALGORITHM FOR OPTIMIZING n-CITIES OPEN LOOP TRAVELLING SALESMAN PROBLEM
CHIENG HOCK HUNG
UNIVERSITI TUN HUSSEIN ONN MALAYSIA
UNIVERSITI TUN HUSSEIN ONN MALAYSIA
STATUS CONFIRMATION FOR MASTER'S THESIS

A GENETIC SIMPLIFIED SWARM ALGORITHM FOR OPTIMIZING n-CITIES OPEN LOOP TRAVELLING SALESMAN PROBLEM

ACADEMIC SESSION: 2015/2016

I, CHIENG HOCK HUNG, agree to allow this Master's Thesis to be kept at the library under the following terms:

1. This Master's Thesis is the property of Universiti Tun Hussein Onn Malaysia.
2. The library has the right to make copies for educational purposes only.
3. The library is allowed to make copies of this report for educational exchange between higher educational institutions.
4. **Please mark (√):

   CONFIDENTIAL (Contains information of high security or of great importance to Malaysia as STIPULATED under the OFFICIAL SECRET ACT 1972)

   RESTRICTED (Contains restricted information as determined by the organization/institution where the research was conducted)

   FREE ACCESS

Approved by,

(WRITER'S SIGNATURE)                                  (SUPERVISOR'S SIGNATURE)

Permanent address:                                    DR. NOORHANIZA WAHID
P.O BOX 520, 98700 LIMBANG,
SARAWAK, MALAYSIA.

Date: __________________________                      Date: __________________________

NOTE: ** If this Master's Thesis is classified as CONFIDENTIAL or RESTRICTED, please attach the letter from the relevant authority/organization stating reasons and duration for such classification.
A GENETIC SIMPLIFIED SWARM ALGORITHM FOR OPTIMIZING n-CITIES OPEN LOOP TRAVELLING SALESMAN PROBLEM
CHIENG HOCK HUNG
A thesis submitted in fulfillment of the requirement for the award of the Degree of Master of Information Technology
Faculty of Computer Science and Information Technology Universiti Tun Hussein Onn Malaysia
JANUARY 2016
I hereby declare that the work in this thesis is my own except for quotations and summaries which have been duly acknowledged.
Student    : ......................................................  CHIENG HOCK HUNG

Date       : 18 JANUARY 2016

Supervisor : .......................................................  DR. NOORHANIZA WAHID
DEDICATION
Dedicated to the Lord Almighty God; Heavenly Father, Jesus Christ and Holy Spirit. Papa, Mama, Lydia, Debrorah, and Hope’s family.
ACKNOWLEDGEMENT
Firstly, I would like to give thanks to my best friend, Heavenly Father, for His unconditional love and favors in my life. He is the source of my strength, wisdom and knowledge, and He has empowered me to soar on wings like an eagle above the raging sea. Secondly, I would like to thank my supervisor, Dr Noorhaniza Wahid, who was the direct contributor to this work; I express my thankfulness to her from the bottom of my heart. Her understanding, wisdom and personal experience have provided a good basis for this thesis. She always guided and pointed me in the right direction toward the fulfillment of this project, and she kindly shared her life experiences to encourage and inspire me throughout this research journey. I am grateful to my beloved parents and sisters for their support, provision, care and love; they constantly encouraged me and prayed for me along my Master's research journey. In addition, I would like to thank Szakif Enterprize for providing the thesis proofreading service. Lastly, my special gratitude goes to my brothers and sisters in Hope Life Group.

I wish to express my deepest and most sincere thanks to all who accompanied me with prayers, love, concern, support and advice over the past two years. This research project would not have been completed without the help of many, including the writing of others, who are acknowledged within the reference section.
Thank you!
PUBLICATION
Chieng, H. H., and Wahid, N. (2014). A Performance Comparison of Genetic Algorithm's Mutation Operators in n-Cities Open Loop Travelling Salesman Problem. In Recent Advances on Soft Computing and Data Mining, pp. 89-97. Springer International Publishing. (Indexed by ISI, DBLP, EI Compendex, Scopus).
ABSTRACT
The Open Loop Travelling Salesman Problem (OTSP) is an extension of the Travelling Salesman Problem (TSP) in which the shortest tour through a number of cities is sought such that each city is visited exactly once and the tour does not return to the starting city. In the past, the TSP and OTSP have been applied in various vehicle routing systems to optimize the route distance. However, real-life transportation problems are often not as pictured in the OTSP, because not all cities are required to be visited; the tour is restricted to a certain number of n cities. Therefore, a new problem called the n-Cities Open Loop Travelling Salesman Problem (nOTSP) is proposed. The Genetic Algorithm (GA) has been a popular algorithm for solving TSPs. However, GA often suffers from premature convergence due to the difficulty of preventing the loss of genetic diversity in the population. Therefore, the Genetic Simplified Swarm Algorithm (GSSA) is proposed in this study to overcome this drawback of GA. GSSA is an improved GA-based algorithm that incorporates a characteristic of the Simplified Swarm Optimization (SSO) algorithm called the Solution Update Mechanism (SUM). The SUM is modified by embedding three GA mutation operators. GSSA is then used to optimize the nOTSP in terms of finding the shortest tour. The performance of GSSA is compared with GA without a crossover operator (GA-XX) and GA with a one-point crossover operator (GA-1X). Performance is measured based on the shortest distance and the average shortest distance found by each algorithm. The influence of population size on the algorithms is also investigated. The experimental results show that GSSA can discover shorter tours than GA-XX and GA-1X. The study also found that most of the good solutions are discovered with the larger population sizes, from 3000 to 5000.
ABSTRAK
The Open Loop Travelling Salesman Problem (OTSP) is one extension of the Travelling Salesman Problem (TSP) that seeks the shortest route under the condition that every city is visited only once and the route does not return to the city where the journey began. However, real situations, for example transportation problems, are not the same as the OTSP: most vehicles do not visit every city, but are limited to only n cities. Therefore, a new problem named the n-Cities Open Loop Travelling Salesman Problem (nOTSP) is proposed in this study. In the past, the Genetic Algorithm (GA) has been a popular algorithm for solving TSP problems. However, GA often suffers from premature convergence caused by the difficulty of preventing the loss of genetic diversity. Therefore, an algorithm named the Genetic Simplified Swarm Algorithm (GSSA) is proposed in this study to overcome this weakness of GA. GSSA is an improvement of GA that incorporates a characteristic of the Simplified Swarm Optimization (SSO) algorithm. This characteristic is known as the Solution Update Mechanism (SUM). The SUM is first modified by embedding three GA mutation operators. GSSA is then applied to optimize the nOTSP in terms of finding the shortest route. In addition, the performance of GSSA is compared with that of GA without a crossover operator (GA-XX) and GA with a one-point crossover operator (GA-1X). The performance of the algorithms is evaluated through the shortest distance and the average shortest distance obtained. Furthermore, the influence of population size on the algorithms is also studied. The results clearly show that GSSA is able to explore shorter routes than GA-XX and GA-1X. In addition, this study also found that most of the good solutions are found with larger population sizes, namely from 3000 to 5000.
CONTENTS

TITLE
DECLARATION
DEDICATION
ACKNOWLEDGEMENT
PUBLICATION
ABSTRACT
ABSTRAK
CONTENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ALGORITHMS
LIST OF SYMBOLS AND ABBREVIATIONS
LIST OF APPENDICES

CHAPTER 1 INTRODUCTION
1.1 Background of Study
1.2 Problem Statement
1.3 Objectives of the Study
1.4 Scope of the Study
1.5 Thesis Outline

CHAPTER 2 LITERATURE REVIEW
2.1 Introduction
2.2 The Concept of Optimization
    2.2.1 Continuous Optimization
    2.2.2 Combinatorial Optimization
2.3 Travelling Salesman Problem (TSP)
    2.3.1 Variants of TSP
    2.3.2 Open Loop Travelling Salesman Problem (OTSP)
    2.3.3 n-Cities Open Loop Travelling Salesman Problem (nOTSP)
2.4 Genetic Algorithm
    2.4.1 Genetic Operators
    2.4.2 Mutation Operators
        2.4.2.1 Inversion Mutation
        2.4.2.2 Displacement Mutation
        2.4.2.3 Pairwise Swap Mutation
    2.4.3 Application of GAs on TSPs
2.5 Swarm Intelligence
2.6 Simplified Swarm Optimization
    2.6.1 Algorithmic Structure and Flow
2.7 Chapter Summary

CHAPTER 3 RESEARCH METHODOLOGY
3.1 Introduction
3.2 Research Framework
3.3 nOTSP Topology Development
    3.3.1 Nodes Generation
    3.3.2 Nodes Mapping
    3.3.3 Determining the Starting and Ending Points
    3.3.4 nOTSP Creation
3.4 Genetic Simplified Swarm Algorithm (GSSA)
    3.4.1 GSSA Development
    3.4.2 Process of GSSA
3.5 Experimental Setup
    3.5.1 Pre-processing: Determining the MaxGen
    3.5.2 Parameter Setting and Data Collection
3.6 Performance Evaluation
    3.6.1 Optimal Solution and Average
    3.6.2 Influence of Population Size on Algorithm
3.7 Chapter Summary

CHAPTER 4 ANALYSIS AND RESULT
4.1 Introduction
4.2 Performance Analysis and Results
    4.2.1 Shortest Path and Average Distance
    4.2.2 Discussion on Algorithms' Performance
        4.2.2.1 New Characteristic of the Algorithm
        4.2.2.2 Adequate Genetic Diversity
        4.2.2.3 Destructive Effect of Crossover Operator
        4.2.2.4 Crossing Path
4.3 The Influence of the Population Size on Algorithms
4.4 Chapter Summary

CHAPTER 5 CONCLUSION
5.1 Introduction
5.2 Research Contributions
    5.2.1 New Variant of TSP
    5.2.2 Improved GA with SSO's Characteristic
    5.2.3 Performance of Proposed Algorithm-GSSA
    5.2.4 Influence of the Population Size
5.3 Conclusion

REFERENCES
APPENDICES
VITA
LIST OF TABLES

2.1 Related literature on GAs for TSPs
3.1 Generated coordinates of each node
3.2 The symmetric distance-matrix
3.3 Fixed and adjusted parameters
3.4 Pre-processing results for n=10
3.5 Pre-processing results and MaxGen for n=10, 20, 30 and 40
3.6 Characteristics of the algorithms
4.1 The shortest distances discovered from the executed 10 runs using GSSA, GA-XX and GA-1X
4.2 The shortest distances discovered by GSSA, GA-XX and GA-1X for n=10, 20, 30 and 40
4.3 Average distance over 10 runs using GSSA, GA-XX and GA-1X
4.4 Influence of the population size on the algorithms
LIST OF FIGURES

2.1 Illustration of the vehicle routing problem
2.2 Illustration of the Travelling Salesman Problem
2.3 Variants of TSP
2.4 The difference between OTSP and nOTSP
2.5 Before and after the inversion mutation was performed
2.6 Before and after the displacement mutation was performed
2.7 Before and after the pairwise swap mutation was performed
3.1 Research process
3.2 Nodes mapped in Cartesian coordinate plane
3.3 The starting and ending points in the Cartesian coordinate plane
3.4 Nodes position before and after the manipulation of starting and ending points
3.5 New location of the starting and ending points
3.6 Population of chromosomes in matrix form
3.7 Arrangement of the first and second chromosomes with number of genes n in a matrix
3.8 The flowchart of the SSO
3.9 The flowchart of the GA
3.10 SUM with three decision points
3.11 New SUM after modification was performed
3.12 Embedding process of the new SUM into the GA
3.13 The flowchart of the Genetic Simplified Swarm Algorithm
3.14 Fixed and adjusted parameters
4.1 Cloning effect caused by crossover operator
4.2 Simulation of nOTSP's paths
4.3 Influence of the population size on the GSSA
4.4 Influence of the population size on the GA-XX
4.5 Influence of the population size on the GA-1X
LIST OF ALGORITHMS

2.1 Genetic Algorithm
2.2 Particle Swarm Optimization
2.3 Simplified Swarm Optimization
3.1 Genetic Simplified Swarm Algorithm
LIST OF SYMBOLS AND ABBREVIATIONS

d        - Distance
D        - Variable domain
E        - Edge
f(x)     - Function of x
G        - Graph
gbest    - Global best
gloFit   - Global fitness
i        - Location of the particle in PSO
Inf      - Infinity
maxFit   - Maximum fitness value
MaxGen   - Maximum generation
m        - Number of total given cities
min      - Smallest elements in array (MATLAB syntax)
n        - Number of visited cities
newFit   - New fitness
p        - Population size
pbest    - Local best
randperm - Random permutation (MATLAB syntax)
rand     - Random (MATLAB syntax)
s        - Optimal solution
S        - Global optimal solution
t        - Time
V        - Vertex
w        - Inertia weight
zeros    - Create array of all zeros (MATLAB syntax)
α        - Personal experience
β        - Population experience
ℝ        - Real number
v_i^t    - Particle velocity in PSO
x_i^t    - Particle position in PSO
c_i      - Constraints
C_w, C_p, C_g - Predetermined constants
ACO      - Ant Colony Optimization
ABC      - Artificial Bee Colony
AI       - Artificial Intelligence
AIS      - Artificial Immune System
AS       - Ant System
BF       - Bacterial Foraging
CSO      - Cat Swarm Optimization
EA       - Evolutionary Algorithm
DPfGA    - Distributed Parameter Free Genetic Algorithm
DPSO     - Discrete Particle Swarm Optimization
GA       - Genetic Algorithm
GA-1X    - Genetic Algorithm with one-point crossover operator
GA-GSTM  - Greedy Sub Tour Mutation operator
GA-XX    - Genetic Algorithm without crossover operator
GNSS     - Global Navigation Satellite System
GSO      - Glowworm Swarm Optimization
GSSA     - Genetic Simplified Swarm Algorithm
nOTSP    - n-Cities Open Loop Travelling Salesman Problem
OTSP     - Open Loop Travelling Salesman Problem
PSO      - Particle Swarm Optimization
SA       - Simulated Annealing
SUM      - Solution Update Mechanism
TS       - Tabu Search
TSP      - Travelling Salesman Problem
LIST OF APPENDICES

APPENDIX A: Computational Results of GSSA, GA-XX and GA-1X
APPENDIX B: Pre-processing results for the predetermined MaxGen on n=10, 20, 30 and 40
CHAPTER 1
INTRODUCTION
1.1 Background of Study
Road networks play a significant role in the economic growth and development of a city. They can be described as the equivalent of the veins in the human body, with vehicles as the blood cells that carry nutrition from one part of the body to another. In transportation practice, it is important for the transportation system to identify the best route to navigate drivers to their destination. For example, the fleet of an emergency evacuation unit, such as ambulances and fire trucks, is required to reach the emergency site quickly by using the shortest tour. Another example is the Global Positioning System (GPS), which drivers often use to navigate and to show them the shortest route to an unfamiliar place. To this day, many researchers are still improving real-world vehicle routing systems in order to provide more effective and efficient routes to travel (Toth & Vigo, 2014; Gomez & Salhi, 2014; Royo et al., 2015). One of the common vehicle routing problems in computer science is the Travelling Salesman Problem (TSP) (El-Gharably et al., 2013), which is used to simulate and solve routing problems. The TSP is a well-known and important combinatorial optimization problem (Ausiello et al., 2012). It was first introduced by an Irish mathematician, William Rowan Hamilton, and a British mathematician, Thomas Penyngton Kirkman, in the 1800s (Matai et al., 2010). Later, the problem was formulated by Karl Menger in 1930 (Maredia, 2010). The TSP is closely related to the Hamiltonian path problem, which concerns a path in an undirected or directed graph that visits all the vertices exactly once (Abdoun & Abouchabaka, 2012).
However, the idea of the TSP concerns a salesman who is supposed to travel by visiting all the given cities exactly once and return to the city he started from, using the shortest route. Although the TSP is devoted to a complete closed Hamiltonian path, it can generally be divided into two categories: the Closed Loop TSP and the Open Loop TSP (OTSP). The Closed Loop TSP is the same as the ordinary TSP, while the OTSP differs slightly: it has different starting and ending points. In other words, the salesman in the OTSP visits each city exactly once by departing from one city but does not return to the city from which he departed. However, today's transportation issues are not exactly as described by the TSP and OTSP. On the contrary, numerous transportation problems do not require "visiting all the given cities"; visiting only a certain number of cities rather than all of them can lead to a shorter tour distance. For example, in the logistics of merchandise delivery services, drivers are required to plan a route from the depot to the destination without visiting all the cities along the way; they are restrained to a certain number of cities for cost and time saving purposes. Inspired by this issue, this research proposes a new extension of the OTSP called the n-Cities Open Loop Travelling Salesman Problem (nOTSP). The nOTSP can be illustrated through a scenario where a salesman is given a set of cities but is required to visit only a certain number of them in a minimum tour. Over the past decades, many algorithms have been successfully applied to a wide range of combinatorial problems, including TSPs. These algorithms include Tabu Search (TS) (Pedro et al., 2013), the Genetic Algorithm (GA) (Nagata & Soler, 2012) and Simulated Annealing (SA) (Wang et al., 2013). Among them, GA is one of the most popular algorithms used to solve permutation problems such as the TSP (Ahmed, 2010). GA generates a set of possible solutions through permutation of genes; hence, solutions of the TSP can be easily represented as permutations of genes in GA. Besides GA, swarm-based algorithms have also been used to solve high-complexity problems such as TSPs. Examples of swarm-based algorithms that have been used to solve TSPs in the past are Particle Swarm Optimization (PSO) (Eberhart & Kennedy, 1995), Ant Colony Optimization (ACO) (Colorni et al., 1992), the Artificial Bee Colony algorithm (ABC) (Karaboga, 2005) and the Bat algorithm (BA) (Yang, 2010). Recently, another new swarm-based algorithm was proposed, named Simplified Swarm Optimization (SSO) (Chung & Wahid, 2012). SSO is a variant of PSO. In the past, SSO has been used to solve classification problems and has shown good performance. One of the reasons for its success is its special characteristic, known as the comparison strategy, whose purpose is to update the global best (gbest) solution once a better solution is found. Therefore, this research proposes a GA-based algorithm that adopts this characteristic of SSO in GA. The algorithm is named the Genetic Simplified Swarm Algorithm (GSSA), and it is then used to optimize the nOTSP.
1.2 Problem Statement
The TSP is a well-known combinatorial problem that is often used to model vehicle routing issues in transportation scenarios. However, real problems are not always exactly as pictured in the TSP and OTSP. Instead, a vehicle may travel from the starting point to the ending point by visiting only a certain number of cities with minimum total travelling distance. For example, a logistics services company in charge of delivering goods from the depot to the destination may visit only several of the cities, without passing through all the cities along the route, in order to keep the distance to a minimum. To address this issue, this research models a variant of the TSP called the nOTSP. In recent years, nature-inspired algorithms have become common and popular in the context of optimization. Among them, GA has been highlighted for its good performance in solving many combinatorial problems such as the TSP (Ahmed, 2010; Dwivedi et al., 2012; Bahaabadi et al., 2012). However, GA often suffers from the tendency to converge towards local optima, also known as premature convergence (Vashisht, 2013). Loss of genetic diversity is often considered the primary reason for premature convergence in GA (Gupta & Ghafir, 2012), and premature convergence is generally due to insufficient diversity within the population (Malik & Wadhwa, 2014). Therefore, ensuring adequate genetic diversity is crucial for the algorithm to avoid being trapped at local optima (Gupta & Ghafir, 2012; Malik & Wadhwa, 2014). To overcome this drawback, this research proposes an improved GA with an SSO characteristic in order to prevent the loss of genetic diversity and improve the solutions.
1.3 Objectives of the Study
The objectives of this research are:
i.   to propose a new OTSP variant named the n-Cities Open Loop Travelling Salesman Problem (nOTSP),
ii.  to propose an improved Genetic Algorithm (GA) technique with a Simplified Swarm Optimization (SSO) algorithm characteristic to prevent the loss of genetic diversity in the population,
iii. to develop the proposed technique in (ii) for optimizing the nOTSP in terms of finding the shortest path, and
iv.  to evaluate the performance of the proposed technique against other GA variants in terms of shortest distance and population size.
1.4 Scope of the Study
This research focuses on a single vehicle travelling from a given starting point to an ending point in the nOTSP with n cities to be visited. The performance of the proposed algorithm and of the other GAs is compared and analyzed in terms of the shortest tour and the influence of the population size on the solutions. In this research, the total number of cities m is set to 50 as an experimental test case representing real cities; thus a set of 50 cities is represented as nodes, all generated randomly by computer. However, there is a high possibility that the randomly generated starting and ending points appear close to each other. Therefore, the starting and ending points are manipulated to ensure that they are far apart, as pictured in the real-world scenario. In addition, four data sets are employed, in which the number of visited cities n is set to 10, 20, 30 and 40. Moreover, each n is tested with 5 different population sizes p, namely 1000, 2000, 3000, 4000 and 5000. During each experiment, each n is executed 10 times on each p. The performance of the proposed algorithm is compared with two other GA variants, GA without a crossover operator (GA-XX) and GA with one-point crossover (GA-1X). Computational time and iteration count are not taken into account in this research.
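As a concrete illustration of this experimental design, the sketch below loops over the three algorithms, the four values of n and the five population sizes, executing 10 runs for each combination and recording the shortest and average distances. It is only an outline under stated assumptions: run_trial is a hypothetical placeholder for one execution of an algorithm, not the thesis implementation.

```python
# Sketch of the experimental grid described in this section: 3 algorithms,
# 4 values of n, 5 population sizes, 10 runs each. run_trial is a hypothetical
# placeholder for one execution of GSSA, GA-XX or GA-1X on an nOTSP instance
# with m = 50 randomly generated nodes; it is not the thesis implementation.
import random

def run_trial(algorithm, n, population_size, seed):
    # Placeholder: a real trial would build the instance, run the named
    # algorithm with the given population size and return the best distance.
    rng = random.Random(seed)
    return rng.uniform(100.0, 500.0)

N_VALUES = [10, 20, 30, 40]                        # number of visited cities, n
POPULATION_SIZES = [1000, 2000, 3000, 4000, 5000]  # population size, p
RUNS = 10

results = {}
for algorithm in ("GSSA", "GA-XX", "GA-1X"):
    for n in N_VALUES:
        for p in POPULATION_SIZES:
            distances = [run_trial(algorithm, n, p, r) for r in range(RUNS)]
            # Record the shortest and the average distance over the 10 runs,
            # the two performance measures used in Chapter 4.
            results[(algorithm, n, p)] = (min(distances), sum(distances) / RUNS)
```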
1.5 Thesis Outline
The remaining chapters are structured as follows. Chapter 2 provides the fundamental theories regarding optimization, the TSP, GA, SSO and their applications. This is followed by a review of research conducted by past researchers and scholars in similar fields. In addition, this chapter lays a foundation for constructing the nOTSP and the proposed algorithm. Chapter 3 then discusses the methodology used in this research: how the problem is constructed, how the proposed algorithm is developed, how the experiments are carried out systematically, and how the results are recorded, calculated and analyzed. Chapter 4 presents the analysis and results of this research. In this chapter, the proposed algorithm developed in Chapter 3 is further validated for its efficiency and accuracy based on the recorded experimental results; the analysis and evaluations are carried out based on the computational results, and the underlying reasons are discussed. Lastly, Chapter 5 concludes the research and findings, summarizes the contributions of the proposed algorithm, and gives recommendations for future work.
CHAPTER 2
LITERATURE REVIEW
2.1 Introduction
Many techniques have been proposed to optimize the Travelling Salesman Problem (TSP) or its variants in terms of discovering the shortest route. Although the literature covers a wide variety of theories, this review focuses on four dominant themes related to the research topic: the concept of optimization and its categories; the definition of the TSP and the n-Cities Open Loop Travelling Salesman Problem (nOTSP); the Genetic Algorithm (GA) as the basis of the proposed technique and its application to similar problems; and the introduction of Simplified Swarm Optimization (SSO) and its comparison strategy, called the Solution Update Mechanism (SUM). Although these themes have been presented in a variety of contexts, this study focuses on the improved technique and its application to the nOTSP. This chapter thus lays the foundation for the development discussed in the next chapter.
2.2 The Concept of Optimization
Optimization happens everywhere and at any time, ranging from simple to complex problems in daily routines. In the industrial and scientific worlds, optimization plays a significant role in controlling and maintaining performance by minimizing or maximizing an objective function. For instance, business organizations have to maximize their profit and minimize cost, and engineering design has to maximize the performance of the designed product while, of course, minimizing its cost at the same time (Yang, 2008).

The root of "optimization" is "optimal", which carries the meaning of "best", "better" or "good enough" (Keeton et al., 2007; Fletcher, 2013). In other words, the phrase "optimal solution" can be read as "best solution". Blum & Roli (2003) described optimization as concerning the choice of a "best" configuration of a set of variables in order to achieve a goal. Optimization can also be defined as choosing the best solution among a given set of solutions (Khajehzadeh et al., 2011). Optimization theory and methods are therefore needed to deal with such problems by selecting the best alternative based on a given objective function (Chong & Zak, 2013). The area of optimization has received much attention in recent years, particularly in the field of computer science, including the development of user-friendly software and high-performance processors, and it has been applied to various high-complexity problems by providing efficient solutions from among all feasible solutions. For instance, in the Global Positioning System (GPS), optimization guides the driver to the destination by discovering and providing the best possible route. In general, the area of optimization can be divided into two main categories: continuous optimization and combinatorial optimization (Blum & Roli, 2003). Both types are discussed in the following subsections.
2.2.1 Continuous Optimization
Continuous optimization is a branch of optimization in applied mathematics and is the counterpart of discrete, or combinatorial, optimization. In continuous optimization the variables are allowed to take on any values, usually real numbers (Gould, 2006). Continuous optimization can thus be defined as finding the minimum or maximum value of a function of one or many real variables subject to constraints, where the constraints are usually in the form of equations or inequalities. This distinguishes continuous optimization from combinatorial optimization, in which the variables may be binary, integer, or abstract objects drawn from finite sets of many elements.
According to Gould (2006) and Saleh (2014), an optimization problem is the minimization or maximization of an objective function f over a vector of variables x, subject to a vector of constraints c that the variables in x must satisfy. An optimization problem can be derived as in equation (2.1):

    min_{x ∈ ℝ^n} f(x)
    subject to   c_i(x) = 0,   1 ≤ i ≤ k
                 c_i(x) ≤ 0,   k < i ≤ m                                    (2.1)
where x = (x_1, x_2, ..., x_n) is the vector of the n variables of the problem, f : ℝ^n → ℝ is the objective function to be minimized, {c_i(x) = 0 | 1 ≤ i ≤ k} are the equality constraints over the variables in x, and {c_i(x) ≤ 0 | k < i ≤ m} are the inequality constraints. By this convention, the standard form defines a minimization problem; a maximization problem can be treated by negating the objective function f to −f. In real-world scenarios, continuous optimization has been applied in many areas, such as optimizing the high-pressure gas network in the National Grid Gas National Transmission System (NTS) in the United Kingdom (UK), and optimizing electrical power scheduling by minimizing cost through controlling the flow of current (Gould, 2006). However, the optimization problem in this research does not belong to this category; instead, it is a combinatorial optimization problem, which is discussed in the next section.
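For illustration only, the sketch below solves a small instance of the form in equation (2.1) using SciPy; the toy objective and constraints are assumptions made for the example and are unrelated to the gas-network and power-scheduling applications cited above.

```python
# A minimal sketch of the continuous-optimization form in equation (2.1),
# using SciPy (an assumption for the example; SciPy is not part of the thesis).
# Minimize f(x) = (x1 - 1)^2 + (x2 - 2)^2
# subject to x1 + x2 - 2 = 0  (equality)  and  x1 - 3 <= 0  (inequality).
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

constraints = [
    {"type": "eq", "fun": lambda x: x[0] + x[1] - 2.0},   # c1(x) = 0
    # SciPy expects inequality constraints as fun(x) >= 0, so the convention
    # c(x) <= 0 from equation (2.1) is passed as -c(x) >= 0.
    {"type": "ineq", "fun": lambda x: -(x[0] - 3.0)},
]

result = minimize(f, x0=np.zeros(2), constraints=constraints)
print(result.x, result.fun)   # optimal x and the minimized objective value
```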
2.2.2 Combinatorial Optimization
The field of optimization is a rapidly growing research field concerned with the choice of optimal values for a set of variables in order to achieve an objective. In applied mathematics and theoretical computer science, combinatorial optimization is the topic of finding an optimal object from a finite set of objects (Schrijver, 2003). Cook et al. (2009) described combinatorial optimization as a combination of combinatorics, linear programming and the theory of algorithms for solving optimization problems over discrete structures. Furthermore, Luke (2012) noted that in a combinatorial optimization problem the solution consists of a combination of unique components selected from a typically finite set. In short, the ultimate objective is to find the optimal combination of components. According to Blum & Roli (2003), a combinatorial optimization problem P = (S, f) can be defined by a set of variables X = {x_1, x_2, ..., x_n}, variable domains D_1, D_2, ..., D_n, constraints among the variables, and an objective function f to be minimized, where f : D_1 × D_2 × ... × D_n → ℝ+. To maximize, one can simply negate the objective function f to −f. The set of all possible feasible assignments is:

    S = { s = {(x_1, v_1), ..., (x_n, v_n)} | v_i ∈ D_i, s satisfies all the constraints }      (2.2)
where S denotes the search space or solution space. Each element of the set can be treated as a possible solution to the problem. An optimal solution s* ∈ S is one with minimum objective function value, that is, f(s*) ≤ f(s) for all s ∈ S. Thus s* is called a global optimal solution of (S, f), and S* ⊆ S is the set of globally optimal solutions (Blum & Roli, 2003). One example of a combinatorial optimization problem is the knapsack problem, whose purpose is to fill a knapsack with items of the highest total value without overfilling it (Luke, 2012). Another ubiquitous example is the vehicle routing problem (VRP), proposed by Dantzig & Ramser (1959). The aim of this problem is to determine a set of least-cost routes such that vehicles depart from the depot and each city's demand is served exactly once by exactly one vehicle, all routes start and end at the same depot, and the total demand on each route does not exceed the vehicle capacity. The VRP is illustrated in Figure 2.1. Besides the VRP, another common combinatorial optimization problem is the travelling salesman problem (TSP), which is covered in the next section.
Figure 2.1: Illustration of the vehicle routing problem. (Source: http://neo.lcc.uma.es/)
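To give a concrete feel for the knapsack example mentioned above, the following minimal brute-force sketch enumerates every subset of items; the item values, weights and capacity are illustrative assumptions only.

```python
# A minimal brute-force sketch of the 0/1 knapsack problem: every subset of
# items is a candidate solution, so the search space is finite but grows
# exponentially (2^n subsets). The item data are illustrative only.
from itertools import combinations

values = [60, 100, 120, 40]
weights = [10, 20, 30, 15]
capacity = 50

best_value, best_subset = 0, ()
for r in range(len(values) + 1):
    for subset in combinations(range(len(values)), r):
        weight = sum(weights[i] for i in subset)
        value = sum(values[i] for i in subset)
        if weight <= capacity and value > best_value:
            best_value, best_subset = value, subset

print(best_subset, best_value)   # indices of the chosen items and their total value
```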
2.3 Travelling Salesman Problem (TSP)
The TSP is an NP-hard problem that has been widely studied in the field of combinatorial optimization (Yan et al., 2012). It was first formulated by Karl Menger in 1930 (Maredia, 2010; Singh & Lodhi, 2013), and the name "Travelling Salesman Problem" was introduced by Hassler Whitney at Princeton University in 1934 (Alexander, 2005). Figure 2.2 illustrates the TSP. The TSP can be described as follows: a salesman who desires to visit n cities must find the shortest Hamiltonian tour that visits all the cities exactly once and finally returns to the city where he started. In 1954, the TSP was formulated as an integer program and solved using the cutting plane method (Dantzig et al., 1954). Later, the TSP was shown to be NP-hard due to the computational complexity of finding the optimal tour (Karp, 1972). Because the problem is computationally difficult, a large number of heuristic and exact methods have been proposed to provide optimal or near-optimal solutions (Applegate et al., 2011).
Figure 2.2: Illustration of the Travelling Salesman Problem. (Source: http://www.pixbam.com/germany-map/file:blank-map-germanystates./2389)
According to Matai et al. (2010), the number of feasible solutions of the TSP is (n−1)!/2, where n represents the number of cities. The TSP can be represented on a complete undirected graph G = (V, E), where V = {1, ..., n} denotes the vertices (nodes or cities), E = {(i, j) : i, j ∈ V, i < j} is an edge set, and A = {(i, j) : i, j ∈ V, i ≠ j} is an arc set. A non-negative distance matrix D = (d_ij) is defined on E or on A. In particular, in planar problems the vertices are points P_i = (X_i, Y_i) in the plane, and d_ij = √((X_i − X_j)² + (Y_i − Y_j)²) is the Euclidean distance. The triangle inequality is also satisfied if d_ij is the length of a shortest path from i to j on G. The TSP is not only applied to route-planning issues; it is applied in many of today's industries. For example, it is used in global navigation satellite system (GNSS) surveying networks to determine the geographical positions of unknown points on and above the earth using satellite equipment (Saleh & Chelouah, 2004). In addition, Wakabayashi et al. (2014) employed the TSP to determine the optimum location of the central post office in Bangkok. Moreover, the TSP has been applied in logistics practice, such as the distribution of food products from producers to shops, the distribution of fuel to petrol stations, and the distribution of various products from producers or distributors to customers (Filip & Otakar, 2011).
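To make the notation above concrete, the following minimal sketch builds the Euclidean distance matrix d_ij for a handful of illustrative coordinates and evaluates the length of one closed tour; the coordinates are assumptions for the example only.

```python
# A minimal sketch of the symmetric Euclidean distance matrix d_ij and the
# closed-loop tour length used by the TSP (the city coordinates are illustrative).
import math

cities = [(0.0, 0.0), (3.0, 4.0), (6.0, 0.0), (3.0, -4.0)]
n = len(cities)

# d_ij = sqrt((X_i - X_j)^2 + (Y_i - Y_j)^2)
d = [[math.hypot(cities[i][0] - cities[j][0], cities[i][1] - cities[j][1])
      for j in range(n)] for i in range(n)]

def closed_tour_length(tour):
    # Sum of consecutive edges plus the edge returning to the starting city.
    return sum(d[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

print(closed_tour_length([0, 1, 2, 3]))   # length of one of the (n-1)!/2 feasible tours
```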
2.3.1 Variants of TSP
The traditional TSP consists of a complete closed loop that visits all the given cities once and returns to its starting point. Due to the different scenarios that arise today, the TSP has been modelled into different variants. These variants can be classified into two main categories: the closed loop travelling salesman problem (TSP) and the open loop travelling salesman problem (OTSP). Both problems carry the same objective, which is to find the minimum tour length; the only difference between them lies in the starting and ending points. The TSP has the same starting and ending point, while the OTSP has different starting and ending points (Wang et al., 2013). Apart from that, the TSP and OTSP can each be divided into two sub-categories: single depot multiple salesmen (SDMS) and multiple depots multiple salesmen (MDMS) (Tang et al., 2000; Nallusamy, 2013; Wang et al., 2013). For the TSP, the sub-categories are known as SDMS-TSP and MDMS-TSP; for the OTSP, they are known as SDMS-OTSP and MDMS-OTSP. Both SDMS and MDMS involve more than one salesman operating at the same time; the difference between them is the number of depots. The variants of the TSP are shown in Figure 2.3. However, this study focuses only on the OTSP, because research on the OTSP is still limited compared with research on the traditional TSP (Wang & Hou, 2013; Vashisht, 2013). Meanwhile, the TSP may not reflect today's real-life transportation scenarios, in which many vehicles start from one location and end at another. The OTSP is discussed further in the next section.
Figure 2.3: Variants of TSP. (Tree diagram: Closed Loop TSP and Open Loop TSP, each divided into single-depot multiple-salesman (SDMS-TSP, SDMS-OTSP) and multiple-depot multiple-salesman (MDMS-TSP, MDMS-OTSP) sub-categories.)
2.3.2 Open Loop Travelling Salesman Problem (OTSP)
The OTSP can be modelled according to the real-life scenario of today's transportation services. Its purpose is to find the minimum total distance travelled by a vehicle from a starting point to an ending point while visiting all the given cities exactly once. According to Čičková et al. (2013), the OTSP can be defined as follows: n refers to the number of nodes; the indices i and j refer to customers and take values between 2 and n, while index i = 1 refers to the depot; d_ij is the distance between i and j, where i, j = 1, 2, ..., n; the binary variables x_ij, i, j = 1, 2, ..., n, take the value x_ij = 1 if customer i precedes customer j in the route of the vehicle and x_ij = 0 otherwise; and the variables u_i, i = 2, 3, ..., n, are based on the well-known Tucker formulation of the TSP (Miller et al., 1960). The OTSP is then formulated as:

    min Σ_{i=1}^{n} Σ_{j=1}^{n} d_ij x_ij                                  (2.3)

subject to

    Σ_{i=1}^{n} x_ij = 1,          j = 2, 3, ..., n,  i ≠ j               (2.4)

    Σ_{j=2}^{n} x_ij ≤ 1,          i = 1, 2, ..., n,  i ≠ j               (2.5)

    Σ_{j=2}^{n} x_1j = 1                                                  (2.6)

    u_i − u_j + n·x_ij ≤ n − 1,    i, j = 2, 3, ..., n,  i ≠ j            (2.7)

    x_ij ∈ {0, 1},                 i, j = 1, 2, ..., n,  i ≠ j            (2.8)
The objective function (2.3) expresses the minimization of the total distance of the vehicle route; (2.4) is the standard constraint ensuring that the vehicle visits every customer; (2.5) ensures that the vehicle does not need to depart from every customer, because the route ends after serving the last one; (2.6) ensures that the vehicle starts its route exactly once; (2.7) is the sub-tour elimination constraint; and (2.8) is the integrality constraint.

Recently, Vashisht (2013) implemented GA for the OTSP and showed that GA is suitable for solving it. However, the author also noted that GA has difficulty maintaining the optimal solution over many generations, and suggested that better crossover or mutation operators might be found and implemented to generate better solutions. Meanwhile, Wang & Hou (2013) employed a Simple Model (SModel) on the multi-depot OTSP to determine the best number of salesmen with nearly minimum total distance; the reported performance was excellent, with nearly minimal total distances. Although the TSP and several of its extensions (e.g. time windows) have been applied successfully to today's route planning problems, only a limited number of studies have considered the OTSP. The OTSP can still be modified and applied to today's vehicle routing problems to provide better solutions for a single vehicle travelling between a given source and its destination; this is especially beneficial for logistics transportation routing such as merchandise delivery. Therefore, this study proposes another new variant of the OTSP.
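As a small illustration of how a concrete route maps onto the decision variables above, the sketch below sets x_ij = 1 for each arc used by an example open route and evaluates the objective of equation (2.3); the distance matrix and route are illustrative assumptions, not data from this study.

```python
# Minimal sketch linking a vehicle route to the binary variables x_ij of the
# OTSP formulation: x_ij = 1 when city j is visited immediately after city i,
# and the objective (2.3) is sum(d_ij * x_ij). Data are illustrative only.
route = [0, 2, 1, 3]          # open route: starts at city 0, ends at city 3
m = 4
d = [[0, 5, 2, 7],
     [5, 0, 4, 3],
     [2, 4, 0, 6],
     [7, 3, 6, 0]]

x = [[0] * m for _ in range(m)]
for i, j in zip(route, route[1:]):
    x[i][j] = 1               # arc (i, j) is used by the route

total_distance = sum(d[i][j] * x[i][j] for i in range(m) for j in range(m))
print(total_distance)         # value of the objective (2.3) for this route
```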
2.3.3 n-Cities Open Loop Travelling Salesman Problem (nOTSP)
The new variant of the OTSP proposed in this study is named the n-Cities Open Loop Travelling Salesman Problem (nOTSP). In the nOTSP, the salesman departs from the starting city towards another city without being required to visit all the given m cities; instead, he is restrained to visiting only n cities with the minimum total distance. This problem was inspired by and modelled on real-life transportation problems. For example, in the logistics transportation routing of merchandise delivery, the delivery starts from the depot and ends at the destination without passing through all the cities; only a limited number of cities is visited. Figure 2.4 illustrates the difference between the OTSP and the nOTSP: Figure 2.4 (a) shows the path of a vehicle that travels from a starting point to the destination and is required to visit all the cities in the OTSP, while Figure 2.4 (b) shows the path of a vehicle that travels from the starting point to the destination without being required to visit all the cities in the nOTSP. The formulation of the nOTSP is nearly the same as the formulation of the OTSP. In the OTSP, the number of given cities m is equal to the number of cities n visited by the salesman, that is, n = m. In the nOTSP, the number of cities to be visited n is not equal to the total number of cities m given to the salesman, so n ≠ m. In the past, nature-inspired metaheuristic algorithms have been highlighted as effective and efficient in solving combinatorial problems like the TSP. Examples include Tabu Search (TS) (Pedro et al., 2013), Particle Swarm Optimization (PSO) (Gao et al., 2012), Ant Colony Optimization (ACO) (Hlaing & Khine, 2011), the Genetic Algorithm (GA) (Nagata & Soler, 2012) and Simulated Annealing (SA) (Wang et al., 2013), which have been applied to the TSP and its variants.
Although many algorithms can be applied to the TSP and its variants, one of the best metaheuristic algorithms is GA (Abdoun & Abouchabaka, 2012; Vashisht, 2013; Singh & Lodhi, 2013). The major reasons are its flexibility, robustness and versatility, which have been widely studied for solving combinatorial and optimization problems, for example in Singh & Lodhi (2013) and Singh & Singh (2014). In addition, Philip et al. (2011) stated that GA is a very good local search algorithm for solving the TSP, generating a preset number of random tours and then improving the population until its stopping condition is met. Moreover, Vashisht (2013) also stated that GA is suitable for solving the TSP because it does not need to explore every possible solution in the feasible region in order to obtain a good result. Hence, GA is discussed in the next section.
Figure 2.4: The difference between OTSP and nOTSP. (a) Classic Open Loop Travelling Salesman Problem (OTSP), (b) n-Cities Open Loop Travelling Salesman Problem (nOTSP).
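The sketch below illustrates one candidate solution of the nOTSP as just defined: from m = 50 randomly generated cities, only n = 10 are visited on an open path between a fixed start and end. The coordinates, the choice of start and end nodes, and the assumption that the start and end count towards n are illustrative only, not the exact construction used in this thesis (Chapter 3 describes how the topology is actually built).

```python
# A minimal sketch of one nOTSP candidate solution: out of m given cities the
# salesman visits only n of them, travelling from a fixed start to a fixed end
# without returning. Coordinates and parameter choices are illustrative.
import math, random

m, n = 50, 10
random.seed(1)
cities = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(m)]
start, end = 0, 1                              # fixed starting and ending nodes

def open_path_length(path):
    # Open loop: no edge back from the last city to the first one.
    return sum(math.dist(cities[a], cities[b]) for a, b in zip(path, path[1:]))

# One candidate: start, then n - 2 intermediate cities in some order, then end.
intermediate = random.sample([c for c in range(m) if c not in (start, end)], n - 2)
candidate = [start] + intermediate + [end]
print(candidate, open_path_length(candidate))
```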
2.4 Genetic Algorithm
GA is a population-based metaheuristic algorithm that belongs to the larger class of evolutionary algorithms (EAs). It was invented in the 1970s by John Holland (Holland, 1975). GA is a type of randomized search technique based on natural selection and survival of the fittest chromosomes (Albayrak & Allahverdi, 2011; Bahaabadi et al., 2012). Each chromosome is formed by genes, and the set of chromosomes is known as the "population". The GA process starts by generating a random population; following the principles of natural selection, each chromosome in the population is evaluated to determine the potential chromosomes. The potential chromosomes are selected for a recombination process to produce new chromosomes that replace the poorer chromosomes (Sallabi & El-Haddad, 2009). In this way, better chromosomes are produced in each new generation. The process continues for many generations until a stopping condition is met. The following pseudocode describes the process of GA.
A Genetic Algorithm Pseudocode
Step 1: Choose an initial random population of individuals, p.
Step 2: Evaluate the fitness of the individuals, f.
Step 3: repeat
Step 4:     Select the best individuals to be used by the genetic operators.
Step 5:     Generate new individuals using crossover and mutation operators.
Step 6:     Evaluate the fitness of the new individuals.
Step 7:     Replace the worst individuals of the population by the best new individuals.
Step 8: until some stop criterion is met.

Algorithm 2.1: Genetic Algorithm (Goldberg & Holland, 1988)
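A minimal, runnable sketch of this generational loop for a permutation-encoded closed-loop TSP is given below; the instance size, the use of pairwise swap mutation alone, and the truncation-style selection and replacement are simplifying assumptions for illustration, not the exact operators or settings studied in this thesis.

```python
# A minimal, runnable sketch of Algorithm 2.1 for a permutation-encoded TSP
# (closed tour). Instance, operators and parameters are illustrative only.
import math, random

random.seed(0)
cities = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]

def tour_length(tour):
    return sum(math.dist(cities[tour[k]], cities[tour[(k + 1) % len(tour)]])
               for k in range(len(tour)))

def mutate(tour):
    # Pairwise swap mutation: exchange two randomly chosen cities (Section 2.4.2.3).
    a, b = random.sample(range(len(tour)), 2)
    child = tour[:]
    child[a], child[b] = child[b], child[a]
    return child

# Steps 1-2: initial random population; fitness = tour length (shorter is fitter).
population = [random.sample(range(len(cities)), len(cities)) for _ in range(50)]
for generation in range(200):                       # Step 8: simple stop criterion
    population.sort(key=tour_length)                # Step 4: rank the individuals
    parents = population[:25]                       # keep the better half
    children = [mutate(random.choice(parents)) for _ in range(25)]  # Step 5
    population = parents + children                 # Step 7: replace the worst

print(tour_length(min(population, key=tour_length)))  # best tour length found
```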
In the past decades, GA has been successfully applied in many areas. For example, GA was applied in the area of classification for the identification of genes of similar function from a gene expression time series (To & Vohradsky, 2007). GA has also been applied in Airline Revenue Management (ARM) to maximize airline revenue (George et al., 2012), and in control engineering to control seismic vibration in a nonlinear multi-damper configuration (Patrascu, 2015).

However, GAs often suffer from premature convergence caused by the loss of genetic diversity in the population (Malik & Wadhwa, 2014). Gupta & Ghafir (2012) considered insufficient genetic diversity to be the major cause of premature convergence in GA. Insufficient genetic diversity tends to lead the solutions to converge towards local optima or even arbitrary points rather than towards the global optimum (Ghosh, 2012). This phenomenon occurs when the genetic operators can no longer produce offspring that perform better than their parents. In other words, sufficient genetic diversity in the population allows the algorithm to continue searching for better solutions, avoiding being trapped at local optima and becoming stagnant (Gupta & Ghafir, 2012). Hence, in order to avoid premature convergence, the genetic diversity must be preserved.
2.4.1 Genetic Operators
GA maintains genetic diversity and combines existing chromosomes with others through mechanisms called genetic operators, such as encoding, selection, crossover and mutation. Each operator has its own purpose and responsibility. The first operator in GA is the encoding operator, whose purpose is to transform a problem solution into a chromosome, also called a gene sequence. There are many encoding techniques, such as binary encoding, permutation encoding, value encoding and tree encoding, which can be applied according to the model of the problem (Malhotra et al., 2011). Since every chromosome is a string of numbers in a sequence, permutation encoding is the best encoding for ordering or queuing problems such as the TSP (Malhotra et al., 2011). The second operator in GA is the selection operator, whose role is to select chromosomes from the population based on their fitness (Geetha et al., 2009; Malhotra et al., 2011). Individuals that are nearer to the solution have a higher chance of being selected. There are several types of selection operators, such as roulette wheel selection, proportional selection, ranking selection, tournament selection, range selection, gender-specific selection (GACD) and GR-based selection (Sivaraj & Ravichandran, 2011). The individuals that have been selected are moved to the mating pool, while the remaining unselected individuals are eliminated. The mating pool is where the selected chromosomes (parents) undergo the recombination (mating) process to produce new children (new chromosomes or offspring) (Geetha et al., 2009). The crossover operator is applied at this stage in the expectation that better offspring will be produced from the parents; examples of crossover techniques are single-point crossover, two-point crossover, multi-point crossover, uniform crossover and three-parent crossover (Geetha et al., 2009). Lastly, the new chromosomes are brought to the mutation operator, which manipulates and reallocates the genes in the chromosome in the hope of producing better chromosomes, i.e. solutions that are closer to the optimum. Holland (1975) underlined that the role of mutation is to guarantee that the algorithm is not trapped in a local optimum while at the same time introducing diversity. This view was also supported by Sallabi & El-Haddad (2009) and Negnevitsky (2011) from the perspective of algorithmic functioning: the purpose of mutation is to prevent the algorithm from being trapped in a local minimum and to avoid the loss of genetic diversity. Therefore, implementing the mutation operator in the algorithm is crucial in order to prevent the loss of diversity and to avoid being trapped in local optima. Examples of mutation techniques include flipping mutation, interchange mutation, boundary mutation and reversing mutation. More details regarding the mutation operators are presented in the next section.
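Among the selection operators named above, roulette-wheel selection is perhaps the most commonly described. The following minimal sketch, with illustrative fitness values, selects individuals with probability proportional to fitness in order to fill a mating pool; it is an example, not the selection scheme used later in this thesis.

```python
# A minimal sketch of roulette-wheel selection: each individual is chosen with
# probability proportional to its fitness. Fitness values are illustrative.
import random

def roulette_wheel_select(population, fitness_values):
    total = sum(fitness_values)
    pick = random.uniform(0.0, total)
    cumulative = 0.0
    for individual, fitness in zip(population, fitness_values):
        cumulative += fitness
        if pick <= cumulative:
            return individual
    return population[-1]   # guard against floating-point round-off

population = ["A", "B", "C", "D"]
fitness_values = [1.0, 4.0, 3.0, 2.0]        # "B" is selected most often
mating_pool = [roulette_wheel_select(population, fitness_values) for _ in range(10)]
print(mating_pool)
```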
2.4.2 Mutation Operators
The quality of a GA solution relies on two important operators, the crossover and mutation operators. The purpose of the crossover operator is to exploit the current solutions in order to find better ones, while the role of mutation is to maintain genetic diversity in order to prevent the algorithm from being trapped in a local optimum and to prevent the population of chromosomes from becoming too similar to each other (Sivanandam & Deepa, 2007). There have been numerous debates among researchers on the "usefulness" and relative roles of crossover and mutation (Senaratna, 2005), and opinion is divided over the importance of crossover versus mutation. On one hand, Holland (1975) claimed that the crossover operator is more important than the mutation operator. On the other hand, some scholars believe that the role of mutation is more significant than that of crossover. Fogel & Atmar (1990) and Fogel (1990, 1993, 2006) made a strong claim that crossover has no general advantage over mutation, since mutation can also do what crossover does; Fogel further stated that mutation alone can do everything and is very useful in function optimization tasks (Fogel & Atmar, 1990; Fogel, 1993; 2006). Later, Sivanandam & Deepa (2007) revealed that applying the crossover operator in GA to solve the TSP does not produce good solutions in terms of overall performance. In addition, Thibert-Plante & Charbonneau (2007) found that crossover was not particularly helpful in producing better solutions, and Zheng et al. (2010) discovered the importance of mutation: without mutation, GA tends to converge prematurely. A few common mutation operators, namely inversion mutation, displacement mutation and pairwise swap mutation, are usually implemented in GA to solve the TSP. These three mutations were used in the work of Albayrak & Allahverdi (2011) and Singh & Lodhi (2013) to optimize the TSP in terms of finding the shortest tour. Therefore, these three mutation operators are used in the proposed algorithm; their details are explained in the following sections.
2.4.2.1 Inversion Mutation
The inversion mutation inverts the substring between two selected cities. Figure 2.5 illustrates the concept. Suppose the two selected cities are city 9 and city 2; the substring is then (9 3 7 4 6 2). After the inversion mutation is performed, the substring (9 3 7 4 6 2) is inverted and becomes (2 6 4 7 3 9).

Before mutation: 1 5 9 3 7 4 6 2 8 0
After mutation:  1 5 2 6 4 7 3 9 8 0

Figure 2.5: Before and after the inversion mutation was performed.
2.4.2.2 Displacement Mutation
The displacement mutation pulls the first selected gene out of the string and reinserts it at a different place, sliding the intermediate substring down to form a new string. In this case, city 9 is taken out of the tour and placed behind city 2, while the substring (3 7 4 6 2) is slid down to fill the empty space, as shown in Figure 2.6.

Before mutation: 1 5 9 3 7 4 6 2 8 0
After mutation:  1 5 3 7 4 6 2 9 8 0

Figure 2.6: Before and after the displacement mutation was performed.
2.4.2.3 Pairwise Swap Mutation
In the pairwise swap mutation, the genes at two randomly chosen positions are swapped. This technique is sometimes also called interchange mutation or random swap (Sallabi & El-Haddad, 2009). In this case, the locations of city 9 and city 2 are swapped, as shown in Figure 2.7.

Before mutation: 1 5 9 3 7 4 6 2 8 0
After mutation:  1 5 2 3 7 4 6 9 8 0

Figure 2.7: Before and after the pairwise swap mutation was performed.
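The three operators can be summarised in a few lines of code. The sketch below reproduces the example of Figures 2.5-2.7 on the tour 1 5 9 3 7 4 6 2 8 0, with the positions of city 9 and city 2 passed in explicitly; in the GA itself these positions would be chosen at random.

```python
# Minimal sketches of the three mutation operators described above, applied to
# the example tour from Figures 2.5-2.7 (cities 9 and 2 are the chosen points).

def inversion_mutation(tour, i, j):
    # Reverse the substring between the two selected positions (inclusive).
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def displacement_mutation(tour, i, j):
    # Remove the gene at position i and reinsert it so it occupies position j.
    gene = tour[i]
    rest = tour[:i] + tour[i + 1:]
    return rest[:j] + [gene] + rest[j:]

def pairwise_swap_mutation(tour, i, j):
    # Exchange the genes at the two selected positions.
    child = tour[:]
    child[i], child[j] = child[j], child[i]
    return child

tour = [1, 5, 9, 3, 7, 4, 6, 2, 8, 0]
print(inversion_mutation(tour, 2, 7))      # [1, 5, 2, 6, 4, 7, 3, 9, 8, 0] (Figure 2.5)
print(displacement_mutation(tour, 2, 7))   # [1, 5, 3, 7, 4, 6, 2, 9, 8, 0] (Figure 2.6)
print(pairwise_swap_mutation(tour, 2, 7))  # [1, 5, 2, 3, 7, 4, 6, 9, 8, 0] (Figure 2.7)
```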
2.4.3 Application of GAs on TSPs
Much research has proven that GA and its hybrid variants have the potential to solve TSPs. Reviews of the related literature are detailed in Table 2.1.
Table 2.1: Related literature on GAs for TSPs.

Sallabi & El-Haddad (2009). Problem domain: TSP. Technique: Improved Genetic Algorithm (IGA). Operators: Swapped Inverted Crossover (SIC); multi mutation. Performance measurement: shortest distance. Results: IGA can effectively solve the TSP; the total distance found by IGA is near-optimal.

Khan et al. (2009). Problem domain: symmetric TSP (STSP) and asymmetric TSP (ATSP). Technique: GA. Operators: OR crossover; inversion mutation. Performance measurement: shortest distance. Results: GA can give near-optimal solutions for both STSP and ATSP.

Yang et al. (2013). Problem domain: TSP. Technique: GA. Operators: two-point crossover; pairwise swap mutation. Performance measurement: shortest distance. Results: GA successfully discovers the shortest tour when compared with other algorithms such as SA, ACO and PSO.

Arya et al. (2014). Problem domain: Multiple Travelling Salesmen Problem (MTSP). Technique: GA. Operators: one-point crossover; inversion mutation and pairwise swap mutation. Performance measurements: shortest distance and computational time. Results: the proposed GA produced better results and the run time was also optimized; this GA is suitable for large-size problems.

Liu (2014). Problem domain: TSP. Technique: GA. Operators: Edge-Swapping (ES) crossover; no mutation. Performance measurement: shortest distance. Results: the proposed GA found the optimal or best-known solutions for most benchmark instances and reduces computational cost.

Yao (2014). Problem domain: TSP. Technique: GA + PSO. Operators: Edge-Swapping (ES) crossover; inversion mutation and pairwise swap mutation. Performance measurements: shortest distance and convergence rate. Results: the proposed algorithm overcomes the low convergence rate and local-optimum drawbacks of PSO.

Chen & Chien (2011a). Problem domain: TSP. Technique: GA + SA + ACS + PSO. Operators: bone crossover and two-point crossover; simulated annealing mutation and pheromone mutation. Performance measurement: shortest distance. Results: the proposed algorithm generates better average tour lengths and smaller percentage deviations compared to previous studies.

Chen & Chien (2011b). Problem domain: TSP. Technique: GA + ACS. Operators: bone crossover; route mutation and pheromone mutation. Performance measurements: shortest distance and convergence rate. Results: the proposed algorithm generates better average tour lengths compared to previous studies.

Zhang & Lu (2012). Problem domain: TSP. Technique: GA + ACO. Operators: single-point crossover; reversal mutation. Performance measurements: shortest distance and convergence rate. Results: the proposed algorithm has higher convergence speed, stability and global optimization ability.

Dong et al. (2012). Problem domain: TSP. Technique: GA + AS. Operators: single-point crossover; reversal mutation. Performance measurements: shortest distance and convergence rate. Results: the proposed algorithm has superior performance for solving TSPs in terms of capability and consistency in achieving the global optimal solution and the quality of average optimal solutions, particularly for small TSPs.
From the reviews above, GAs have proved their suitability for solving TSPs in terms of finding the shortest path. Apart from finding the shortest path, the reviews also reveal that researchers have integrated GA with swarm-based algorithms for the purpose of improving the convergence rate of the algorithm. This can be clearly seen in the work of Yao (2014), Chen & Chien (2011a; 2011b), Zhang & Lu (2012) and Dong et al. (2012): the approach of integrating GA with swarm-based algorithms has the potential to alleviate premature convergence. Therefore, this study also intends to integrate a swarm-based algorithm into GA to overcome its drawback of insufficient genetic diversity, which causes premature convergence.
2.5
Swarm Intelligence
Swarm intelligence is a sub-field of evolutionary computing. "Swarm" is a term often used to describe a huge number of homogeneous living creatures or organisms moving without central control (Ahmed & Glasgow, 2012), for example colonies of ants and bees, flocks of birds or schools of fish. In recent years, swarm-based algorithms have been chosen and successfully applied in many areas to solve highly complex problems by producing sets of effective solutions (Blum & Merkle, 2008; Hiot, 2010). The expression "swarm intelligence" has been used since 1989, when it was first introduced by G. Beni and J. Wang in the context of cellular robotic systems (Beni & Wang, 1989). Swarm intelligence can be defined as an efficient computational model in the artificial intelligence (AI) field inspired by the collective behaviours of swarms of homogeneous living creatures, such as self-organization, decentralized control and communication (Blum & Merkle, 2008; Mishra et al., 2013). The first swarm intelligence model was the ACO, introduced by Dorigo et al. (1991; 1992; 2006). Subsequently, more models were developed, such as Particle Swarm Optimization (PSO) (Eberhart & Kennedy, 1995), the Artificial Bee Colony algorithm (ABC) (Karaboga, 2005) and the Bat algorithm (BA) (Yang, 2010). All of these algorithms have also been applied to solve TSPs in the past and proved to have good performance (Goldbarg et al., 2008; Li et al., 2011; Ouaarab et al., 2014).
Another new swarm intelligence algorithm has also been proposed, called Simplified Swarm Optimization (SSO) (Bae et al., 2012). The details of SSO are discussed in the next section.
2.6
Simplified Swarm Optimization
Simplified Swarm Optimization (SSO) is a variant of PSO. It was developed by Yeh (2009) based on the idea of traditional PSO in order to address the inability of PSO to solve discrete problems (Yeh & Liu, 2008). In that research, he found that PSO tends to suffer from premature convergence, especially on high-dimensional multimodal problems; as the number of iterations increases, the convergence speed decreases. Therefore, SSO was proposed to improve the performance and overcome the drawbacks of PSO (Bae et al., 2012). The experimental results showed that SSO has a better convergence rate and better accuracy, and is much simpler, more efficient and more flexible than PSO.
2.6.1
Algorithmic Structure and Flow
In PSO, the particles manipulate four elements: the position, the velocity, the personal best position and the global best position (or the local best solution for neighbourhood-based PSO). According to Ahmed & Glasgow (2012), the equations for updating the velocity and position of a particle are as follows:
$$v_i^{t+1} = w\,v_i^t + \alpha\, r_1\,(pbest_i^t - x_i^t) + \beta\, r_2\,(gbest_i^t - x_i^t) \qquad (2.9)$$

$$x_i^{t+1} = x_i^t + v_i^{t+1} \qquad (2.10)$$
In equations 2.9 and 2.10, w represents the inertia weight, which determines how much velocity is retained from the previous step; as this value increases, the global search ability also increases. Typically it is initialized to 1.0 and slowly reduced over the iterations of the algorithm. Meanwhile, $v_i^t$ and $x_i^t$ represent the particle's velocity and position respectively. Parameters α and β denote the personal and population experiences (the learning factors) of the particle while exploring the search space (Ahmed & Glasgow, 2012). Higher values of α and β make particles more likely to explore undiscovered regions of the n-dimensional search space; however, without a proper balance there is a risk that the particles will diverge (Parsopoulos & Vrahatis, 2010). Furthermore, $r_1$ and $r_2$ are randomly generated values between 0 and 1, while pbest and gbest represent the personal best position and global best position of the particles. Lastly, t and i denote the iteration counter and the particle index. Algorithm 2.2 describes the process of PSO.
A Particle Swarm Optimization Pseudocode
Step 1: Initialize the position $x_i^t$ and velocity $v_i^t$ of each particle in the population randomly.
Step 2: Calculate the fitness value of each particle.
Step 3: Calculate the pbest and gbest for each particle.
Step 4: Do
Step 5: Update the velocity of each particle using equation 2.9.
Step 6: Update the position of each particle using equation 2.10.
Step 7: Calculate the fitness value of each particle.
Step 8: Update the pbest of each particle if its current fitness value is better than its pbest.
Step 9: Update the gbest of each particle, i.e., choose the position of the particle with the best fitness value within its neighbourhood topology.
Step 10: While the termination criterion is not attained.

Algorithm 2.2: Particle Swarm Optimization (Kennedy & Eberhart, 1995).
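As an illustration of equations 2.9 and 2.10 and Algorithm 2.2, the following minimal MATLAB sketch shows the velocity and position updates; the objective function, parameter values and stopping rule are illustrative assumptions rather than settings taken from this thesis.

```matlab
% Minimal PSO sketch following equations 2.9 and 2.10 (illustrative only).
nParticles = 30; nDim = 2; maxIter = 100;
w = 1.0; alpha = 2.0; beta = 2.0;            % inertia weight and learning factors (assumed values)
x = rand(nParticles, nDim);                  % particle positions
v = zeros(nParticles, nDim);                 % particle velocities
fitFun = @(p) sum(p.^2, 2);                  % example objective to be minimised (assumption)
pbest = x; pbestVal = fitFun(x);             % personal best positions and values
[~, gIdx] = min(pbestVal);
gbest = pbest(gIdx, :);                      % global best position
for t = 1:maxIter
    r1 = rand(nParticles, nDim); r2 = rand(nParticles, nDim);
    v = w*v + alpha*r1.*(pbest - x) ...
            + beta*r2.*(repmat(gbest, nParticles, 1) - x);    % equation 2.9
    x = x + v;                                                % equation 2.10
    val = fitFun(x);
    better = val < pbestVal;                                  % Step 8: update pbest
    pbest(better, :) = x(better, :);  pbestVal(better) = val(better);
    [~, gIdx] = min(pbestVal);  gbest = pbest(gIdx, :);       % Step 9: update gbest
    w = 0.99 * w;                                             % slowly reduce the inertia weight
end
```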
SSO differs considerably from PSO in terms of its structure and algorithm. The main difference between the two algorithms lies in the particle velocity $v_i^t$: SSO eliminates the use of the particle velocity of equation 2.9 in PSO. In PSO, the velocity decides the next positions for continuous problems; however, it is neither trivial nor well defined for problems with discrete variables and sequencing problems (Yeh, 2009). In addition, the use of velocity in PSO increases the computational and time complexity. Computational results showed that, by eliminating the velocity, the algorithm performs well in terms of finding optimal solutions and improves accuracy. Equation 2.11 shows the update formula of SSO.
$$x_{id}^{t} = \begin{cases} x_{id}^{t-1}, & 0 \le R < C_w \\ p_{id}^{t-1}, & C_w \le R < C_p \\ g_{id}^{t-1}, & C_p \le R < C_g \\ x, & C_g \le R < 1 \end{cases} \qquad (2.11)$$

where $x_{id}$ represents the position of particle i in dimension d; Cw, Cp and Cg are predetermined constants whose values are always positive; $p_{id}^{t-1}$ and $g_{id}^{t-1}$ indicate the current best position (pbest) and the global best position (gbest) respectively; R is a random value generated by the random function (rand()), with a value between 0 and 1; and x represents a newly generated random value that replaces the particle's position. This unique mechanism gives SSO an extraordinary characteristic: the present best position (pbest) and global best position (gbest) are always updated based on the generated random values. This characteristic is known as the comparison strategy. In this research it is referred to as the solution update mechanism (SUM), because it works as a mechanism that keeps updating the solution once a better solution is found. The pseudocode of SSO is shown in Algorithm 2.3.

SSO has been adapted into various approaches to solve different problems in the past. Yeh et al. (2011) applied SSO to data classification for discovering breast cancer classification rules; the experimental results showed that SSO achieved the highest average accuracy compared with other data mining methods such as Decision Tree, Neural Network and Support Vector Machine. In the work of Bae et al. (2012), another SSO variant using an exchange local search strategy (SSO-ELS) was proposed, which improves and refines the search process for data mining and classification problems. In their experiment, SSO-ELS was first evaluated on 13 popular datasets from the University of California, Irvine (UCI) repository, and its performance was compared with SSO, PSO, PSO-ELS and three other popular data mining techniques. The results show that SSO-ELS attains the highest accuracy, above 94%, and outperforms the other algorithms on the 13 datasets. In addition, an SSO algorithm with a weighted local search strategy (SSO-WLS) was proposed by Chung & Wahid (2012) for network intrusion detection; its purpose is to discover a better solution in the neighbourhood of the current solution produced by SSO. The test results showed that SSO-WLS achieves a higher classification accuracy of 93.3% compared with the original SSO.
A Simplified Swarm Optimization Pseudocode
Step 1: Initialize the swarm size (m), the predetermined constants (Cw, Cp, Cg), the maximum generation (maxGen) and the maximum fitness value (maxFit).
Step 2: Generate and initialize the present best position (pbest) and the global best position (gbest) with random positions (x).
Step 3: Evaluate the fitness value of each particle according to the objective function.
Step 4: Update the pbest and gbest.
Step 5: Generate a random number R between 0 and 1.
Step 6: Apply the comparison strategy based on equation 2.11:
    If (0 ≤ R < Cw), keep the original value;
    Else if (Cw ≤ R < Cp), replace the original value $x_{id}$ by pbest;
    Else if (Cp ≤ R < Cg), replace the original value by gbest;
    Else if (Cg ≤ R < 1), generate a new value to replace the original value.
Step 7: When the termination criterion is met, terminate the program; otherwise, return to Step 3.
Algorithm 2.3: Simplified Swarm Optimization.
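The comparison strategy of equation 2.11 and Step 6 can be written compactly. The sketch below is a minimal MATLAB illustration for a single particle; the constant values Cw, Cp and Cg and the random starting vectors are arbitrary assumptions chosen only for demonstration.

```matlab
% Minimal sketch of the SSO update of equation 2.11 for one particle (illustrative only).
Cw = 0.15; Cp = 0.45; Cg = 0.85;             % predetermined constants (assumed values)
x = rand(1, 5);                              % current position of the particle
pbest = rand(1, 5);                          % present best position
gbest = rand(1, 5);                          % global best position
for d = 1:numel(x)
    R = rand;                                % random number between 0 and 1
    if R < Cw
        % keep the original value x(d)
    elseif R < Cp
        x(d) = pbest(d);                     % take the value from pbest
    elseif R < Cg
        x(d) = gbest(d);                     % take the value from gbest
    else
        x(d) = rand;                         % generate a new random value
    end
end
```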
Based on the findings of previous research, it is clear that SSO and its variants can bring significant improvements in classification accuracy on experimental databases. Since SSO achieves higher accuracy, this indicates that SSO has a better convergence rate and avoids solutions being trapped in local optima. Yet, to the best of our knowledge, SSO has not been applied to this kind of optimization problem. This shows that SSO still has numerous opportunities for application in other areas, such as the field of the TSP.
2.7
Chapter Summary
The TSP is one of the most widely studied combinatorial optimization problems; it describes how a salesman finds the shortest tour by visiting each city exactly once and returning to the starting city. However, real-life transportation scenarios are not always exactly as pictured in the TSP and the OTSP, where the vehicle is required to visit all of the given cities and return to its starting point. Therefore, another variant of the TSP, named the nOTSP, is proposed in this study. The nOTSP differs from the TSP in its travelling characteristic: it seeks the shortest path while visiting only a certain number of the cities rather than all of them. In the past, GAs have been successfully applied to many different variants of the TSP. However, the GA often suffers from a tendency to converge towards local optima, also known as premature convergence, caused by insufficient genetic diversity within the population. To overcome this drawback, the GA needs to be improved so as to prevent the loss of genetic diversity and to ensure sufficient diversity in the algorithm. The improvement proposed here is to adopt a characteristic of a new swarm-based algorithm named SSO. SSO was proposed to overcome the drawback of PSO, which easily suffers from premature convergence. Based on previous studies, SSO has proven to have better performance in terms of convergence rate and accuracy, and to be simpler to apply, efficient and flexible in solving classification problems. This success is due to its special characteristic, called the SUM, which works as a mechanism that keeps the solution up to date once a better solution is found. Therefore, owing to SSO's advantage in terms of convergence rate, this research intends to develop an improved GA-based algorithm by adopting the SUM into the GA. This modification is expected to improve the performance of the algorithm in terms of genetic diversity and to discover good solutions for the nOTSP. The approach and the development process are outlined in the next chapter.
CHAPTER 3
RESEARCH METHODOLOGY
3.1
Introduction
This chapter discusses the methodology that is used in this study. Likewise, this chapter presents the various procedures and strategies for the development of the nOTSP topology and proposed algorithm. This includes the details of the methodology which involves the topology creation and the manipulation of the algorithm. Furthermore, this chapter also describes clearly the procedure of the experiment settings, parameters used and the approach for algorithm validation through performance evaluation. The process flow is explained in detail in the following sections.
3.2
Research Framework
The whole process consists of four stages. The research started with the development of the nOTSP topology, followed by the development of the Genetic Simplified Swarm Algorithm (GSSA); this stage includes the description and processes of the proposed algorithm. Next, the setup of the entire experiment is described. Lastly, this chapter also discusses the process of performance evaluation. A summary of the research is drawn at the end of this chapter to facilitate the validation that will be conducted in Chapter 4. The overall methodology of this research is simplified into a step-by-step process as shown in Figure 3.1.
Figure 3.1: Research process.
3.3
nOTSP Topology Development
At the beginning of the research, an understanding of the idea of the nOTSP is important. Hence, the formulation of the nOTSP is constructed based on the idea of the OTSP. This section describes the development process of the nOTSP. The process starts by generating a set of coordinates that represent the cities, followed by mapping each node onto the Cartesian coordinate plane. Next, the starting and ending points are determined. Lastly, all the nodes are arranged in matrix form for further implementation.
3.3.1
Nodes Generation
In this study, each city is represented by a node and each node is denoted by a pair of coordinates. Therefore, a set of fixed coordinates is randomly generated using the following equation:
$$Node_{x,y} = a \cdot \mathrm{rand}(m, n) \qquad (3.1)$$
where:
$Node_{x,y}$ represents the cities located on the x-axis and y-axis of a Cartesian coordinate system;
rand returns pseudorandom values drawn from the standard uniform distribution on the open interval (0, 1);
a denotes the upper limit of the generated coordinates, i.e., the coordinates are generated between 0 and a (without a, the coordinates would lie only between 0 and 1); and
(m, n) refers to an m-by-n matrix, where m is the number of rows (the number of cities) and n is the number of columns (the dimensions).
In this study, computer-generated nodes are used to simulate actual cities in the real world. This approach was also used in the study of Singh & Lodhi (2013), where the algorithms were tested on computer-generated TSP topologies with 20, 40, 60, 70 and 90 nodes. The same approach was used by Dorigo & Gambardella (2014), where the 50 nodes used in their test case were randomly generated by computer. In this study, the total number of nodes is set to 50 as a test case to mimic actual cities in the real world. A set of 50 nodes is generated according to equation 3.1, with a set to 50, m set to 50 and n set to 2. The outcome of equation 3.1 is shown in Table 3.1, where all 50 nodes are arranged in the x- and y-coordinates of the Cartesian coordinate system. Later, all the nodes are mapped onto the Cartesian coordinate plane. The details of the process are discussed in the next sub-section.
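For illustration, a minimal MATLAB sketch of this node-generation step under the stated settings (a = 50, m = 50 and n = 2) is shown below; the plotting commands simply mimic the mapping described in the next sub-section.

```matlab
% Generate 50 random nodes according to equation 3.1 (a = 50, m = 50, n = 2).
a = 50; m = 50; n = 2;
nodes = a * rand(m, n);                 % each row is one city: [x-coordinate, y-coordinate]
plot(nodes(:, 1), nodes(:, 2), 'o');    % map the nodes on the Cartesian coordinate plane
xlabel('x'); ylabel('y');
```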
Table 3.1: Generated coordinates of each node.
3.3.2
Nodes Mapping
In this stage, all the generated nodes are mapped onto the two-dimensional Cartesian coordinate plane, following the approach used by Pizlo et al. (2006), Goldbarg et al. (2008), Haxhimusa et al. (2011), Sur et al. (2013) and Šeda (2015). The purpose of node mapping is to plot the coordinates so as to simulate the locations of the cities. With this approach, the location of each city can be seen clearly, and the generated path can also be identified easily during the experiment. Figure 3.2 presents the nodes mapped on the Cartesian coordinate plane.
Figure 3.2: Nodes mapped in Cartesian coordinate plane.
3.3.3
Determining the Starting and Ending Points
In the nOTSP, starting and ending points must be determined. This study treats node 1 and node 50 as the starting and ending points, respectively. However, a problem arises, as can be observed in Figure 3.3: the starting point (6.5961, 4.5380), which belongs to node 1, and the ending point (8.0033, 1.3317), which belongs to node 50, appear to be close to each other. This happens because the coordinates of the nodes are randomly generated by computer, so the chance of the generated starting and ending points being far apart is low. This seems unreasonable in the context of real-world vehicle routing, where the starting and ending points are usually far apart. Therefore, it is necessary to manipulate the starting and ending points manually in this study. The manipulation begins by moving the starting point from (6.5961, 4.5380) to (0.00, 0.00), followed by changing the ending point from (8.0033, 1.3317) to (50.0, 50.0), as shown in Figure 3.4. Figure 3.5 presents the new locations of the starting and ending points on the two-dimensional Cartesian coordinate plane after the manipulation is performed.
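A sketch of this manual manipulation, assuming the nodes matrix from the generation step above, is simply:

```matlab
% Force the starting and ending points far apart, as described above.
nodes = 50 * rand(50, 2);        % generated nodes (Section 3.3.1)
nodes(1, :)  = [0.0, 0.0];       % node 1: starting point moved to (0, 0)
nodes(50, :) = [50.0, 50.0];     % node 50: ending point moved to (50, 50)
```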
Figure 3.3: The starting and ending points in the Cartesian coordinate plane.
Figure 3.4: Nodes position before and after the manipulation of starting and ending points.
Figure 3.5: New location of the starting and ending points.
3.3.4
nOTSP Creation
After the number of cities (nodes) has been set to m = 50 and the starting and ending points have been fixed, the creation of the nOTSP begins. The creation starts from a matrix of the Euclidean distances between the cities to be visited by the travelling salesman. The Euclidean distance $d_{ij}$ between any two cities with coordinates $(x_i, y_i)$ and $(x_j, y_j)$ is calculated by equation 3.2 (Al-Dulaimi & Ali, 2008; Panwar & Gupta, 2013).
$$d_{ij} = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2} \qquad (3.2)$$
Based on equation 3.2, the calculated Euclidean distances between every pair of cities are arranged in the form of a symmetric distance matrix (Al-Dulaimi & Ali, 2008; Ahmed, 2010). An example of the symmetric distance matrix is shown in Table 3.2. The nOTSP is a permutation problem whose objective is to find the shortest tour on an undirected graph. In a GA or GA-based algorithm, each city can be represented as a gene in a chromosome x, and all the chromosomes are arranged horizontally in matrix form. First, a population of chromosomes is created as a p-by-n matrix, where p represents the population size and n represents the number of visited cities. Figure 3.6 shows how genes and chromosomes are arranged in matrix form.
Table 3.2: The symmetric distance-matrix.
Figure 3.6: Population of chromosomes in matrix form.
Hence, the genes are randomly permuted and arranged horizontally into the matrix until p rows are reached. Figure 3.7 shows how randomly permuted chromosomes with n genes are arranged in the matrix. Each chromosome in the population represents a possible shortest path. For example, the first row of the permuted chromosomes is [44, 6, 18 … 33, 28], which represents the tour that departs from city 44 to city 6, city 18 and so forth, and finally to city 33 and city 28. The total distance is then calculated using the summation in equation 3.3, where $d_{ij}$ is the distance between nodes i and j, and $x_{ij}$ is the decision variable associated with the assignment of nodes i and j. When $x_{ij} = 1$, node i is assigned to position j, meaning that the route has a stretch going from city i to city j, and $x_{ij} = 0$ otherwise (Zhang, 2009; Matai et al., 2010; Kai & Mingrui, 2012).

$$\sum_{i=1}^{n} \sum_{j=1}^{n} d_{ij}\, x_{ij} \qquad (3.3)$$
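A minimal MATLAB sketch of equations 3.2 and 3.3 is given below: it builds the symmetric distance matrix and sums the distances along one randomly permuted chromosome. How the fixed starting and ending points are attached to the visiting sequence is not shown here and is left out of the sketch.

```matlab
% Build the symmetric distance matrix (equation 3.2) and evaluate one chromosome (equation 3.3).
nodes = 50 * rand(50, 2);                         % city coordinates (Section 3.3.1)
m = size(nodes, 1);
d = zeros(m, m);
for i = 1:m
    for j = 1:m
        d(i, j) = sqrt((nodes(i, 1) - nodes(j, 1))^2 + ...
                       (nodes(i, 2) - nodes(j, 2))^2);        % equation 3.2
    end
end
n = 10;                                           % number of visited cities
chrom = randperm(m, n);                           % one randomly permuted chromosome
tourLen = 0;
for k = 1:(n - 1)
    tourLen = tourLen + d(chrom(k), chrom(k + 1));            % summation of equation 3.3
end
```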
Figure 3.7: Arrangement of the first and second chromosomes with n genes in a matrix.
3.4
Genetic Simplified Swarm Algorithm (GSSA)
This section presents the development of the proposed algorithm, GSSA. GSSA can be described as a hybridization of the GA with SSO's unique characteristic, called the solution update mechanism (SUM). The main purpose of the SUM is to update the pbest and gbest by comparison against a generated random number. Therefore, this research adopts this unique characteristic into the GA in order to optimize the nOTSP in terms of finding the shortest tour.
Figure 3.8: The flowchart of the SSO (Bae et al., 2012).
The original SSO was initially designed for classification and data mining. Thus, some modifications need to be performed on the SUM in order to fit it into the GA for route optimization. Figure 3.8 shows the flowchart of SSO, where the column shaded in red refers to the SUM that is to be modified and employed in the GA. The original GA, by contrast, is much simpler than SSO. The GA operates through four important operators, namely encoding, selection, crossover and mutation, which work together in seeking the optimal solution. Figure 3.9 shows the flowchart of the GA.
Figure 3.9: The flowchart of the GA (Pham & Karaboga, 2012).
Even though GSSA is a GA-based algorithm, the crossover operator is deliberately eliminated from the algorithm. This idea was inspired by several recent studies in which GAs designed without a crossover operator also achieved better performance (refer to Chapter 2, Section 2.4.2). Further details regarding the development of GSSA are presented in Section 3.4.1.
3.4.1
GSSA Development
This section describes the whole process of modifying the SUM and integrating it into the GA in detail. The process begins by deciding which mutation operators will be adopted in the SUM. Many types of mutation operators have been developed and employed in GAs for solving different problems in different areas. In this research, the inversion, displacement and pairwise swap mutation operators are used; these three are common mutation operators that have been used in the past to solve TSPs (Albayrak & Allahverdi, 2011; Singh & Lodhi, 2013).
The development starts with embedding the mutation operators into the SUM. In the original SUM, there are three decision points where the conditions for decision making take place; this can be seen clearly in Figure 3.10. A mutation operator is placed before each of the decision points. An evaluator is then placed between each mutation operator and its decision point for fitness evaluation. Figure 3.11 shows the new SUM after the modification is performed.
Figure 3.10: SUM with three decision points.

Figure 3.11: New SUM after modification was performed.
Secondly, because SSO was originally designed for and applied to classification problems and data mining, the original comparison strategy needs to be changed to fit this permutation problem, the nOTSP. The nOTSP is an optimization problem that seeks the shortest tour; in other words, it is a minimisation problem. The comparison strategy in the original SUM works by comparing the predetermined constants (Cw, Cp, Cg) with a computer-generated random number R in order to update the solutions (refer to Chapter 2, Section 2.6.1). The new comparison strategy proposed here is much simpler. In the nOTSP, the fitness is measured by comparing the currently found minimum value with the minimum value previously discovered by the algorithm; once the current minimum is lower than the previous one, the previous value is replaced by the current value. This principle is adopted as the comparison strategy in the SUM for optimizing the nOTSP. Meanwhile, the current best solution (pbest) and global best solution (gbest) of SSO are renamed in GSSA as the new fitness (newFit) and global fitness (gloFit), respectively. Lastly, the new SUM is embedded into the GA as shown in Figure 3.12. In the GA, the encoding and selection operators remain in the algorithm. The encoding operator used in GSSA is permutation encoding, which permutes the genes (cities) to form a population of chromosomes (possible solutions) at the beginning of the process. These chromosomes are evaluated and taken over by the selection operator. The selection operator implemented in GSSA is tournament selection, the most popular selection method in GAs owing to its efficiency and simple implementation (Noraini & Geraghty, 2011). It operates by picking two individuals from the population and staging a tournament to determine which individual is selected: the fitter candidate proceeds to the next process, while the worse candidate is eliminated. The complete GSSA flowchart is shown in Figure 3.13.
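A minimal MATLAB sketch of the binary tournament selection described above is shown below; the population and fitness values are placeholders standing in for the encoded chromosomes and their tour lengths.

```matlab
% Binary tournament selection: two random competitors, the shorter tour wins.
p = 6;  n = 10;
pop = zeros(p, n);
for r = 1:p
    pop(r, :) = randperm(50, n);        % placeholder chromosomes (visiting sequences)
end
fitVal = rand(p, 1) * 100;              % placeholder tour lengths for each chromosome
selected = zeros(p, n);
for s = 1:p
    c = randi(p, 1, 2);                 % pick two individuals at random
    if fitVal(c(1)) <= fitVal(c(2))     % the fitter (shorter-distance) candidate is kept
        selected(s, :) = pop(c(1), :);
    else
        selected(s, :) = pop(c(2), :);
    end
end
```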
Figure 3.12: Embedding process of the new SUM into the GA.
The flowchart proceeds from initialization (p, n, MaxGen and gloFit) through permutation encoding, fitness evaluation and selection, and then through the inversion, displacement and pairwise swap mutations in turn, each followed by a fitness evaluation in which gloFit is replaced by newFit whenever newFit < gloFit, until MaxGen is reached.
Figure 3.13: The flowchart of Genetic Simplified Swarm Algorithm.
3.4.2

Process of GSSA

GSSA is modified from the GA and at the same time carries a characteristic of SSO. GSSA contains a population of chromosomes, genes, a fitness function, encoding, selection and mutation. GSSA begins by creating a set of solutions, represented by chromosomes, called the population. The population is evaluated and the best chromosomes are selected according to their fitness to form new solutions in the next process. The selected chromosomes are then brought into the SUM. The process is repeated until the maximum generation is reached. The overall process of GSSA is outlined below.
A Genetic Simplified Swarm Algorithm (GSSA) Pseudocode:

Step 1 [Start]: Initialize the population size (p), gene size (n), maximum generation (MaxGen) and global fitness value (gloFit).
Step 2 [Encoding]: Permute the genes into sequences, called chromosomes, to form the population.
Step 3 [Evaluation]: Evaluate the fitness of each chromosome in the population and update gloFit.
Step 4 [Selection]: Select chromosomes from the population according to their fitness. The fitness is determined by the minimum distance of the route found; the shorter the distance, the higher the chance of being selected.

// Solution Update Mechanism (SUM): Steps 5 to 7
Step 5 [Inversion mutation]: Invert the sub-chromosome between two selected genes of each selected chromosome.
        [Evaluation]: Evaluate the fitness of the mutated chromosome. If (newFit < gloFit), then gloFit = newFit; else, proceed to Step 6.
Step 6 [Displacement mutation]: Mutate the chromosome by pulling the first selected gene out of the chromosome, reinserting it in a different place, and sliding the sub-chromosome down to form a new chromosome.
        [Evaluation]: Evaluate the fitness of the mutated chromosome. If (newFit < gloFit), then gloFit = newFit; else, proceed to Step 7.
Step 7 [Pairwise swap mutation]: Mutate the chromosome by swapping the positions of two randomly chosen genes.
        [Evaluation]: Evaluate the fitness of the mutated chromosome. If (newFit < gloFit), then gloFit = newFit; else, keep the value of gloFit.
Step 8 [Termination criterion]: If MaxGen is reached, stop; else, go to Step 5.

Algorithm 3.1: Genetic Simplified Swarm Algorithm
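The following MATLAB sketch illustrates Steps 5 to 7 for a single chromosome: the three mutation operators and the SUM-style acceptance rule. The evaluate handle is a placeholder for the tour-length computation of equation 3.3, not code taken from this thesis.

```matlab
% Inversion, displacement and pairwise swap mutations with SUM-style acceptance (sketch).
chrom = randperm(50, 10);                    % one chromosome (visiting sequence)
evaluate = @(c) rand;                        % placeholder fitness function (assumption)
gloFit = evaluate(chrom);

% Step 5: inversion mutation - reverse the sub-chromosome between two genes.
pts = sort(randperm(numel(chrom), 2));
c1 = chrom;  c1(pts(1):pts(2)) = fliplr(c1(pts(1):pts(2)));

% Step 6: displacement mutation - pull one gene out and reinsert it elsewhere.
from = randi(numel(chrom));  gene = chrom(from);
c2 = chrom;  c2(from) = [];
to = randi(numel(c2) + 1);
c2 = [c2(1:to-1), gene, c2(to:end)];

% Step 7: pairwise swap mutation - swap two randomly chosen genes.
pts = randperm(numel(chrom), 2);
c3 = chrom;  c3(pts) = c3(fliplr(pts));

% SUM acceptance: keep the first mutated chromosome that improves gloFit.
for cand = {c1, c2, c3}
    newFit = evaluate(cand{1});
    if newFit < gloFit
        gloFit = newFit;  chrom = cand{1};
        break;                               % a better solution ends the current pass
    end
end
```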
3.5
Experimental Setup
The application of GSSA to the nOTSP is implemented using MATLAB. A personal computer with a Core i5 2.30 GHz processor and 8 GB of RAM, running Windows 7, is used in this experiment. This study involves five important parameters: the population size p, the maximum generation MaxGen, the global fitness value (or global distance) gloFit, the number of cities m and the number of visited cities n. These five parameters are divided into two categories: fixed parameters and adjusted parameters. Fixed parameters keep the same values throughout the execution; in this study, m is set to 50 and gloFit is initialized to Inf (infinity). The adjusted parameters are those that are changed during the process according to the experimental criteria, namely n, MaxGen and p. A summary of the fixed and adjusted parameters is shown in Table 3.3.
Table 3.3: Fixed and adjusted parameters.

Fixed parameters:
  m (No. of cities): 50
  gloFit (Global fitness): Inf (infinity)
Adjusted parameters:
  n (No. of visited cities): 10, 20, 30 and 40
  MaxGen (Maximum generation): 30, 150, 250 and 500
  p (Population): 1000, 2000, 3000, 4000 and 5000

3.5.1

Pre-processing: Determining the MaxGen
Pre-processing is a preliminary operation that needs to be performed before the actual experiment takes place. The purpose of the pre-processing in this study is to determine the maximum generation, MaxGen, for every number of visited cities, n. As n increases, more iterations are required to achieve a good solution; an excessive limit, however, increases the computational time unnecessarily, since the algorithm would spend superfluous time executing insignificant processes. Conversely, with insufficient iterations the algorithm may be unable to achieve a good solution. Therefore, the study has to ensure that the number of iterations is sufficient for the algorithm to discover the best solution. For these reasons, the pre-processing is executed to obtain an appropriate MaxGen for each n before proceeding to the actual experiment. At the beginning of the pre-processing, the fixed parameters are set in advance. MaxGen is then set to 1000 and the algorithm is executed 10 times for each n, with n = 10, 20, 30 and 40. During the pre-processing, each run stops once the best solution for that n has been achieved, and the number of iterations for each run is recorded; the average number of iterations is calculated and the highest iteration is noted. The outcome of the pre-processing is shown in Table 3.4, where the column shaded in red represents the maximum generation reached when the optimal solution was found, and the column shaded in green represents the average number of iterations over the 10 executions. From the pre-processing results shown in Table 3.4, the average number of generations over the 10 runs was approximately 18, and the maximum number of generations needed to reach the solution was 27. Owing to the stochastic behaviour of GSSA, it is possible for the algorithm to reach the optimal solution at a higher iteration such as 28, 29 or 30; therefore, MaxGen for n = 10 is set to 30. This ensures that the number of iterations is sufficient to discover the solutions while eliminating unnecessary computation time. The same procedure was applied to n = 20, 30 and 40 to predetermine their MaxGen values. The full pre-processing results are given in Appendix B, while an overview of the predetermined MaxGen values is tabulated in Table 3.5: the break iterations for n = 10, 20, 30 and 40 are 27, 146, 248 and 496 respectively, and MaxGen is set to 30, 150, 250 and 500.
Table 3.4: Pre-processing results for n = 10.

Parameter settings: p = 1000, MaxGen = 1000, m = 50, n = 10

Run        Break iteration   Optimal distance
1          26                113.9851
2          10                99.8721
3          23                107.1206
4          17                113.6920
5          13                109.9806
6          14                106.9810
7          27                97.9614
8          15                113.7892
9          21                114.9823
10         18                119.6341
Average    18                109.7998
Table 3.5: Pre-processing results and MaxGen for n = 10, 20, 30 and 40.

n     Break iteration   Average break iteration   MaxGen
10    27                18                        30
20    146               103                       150
30    248               218                       250
40    496               436                       500

3.5.2

Parameter Setting and Data Collection
After the pre-processing was completed, the MaxGen for each n was determined and tabulated. Each n was then executed 10 times for each population size p, with p = 1000, 2000, 3000, 4000 and 5000, and the average distance over the 10 runs was calculated and recorded. The same settings were applied to the GA without a crossover operator (GA-XX) and the GA with a crossover operator (GA-1X) for the purpose of comparing their performances.
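A sketch of this data-collection loop is given below; runGSSA is a hypothetical placeholder standing in for the implemented algorithm, and only the loop structure and parameter values are taken from the text.

```matlab
% Run the algorithm 10 times per (n, p) setting and record the average distance (sketch).
runGSSA = @(n, p, maxGen) rand * 100;        % placeholder for the GSSA implementation (assumption)
nList      = [10 20 30 40];                  % numbers of visited cities
maxGenList = [30 150 250 500];               % MaxGen per n, from Table 3.5
pList      = [1000 2000 3000 4000 5000];     % population sizes
avgDist = zeros(numel(nList), numel(pList));
for a = 1:numel(nList)
    for b = 1:numel(pList)
        d = zeros(1, 10);
        for run = 1:10
            d(run) = runGSSA(nList(a), pList(b), maxGenList(a));
        end
        avgDist(a, b) = mean(d);             % average distance over the 10 runs
    end
end
```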
3.6
Performance Evaluation
The experiment focuses on two aspects of performance evaluation: the optimal solution achieved and the influence of the population size on the solution. To ensure a fair comparison, GA-XX and GA-1X, which use the same three mutation operators, were selected for the performance comparison with GSSA. In GA-1X, the single-point crossover operator was used. The characteristics of GSSA, GA-XX and GA-1X are detailed in Table 3.6.
Table 3.6: Characteristics of the algorithms.

Algorithm   Crossover operator       Mutation operators
GSSA        None                     Inversion, displacement and pairwise swap
GA-XX       None                     Inversion, displacement and pairwise swap
GA-1X       Single-point crossover   Inversion, displacement and pairwise swap

3.6.1

Optimal Solution and Average
The optimal solution in this study is defined as the shortest distance found by the algorithms. Each algorithm is executed 10 times for each number of visited cities n and each population size p. During the executions, the shortest distances obtained by the algorithm on the nOTSP are identified, and the best average distance is then calculated. From the results, the best algorithm is identified based on the shortest average distance it has found.
3.6.2
Influence of Population Size on Algorithm
Sallabi & El-Haddad (2009) had conducted an experiment to see the effects of population size on the performance of their proposed algorithm, named improved genetic algorithm (IGA) in TSP. However, researchers argued that a small population size could guide the algorithm to poor solutions (Pelikan et al., 2000; Piszcz & Soule, 2006; Koumousis & Katsaras, 2006; Roeva et al., 2013). In order to validate this statement, this research also carries out the studies on influence of the population size toward the algorithm. For this analysis, the experiment has executed the algorithms on nOTSP with different population size p=1000, 2000, 3000, 4000 and 5000 for each visited city n. This method was also used by Sallabi & El-Haddad (2009) to evaluate the performances of the algorithm as the population size changed.
3.7
Chapter Summary
This chapter has discussed the methodology used in this research. The chapter began with the development of a new variant of the TSP, the nOTSP; in this study the cities are represented as nodes, with each node plotted on the two-dimensional Cartesian coordinate plane. The development of GSSA starts by modifying the SUM, changing its comparison strategy and embedding the three mutation operators into it; the new SUM is then embedded into the GA. The process of GSSA has been described in detail. This chapter has also discussed how the experiment was set up and implemented, including the pre-processing conducted to identify the MaxGen for each number of visited cities n and the parameter settings for the whole experiment before the actual experiment was carried out. Furthermore, the method used for performance evaluation has been discussed in detail: the performance of GSSA is validated through comparison with GA-XX and GA-1X based on the shortest distance found. The performance and results of GSSA are presented in the next chapter.
CHAPTER 4
ANALYSIS AND RESULT
4.1
Introduction
This chapter undertakes further examination to validate the efficiency and accuracy of the proposed algorithm, the Genetic Simplified Swarm Algorithm (GSSA), on the n-Cities Open Loop Travelling Salesman Problem (nOTSP). The performance of GSSA is compared against the GA without a crossover operator (GA-XX) and the GA with a crossover operator (GA-1X). The performance evaluation in this research focuses on two criteria: the shortest paths found and the influence of the population size on the algorithm in finding the shortest path on the nOTSP. Each algorithm was implemented with the same procedure and data set discussed in Chapter 3. The performance analysis results of GSSA, GA-XX and GA-1X are annotated, and a summary of the research is presented in the last section of this chapter.
4.2
Performance Analysis and Results
The detailed computational results for GSSA, GA-XX and GA-1X on the nOTSP can be found in Appendix A. The following sections present the results obtained by each algorithm, followed by the average distances of the algorithms and the influence of the population size on the algorithms. Lastly, the performance of the algorithms is discussed.
4.2.1
Shortest Path and Average Distance
Each algorithm was executed 10 times, and the shortest distance for every data set was determined. All the shortest distances are recorded in Table 4.1, where the bold values indicate the shortest distance for each population size. The results clearly show that GSSA discovered the shortest distances in almost all of the data sets. For n = 20, 30 and 40, with p = 1000, 2000, 3000, 4000 and 5000, GSSA was outstanding in generating the shortest distance. Nevertheless, the results of GSSA for n = 10 with p = 1000 and p = 5000 were not ideal compared with GA-XX, where GA-XX performed better than GSSA. This happens because of the behaviour of GSSA as a stochastic algorithm when dealing with a large-scale problem: GSSA picks up random potential solutions and evaluates them according to its random behaviour (James, 2003), so there is no guarantee that such an algorithm, based on random choices, will always find the global optimum (Collet & Rennard, 2007). On the whole, however, GSSA showed comprehensively better performance.
Table 4.1: The shortest distances discovered from the executed 10 runs using GSSA, GA-XX and GA-1X.

n     Algorithm   p=1000    p=2000    p=3000    p=4000    p=5000
10    GSSA        98.003    95.289    96.446    87.922    91.0164
      GA-XX       94.617    89.760    94.373    88.511    101.177
      GA-1X       130.389   113.874   124.686   115.113   122.958
20    GSSA        157.972   149.713   158.063   152.154   139.064
      GA-XX       163.740   162.133   164.168   165.537   146.251
      GA-1X       292.819   299.943   281.290   287.963   291.253
30    GSSA        198.922   190.485   189.101   205.128   204.465
      GA-XX       213.169   198.643   202.315   207.311   207.633
      GA-1X       498.971   481.285   477.500   495.247   496.988
40    GSSA        243.425   235.712   228.921   236.649   239.535
      GA-XX       251.667   239.178   249.184   249.995   251.859
      GA-1X       679.322   714.039   696.956   660.416   682.911
The shortest distances discovered by the algorithms for every n are summarized in Table 4.2. The percentage error was calculated to measure the difference between the discovered shortest distance and the optimal distance (Bahaabadi et al., 2012). The percentage error is computed as in equation 4.1.

$$\text{Error}\,(\%) = \frac{d_{\text{shortest}} - d_{\text{optimal}}}{d_{\text{optimal}}} \times 100 \qquad (4.1)$$
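For example, applying equation 4.1 to GA-XX at n = 20 gives (146.251 - 139.064)/139.064 × 100 ≈ 5.168%, the value reported in Table 4.2.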
Table 4.2: The shortest distances discovered by GSSA, GA-XX and GA-1X for n = 10, 20, 30 and 40.

n     Algorithm   Shortest distance   Error (%)   Optimal distance
10    GSSA        87.922              0           87.922
      GA-XX       89.76               0.021
      GA-1X       113.874             29.517
20    GSSA        139.064             0           139.064
      GA-XX       146.251             5.168
      GA-1X       281.29              102.279
30    GSSA        189.101             0           189.101
      GA-XX       198.643             5.046
      GA-1X       477.5               152.51
40    GSSA        228.921             0           228.921
      GA-XX       239.178             4.481
      GA-1X       660.416             188.49
From the experiment, all the shortest-distance tours for GSSA, GA-XX and GA-1X at n = 10, 20, 30 and 40 are tabulated in Table 4.2. The results show that GSSA successfully discovered the shortest distance in all the nOTSP instances n = 10, 20, 30 and 40 compared to GA-XX and GA-1X; these shortest distances are shaded in green in Table 4.2. These shortest-distance values were used as the optimal distances to calculate the percentage error in this experiment. Since all the optimal distances were obtained by GSSA, the percentage error for GSSA in all instances n is 0%. The percentage errors obtained by GA-XX for n = 10, 20, 30 and 40 are 0.021%, 5.168%, 5.046% and 4.481%, respectively. GA-1X, however, did not perform well in any of the four nOTSP instances: the calculations show that its percentage error increased drastically as n increased, with values of 29.517%, 102.279%, 152.51% and 188.49% for n = 10, 20, 30 and 40, respectively.
Table 4.3: Average distance over 10 runs using GSSA, GA-XX and GA-1X.

n     Algorithm   p=1000    p=2000    p=3000    p=4000    p=5000
10    GSSA        107.23    100.994   101.658   96.686    95.865
      GA-XX       113.584   112.511   105.342   106.946   104.419
      GA-1X       144.732   133.404   132.311   130.761   131.565
20    GSSA        172.704   169.517   167.28    165.464   154.146
      GA-XX       178.467   174.248   175.114   178.381   169.444
      GA-1X       315.246   310.311   298.123   303.507   298.444
30    GSSA        214.061   212.557   211.482   215.406   211.02
      GA-XX       229.145   224.96    219.822   219.309   225.971
      GA-1X       524.062   522.027   504.504   514.896   512.536
40    GSSA        257.427   246.475   245.556   247.43    246.636
      GA-XX       261.962   257.345   260.275   256.422   261.82
      GA-1X       725.898   732.294   731.897   712.668   711.656
In terms of the average distance, the averages over 10 runs using GSSA, GA-XX and GA-1X were calculated and are shown in Table 4.3. The average distance was calculated comprehensively in order to see the performance and capacity of each algorithm. The bold values in Table 4.3 represent the shortest average distance for each population size p, and the bold values in red represent the shortest average distance for the entire n. Table 4.3 clearly shows that GSSA obtained the best average distances for all nOTSP instances n = 10, 20, 30 and 40 with population sizes p = 1000, 2000, 3000, 4000 and 5000. Furthermore, in terms of the shortest average distances, GSSA achieved the shortest average distance for every instance n compared with GA-XX and GA-1X; the best average distances obtained by GSSA for n = 10, 20, 30 and 40 are 95.865, 154.146, 211.02 and 245.556, respectively. In conclusion, GSSA outperformed GA-XX and GA-1X for all nOTSP instances n and population sizes p.
4.2.2
Discussion on Algorithms’ Performance
The results obtained from the experiments clearly illustrate the performance of the algorithms. GSSA has the most outstanding performance compared with GA-XX and GA-1X. There are four reasons why GSSA performs better than the others: the new characteristic of the algorithm, adequate genetic diversity, freedom from the destructive effect of the crossover operator, and a low tendency to produce crossing paths. These reasons are detailed in the following sub-sections.
4.2.2.1 New Characteristic of the Algorithm
GSSA is a GA-based algorithm that incorporates a characteristic of Simplified Swarm Optimization (SSO). This characteristic, called the Solution Update Mechanism (SUM), was modified by embedding the three mutation operators into it in order to avoid the loss of genetic diversity in the population. In each iteration, if no better solution has been found, the SUM produces more possible solutions through the mutation operators: once a better solution is found, the SUM ends the current iteration and proceeds to the next one; otherwise it carries the chromosomes to the next mutation operator to be mutated again. With this characteristic, the variety of solutions is increased; a greater variety of solutions means more possible solutions, and therefore a higher probability of discovering the best solution in the population.
4.2.2.2 Adequate Genetic Diversity
One of the most important factors determining the performance of a GA or GA-based algorithm is genetic diversity, and the new characteristic of GSSA influences the genetic diversity in the population. A lack of genetic diversity is considered the main cause of premature convergence: with insufficient diversity the algorithm converges prematurely in a suboptimal state, whereas a GA or GA-based algorithm can find the solution if the population has enough diversity. Maintaining the diversity of the population is therefore crucial to ensure that the algorithm explores the search space adequately. In the experimental results, GSSA discovered better optimal solutions (shorter optimal distances) than those found by GA-XX and GA-1X. A further reason why GA-XX and GA-1X were unable to explore and seek better solutions is that a suboptimal state had been reached, so the genetic operators could no longer produce children that outperform their parents. This indicates that GSSA provides sufficient population diversity in the solution space to thoroughly explore and discover the optimal solution.
4.2.2.3 Destructive Effect of Crossover Operator
The crossover operator is not necessarily destructive in all applications; its effect depends very much on the problem and on its coordination with the other operators. In this study, however, the results show that GA-1X was not particularly effective and that the crossover can be highly destructive for good solutions, while GA-XX and GSSA, without a crossover operator, still managed to find better solutions. This destructive effect caused by the crossover operator is known as the cloning effect (Senaratna, 2005); Figure 4.1 illustrates how it happens. The cloning effect tends to produce child chromosomes that are similar to their parents. As this duplication occurs, the rate of producing better chromosomes over the generations is reduced, which leads to the loss of genetic diversity in the population and causes premature convergence. Therefore, the decision to eliminate the crossover in GSSA is crucial to avoid the cloning effect on the chromosomes.
Figure 4.1: Cloning effect caused by crossover operator.
4.2.2.4 Crossing Path
In route or path optimization, especially in TSPs, the total distance of a crossing path is always greater than that of a non-crossing path (Applegate et al., 2011; Yong et al., 2012). Figure 4.2 shows a simulation of a non-crossing path and a crossing path. Crossing paths arise from insufficient genetic diversity and the cloning effect mentioned previously. From the results, the optimal distances produced by GSSA were much shorter than those of GA-1X, followed by GA-XX. This indicates that the tendency of GSSA to produce crossing paths is lower than that of GA-XX and GA-1X.
Figure 4.2: Simulation of nOTSP’s paths. (a) Non-crossing path obtained from GSSA and (b) Crossing path obtained from GA-XX.
4.3
The Influence of the Population Size on Algorithm
Population sizing has been one of the most important topics in evolutionary computation (Piszcz & Soule, 2006; Diaz-Gomez & Hougen, 2007). It is one of the most significant parameters of the algorithms, since it has a direct influence on their search abilities. Some researchers have claimed that increasing the population size leads to better solutions (Rylander, 2002); in other words, an insufficient population size can lead to premature convergence and non-optimized results (Bakar & Mahadzir, 2010). Table 4.4 shows the influence of the population size on GSSA, GA-XX and GA-1X; the values shaded in green represent the best solution for each n. As can be seen in Table 4.4, GSSA produced 75% of its best results at the largest population size, p = 5000, for the instances n = 10, 20 and 30. GA-XX produced 50% of its best results at the largest population size (n = 10 and 20), and GA-1X produced only 25% of its best results at the largest population size. The results clearly show that not all of the best results fall at the largest population size. This happens because the stochastic features of the algorithms may allow the search to escape a local optimum and eventually approach the global optimum at a smaller population size. On the other hand, another trend can be observed from the results: all of the best results obtained by the three algorithms for every instance n fall between p = 3000 and p = 5000. This further shows that better solutions are more easily found with larger population sizes; hence, increasing the population size tends to give better solutions, while an insufficient population size leads to premature convergence.
Table 4.4: Influence of the population size on the algorithms.

Algorithm: GSSA
n     p=1000    p=2000    p=3000    p=4000    p=5000
10    107.23    100.994   101.659   96.686    95.865
20    172.704   169.517   167.28    165.464   154.146
30    214.061   212.557   211.482   215.406   211.02
40    257.427   246.475   245.557   247.43    246.636

Algorithm: GA-XX
n     p=1000    p=2000    p=3000    p=4000    p=5000
10    113.584   112.511   105.342   106.946   104.42
20    178.467   174.248   175.115   178.381   169.444
30    229.145   224.960   219.822   219.31    225.971
40    261.962   257.345   260.275   256.423   261.820

Algorithm: GA-1X
n     p=1000    p=2000    p=3000    p=4000    p=5000
10    144.732   133.404   132.311   130.761   131.565
20    315.246   310.311   298.123   303.507   298.444
30    524.062   522.027   504.504   514.896   512.5367
40    725.898   732.294   731.897   712.668   711.657
All the numerical experiments on the influence of the population size on the results of GSSA, GA-XX and GA-1X are summarized in Figures 4.3, 4.4 and 4.5. From this section, it can be concluded that a gradual increase in the population size does not lead to a gradual improvement in the solution. The study nevertheless shows that a larger population size can lead to better solutions than a smaller one. With regard to the optimal population size, population sizes of 3000, 4000 and 5000 have a higher feasibility of discovering the optimal solution.
Figure 4.3: Influence of the population size on the GSSA (average distance versus population size, p = 1000 to 5000, for n = 10, 20, 30 and 40).
Figure 4.4: Influence of the population size on the GA-XX (average distance versus population size, p = 1000 to 5000, for n = 10, 20, 30 and 40).
Figure 4.5: Influence of the population size on the GA-1X (average distance versus population size, p = 1000 to 5000, for n = 10, 20, 30 and 40).
4.4
Chapter Summary
The results obtained in this study show that GSSA was much more successful in terms of the optimal solution and the best average distance when compared with GA-XX and GA-1X. This proves that the SSO characteristic, the SUM, embedded in GSSA enhanced the process of searching the solution space and increased the ability to find optimal solutions. With regard to the influence of the population size on the algorithm, this study found that larger population sizes can lead to better solutions than smaller ones; however, the results did not show that all optimal solutions fall at the largest population size. Nevertheless, the range of population sizes within which the algorithms discover the optimal solution was identified as p = 3000, 4000 and 5000. The next chapter summarizes and concludes this study and lists some recommendations for future work.
CHAPTER 5
CONCLUSION
5.1
Introduction
In the previous chapter, the performance of the proposed algorithm, the Genetic Simplified Swarm Algorithm (GSSA), was compared with the GA without a crossover operator (GA-XX) and the GA with a one-point crossover operator (GA-1X). The results showed that GSSA outperformed GA-XX and GA-1X in terms of generating the optimal solution, the best average distance and the influence of the population size. This chapter summarizes the outcomes of this research. Section 5.2 discusses the research contributions, followed by the conclusions of this research in Section 5.3. Lastly, some suggestions for future work are given in Section 5.4.
5.2
Research Contributions
This research has studied the Travelling Salesman Problem (TSP) based on practical daily transportation cases. Inspired by these cases, a new variant of the Open Loop Travelling Salesman Problem (OTSP) was developed. In addition, an improved genetic algorithm (GA)-based algorithm was developed in order to optimize the problem in terms of finding the shortest tour. The contributions of this research are discussed in detail in the following sub-sections.
5.2.1
New Variant of TSP
This research proposed a new variant of the OTSP that mimics real-life transportation scenarios, which may not be exactly as pictured in the TSP and the OTSP. In daily practical cases, a vehicle often travels from a starting point and ends its journey at another destination without visiting all the given cities, travelling only to a certain number of cities with the minimum total travelling distance. Therefore, a new variant of the OTSP, named the n-Cities Open Loop Travelling Salesman Problem (nOTSP), was conceived and successfully developed in this research.
5.2.2
Improved GA with SSO’s Characteristic
The GA is a common classical technique for optimizing TSPs. Unfortunately, it often suffers from a tendency to converge towards local optima, caused by insufficient genetic diversity within the population. In order to overcome this drawback, this research successfully adopted the unique characteristic of Simplified Swarm Optimization (SSO) into the GA. This characteristic, called the Solution Update Mechanism (SUM), plays the role of providing adequate diversity to the algorithm. The SUM was modified by embedding the three mutation operators (inversion, displacement and pairwise swap) for application to the optimization problem. The resulting algorithm was named the Genetic Simplified Swarm Algorithm (GSSA) and was implemented on the nOTSP to find the shortest tour.
5.2.3
Performance of Proposed Algorithm-GSSA
In order to verify the performance of GSSA, it was compared with GA-XX and GA-1X. The same data sets were run on all three algorithms and the results were recorded. The performance evaluation was carried out based on the discovered optimal solutions and the average solutions. The experimental results show that the proposed algorithm is an efficient approach to finding the shortest path compared with GA-XX and GA-1X.
5.2.4
Influence of the Population Size
The influence of the population size on the algorithm was investigated as the population size increased. The results clearly show that increasing the population size of the GSSA algorithm tends to produce better solutions. In addition, the range of population sizes within which the algorithm discovers the optimal solution was identified as p = 3000, 4000 and 5000; in other words, these population sizes have a higher feasibility of producing better solutions for this problem application.
5.3
Conclusion
This study has constructed a new extension of the TSP, named the nOTSP, inspired by real-life transportation issues. A new GA-based algorithm with an SSO characteristic (the SUM) was then developed to optimize the nOTSP. From the experimental results, GSSA outperformed GA-XX and GA-1X in terms of discovering the optimal solution, and the calculated average solutions show that GSSA also performs better than GA-XX and GA-1X. GSSA performs better because the new characteristic implemented in the algorithm leads to adequate genetic diversity; GSSA is also free from the destructive effect of the crossover operator and has a low tendency to produce crossing paths. Together, these factors lead to the better performance of the algorithm. On the other hand, increasing the population size did not show a large improvement in the results; the study nevertheless shows that a larger population size can lead the algorithm to produce better solutions, although not all of the best solutions fall at the largest population size. All the good solutions were found within the range of population sizes 3000 to 5000; within this range, sufficient diversity increases the feasibility of discovering good solutions. Overall, the study successfully met all the objectives stated in Chapter 1, yet there is still room for future expansion. The recommendations are given below:
i.
Employ other mutation operators in the algorithm, such as the Greedy Sub Tour Mutation (GSTM) proposed by Albayrak & Allahverdi (2011).
ii.
Improve or modify the structure of the algorithm in order to decrease its computational complexity and make it easier to implement.
iii.
Invent a new metaheuristic permutation algorithm to tackle the nOTSP or other permutation problems such as the knapsack problem, Sudoku solving and magic cube solving.
iv.
SSO has previously been used for classification; perhaps it can also be modified and "tailor-made" for the nOTSP.
REFERENCES
Abdoun, O. & Abouchabaka, J. (2012). A Comparative Study of Adaptive Crossover Operators for Genetic Algorithms to Resolve the Traveling Salesman Problem. arXiv preprint arXiv:1203.3097.
Ahmed, Z. H. (2010). Genetic algorithm for the traveling salesman problem using sequential constructive crossover operator. International Journal of Biometrics & Bioinformatics, 3(6), pp. 96.
Ahmed, H. & Glasgow, J. (2012). Swarm Intelligence: Concepts, Models and Applications. Technical Report. Kingston: Queen's University.
Albayrak, M., & Allahverdi, N. (2011). Development a new mutation operator to solve the Traveling Salesman Problem by aid of Genetic Algorithms. Expert Systems with Applications, 38(3), pp. 1313-1320.
Al-Dulaimi, B. F., & Ali, H. A. (2008). Enhanced traveling salesman problem solving by genetic algorithm technique (TSPGA). World Academy of Science, Engineering and Technology, 2(2), pp. 296-302.
Alexander, S. (2005). On the history of combinatorial optimization (till 1960). Handbooks in Operations Research and Management Science: Discrete Optimization.
Andrade, C. E., Miyazawa, F. K., & Resende, M. G. (2013, July). Evolutionary algorithm for the k-interconnected multi-depot multi-traveling salesmen problem. Proceedings of the fifteenth annual conference on Genetic and evolutionary computation, pp. 463-470. ACM.
Applegate, D. L., Bixby, R. E., Chvatal, V., & Cook, W. J. (2011). The Traveling Salesman Problem: A Computational Study. Princeton University Press.
Arya, V., Goyal, A., & Jaiswal, V. (2014). An Optimal Solution to Multiple Travelling Salesperson Problem using Modified Genetic Algorithm. International Journal of Application or Innovation in Engineering & Management, 3(1).
Ausiello, G., Crescenzi, P., Gambosi, G., Kann, V., Marchetti-Spaccamela, A., & Protasi, M. (2012). Complexity and approximation: Combinatorial optimization problems and their approximability properties. Springer Science & Business Media.
Bae, C., Yeh, W. C., Wahid, N., Chung, Y. Y. & Liu, Y. (2012). A new simplified swarm optimization (SSO) using exchange local search scheme. International Journal of Innovative Computing, Information and Control, 8(6), pp. 4391-4406.
Bahaabadi, M. R., Mohaymany, A. S. & Babaei, M. (2012). An efficient crossover operator for travelling salesman. International Journal of Optimization in Civil Engineering, 2(4), pp. 607-619.
Bakar, N. A., & Mahadzir, M. F. (2010). The Impact of Population Size on Knowledge Acquisition in Genetic Algorithms Paradigm: Finding Solutions in the Game of Sudoku. Knowledge Management International Conference 2010.
Beni, G. & Wang, J. (1989). Swarm intelligence in cellular robotic systems. NATO Advanced Workshop on Robots and Biological Systems.
Blum, C. & Roli, A. (2003). Metaheuristics in combinatorial optimization: Overview and conceptual comparison. ACM Computing Surveys, 35(3), pp. 268-308.
Blum, C. & Merkle, D. (2008). Swarm intelligence: Introduction and Applications. Natural Computing. Berlin: Springer.
Chen, S. M., & Chien, C. Y. (2011a). Solving the traveling salesman problem based on the genetic simulated annealing ant colony system with particle swarm optimization techniques. Expert Systems with Applications, 38(12), pp. 14439-14450.
Chen, S. M., & Chien, C. Y. (2011b). Parallelized genetic ant colony systems for solving the traveling salesman problem. Expert Systems with Applications, 38(4), pp. 3873-3883.
Chong, E. K., & Zak, S. H. (2013). An introduction to optimization. Vol. 76. John Wiley & Sons.
Chung, Y. Y. & Wahid, N. (2012). A hybrid network intrusion detection system using simplified swarm optimization (SSO). Applied Soft Computing, pp. 3014-3022.
Čičková, Z., Brezina, I., & Pekár, J. (2013). Open Travelling Salesman Problem with time window. First Logistics International Conference, Belgrade.
Collet, P., & Rennard, J. P. (2007). Stochastic optimization algorithms. arXiv preprint arXiv:0704.3780.
Colorni, A., Dorigo, M., Maniezzo, V. & Trubian, M. (1994). Ant system for job-shop scheduling. Belgian Journal of Operations Research, Statistics and Computer Science, 34(1), pp. 39-53.
Colorni, A., Dorigo, M., & Maniezzo, V. (1992). An Investigation of some Properties of an "Ant Algorithm". PPSN, 92, pp. 509-520.
Cook, W. J., Cunningham, W. H., Pulleyblank, W. R., & Schrijver, A. (2009). Combinatorial optimization.
Dantzig, G., Fulkerson, R., & Johnson, S. (1954). Solution of a large-scale traveling-salesman problem. Journal of the Operations Research Society of America, 2(4), pp. 393-410.
Dantzig, G. B. & Ramser, J. H. (1959). The truck dispatching problem. Management Science, 6(1), pp. 80-91.
Diaz-Gomez, P. A., & Hougen, D. F. (2007). Initial Population for Genetic Algorithms: A Metric Approach. GEM, pp. 43-49.
Dong, G., Guo, W. W., & Tickle, K. (2012). Solving the traveling salesman problem using cooperative genetic ant systems. Expert Systems with Applications, 39(5), pp. 5006-5011.
Dorigo, M., Maniezzo, V. & Colorni, A. (1991). Positive feedback as a search strategy. Dipartimento di Elettronica.
Dorigo, M. & Maniezzo, V. (1992). Optimization, Learning and Natural Algorithms. Ph.D. Thesis, Politecnico di Milano, Italy.
Dorigo, M., Birattari, M. & Stutzle, T. (2006). Ant colony optimization. Computational Intelligence Magazine, IEEE, 1(4), pp. 28-39.
Dorigo, M., & Gambardella, L. M. (2014). Ant-Q: A reinforcement learning approach to the traveling salesman problem. Proceedings of ML-95, Twelfth International Conference on Machine Learning, pp. 252-260.
Dwivedi, V., Chauhan, T., Saxena, S., & Agrawal, P. (2012). Travelling Salesman Problem using Genetic Algorithm. IJCA Proceedings on Development of Reliable Information Systems, Techniques and Related Issues (DRISTI 2012), (1), 25.
Eberhart, R. & Kennedy, J. (1995). A new optimizer using particle swarm theory. Proceedings of the Sixth International Symposium on Micro Machine and Human Science, pp. 39-43.
El-Gharably, N. E., El-Kilany, K. S., & El-Sayed, A. E. (2013). Optimization Using Simulation of the Vehicle Routing Problem. Proceedings of World Academy of Science, Engineering and Technology (WASET), 78(7), pp. 1573.
Fletcher, R. (2013). Practical methods of optimization. John Wiley & Sons.
Filip, E., & Otakar, M. (2011). The travelling salesman problem and its application in logistic practice. WSEAS Transactions on Business and Economics, 8(4), pp. 163-173.
Fogel, D. B. (1993). Empirical estimation of the computation required to reach approximate solutions to the travelling salesman problem using evolutionary programming. Proceedings of the 2nd Annual Conference on Evolutionary Programming, 685, pp. 56-61.
Fogel, D. B., & Atmar, J. W. (1990). Comparing genetic operators with Gaussian mutations in simulated evolutionary processes using linear systems. Biological Cybernetics, 63(2), pp. 111-114.
Fogel, D. B. (2006). Evolutionary computation: toward a new philosophy of machine intelligence. Vol. 1. John Wiley & Sons.
Fogel, L. J., Owens, A. J., & Walsh, M. J. (1966). Artificial intelligence through simulated evolution. John Wiley & Sons.
Gao, Y. X., Wang, Y. M., & Pei, Z. L. (2012). An improved particle swarm optimisation for solving generalised travelling salesman problem. International Journal of Computing Science and Mathematics, 3(4), pp. 385-393.
Geetha, R. R., Bouvanasilan, N. & Seenuvasan, V. (2009). A perspective view on Travelling Salesman Problem using genetic algorithm. Nature & Biologically Inspired Computing, 2009 World Congress, pp. 356-361.
George, A., Rajakumar, B. R., & Binu, D. (2012). Genetic algorithm based airlines booking terminal open/close decision system. Proceedings of the International Conference on Advances in Computing, Communications and Informatics, pp. 174-179.
Ghosh, D. (2012). A diversification operator for genetic algorithms. OPSEARCH, 49(3), pp. 299-313.
Goldberg, D. E., & Holland, J. H. (1988). Genetic algorithms and machine learning. Machine Learning, 3(2), pp. 95-99.
Goldberg, D. E. (1989). Genetic algorithms in search, optimization, and machine learning. Vol. 412. Reading, Menlo Park: Addison-Wesley.
Goldbarg, E. F., de Souza, G. R., & Goldbarg, M. C. (2008). Particle swarm optimization algorithm for the traveling salesman problem. INTECH Open Access Publisher.
Gomez, A., & Salhi, S. (2014). Solving capacitated vehicle routing problem by artificial bee colony algorithm. Computational Intelligence in Production and Logistics Systems (CIPLS), 2014 IEEE Symposium on, pp. 48-52. IEEE.
Gould, N. (2006). An introduction to algorithms for continuous optimization. Oxford University Computing Laboratory Notes.
Gupta, D., & Ghafir, S. (2012). An overview of methods maintaining diversity in genetic algorithms. International Journal of Emerging Technology and Advanced Engineering, 2(5), pp. 56-60.
Haxhimusa, Y., Carpenter, E., Catrambone, J., Foldes, D., Stefanov, E., Arns, L., & Pizlo, Z. (2011). 2D and 3D traveling salesman problem. The Journal of Problem Solving, 3(2), pp. 8-35.
Hiot, L. M. (2010). Adaptation, Learning, and Optimization: Handbook of Swarm Intelligence: Concepts, Principles and Applications. Vol. 8. Springer.
Hlaing, Z. C. S. S., & Khine, M. A. (2011). An ant colony optimization algorithm for solving traveling salesman problem. International Conference on Information Communication and Management, 16, pp. 54-59.
Holland, J. (1975). Adaptation in Natural and Artificial Systems: An Introduction with application to biology, control and artificial intelligence. Cambridge: University of Michigan Press.
Holland, J. H., Booker, L. B., Colombetti, M., Dorigo, M., Goldberg, D. E., Forrest, S., & Wilson, S. W. (2000). What is a learning classifier system? Learning Classifier Systems, pp. 3-32. Springer Berlin Heidelberg.
James, C. S. (2003). Introduction to Stochastic Search and Optimization. Johns Hopkins University, Applied Physics Laboratory, USA.
Kai, A., & Mingrui, X. (2012). A Simple Algorithm for Solving Travelling Salesman Problem. Instrumentation, Measurement, Computer, Communication and Control (IMCCC), 2012 Second International Conference on, pp. 931-935. IEEE.
Karaboga, D. (2005). An idea based on honey bee swarm for numerical optimization. Technical report-tr06, Erciyes University, Engineering Faculty, Computer Engineering Department.
Karp, R. M. (1972). Reducibility among combinatorial problems. Complexity of Computer Computations. Springer US, pp. 85-103.
Keeton, K., Kelly, T., Merchant, A., Santos, C. A., Wiener, J. L., Zhu, X., & Beyer, D. (2007). Don't Settle for Less Than the Best: Use Optimization to Make Decisions. HotOS.
Kennedy, J. & Eberhart, R. (1995). Particle swarm optimization. Neural Networks, 1995. Proceedings of IEEE International Conference, Vol. 4, pp. 1942-1948.
Khajehzadeh, M., Taha, M. R., El-Shafie, A. & Eslami, M. (2011). A Survey on Meta-Heuristic Global Optimization Algorithms. Research Journal of Applied Sciences, Engineering and Technology, Maxwell Scientific Organization, 3(6), pp. 569-578.
Khan, F. H., Khan, N., Inayatullah, S., & Nizami, S. T. (2009). Solving TSP problem by using genetic algorithm. International Journal of Basic & Applied Sciences, 9(10), pp. 79-88.
Király, A., & Abonyi, J. (2010). A novel approach to solve multiple traveling salesmen problem by genetic algorithm. Computational Intelligence in Engineering. Springer Berlin Heidelberg, pp. 141-151.
Koumousis, V. K. & Katsaras, C. P. (2006). A saw tooth genetic algorithm combining the effects of variable population size and reinitialization to enhance performance. Transactions on Evolutionary Computation, IEEE, 10(1), pp. 19-28.
Li, W. H., Li, W. J., Yang, Y., Liao, H. Q., Li, J. L., & Zheng, X. P. (2011). Artificial bee colony algorithm for traveling salesman problem. Advanced Materials Research, 314, pp. 2191-2196.
Liu, Y. H. (2008). Solving the probabilistic travelling salesman problem based on genetic algorithm with queen selection scheme. INTECH Open Access Publisher.
Liu, S. (2014). A powerful genetic algorithm for traveling salesman problem. arXiv preprint arXiv:1402.4699.
Luke, S. (2012). Essentials of Metaheuristics. Retrieved on June 01, 2013, from http://cs.gmu.edu/~sean/book/metaheuristics.
Malhotra, R., Singh, N. & Singh, Y. (2011). Genetic algorithms: Concepts, design for optimization of process controllers. Computer and Information Science, 4(2), pp. 39-54.
Malik, S., & Wadhwa, S. (2014). Premature Convergence in Genetic Algorithm Using Elite Selection Scheme: Review Paper.
Matai, R., Singh, S. & Mittal, M. L. (2010). Traveling Salesman Problem: an Overview of Applications, Formulations, and Solution Approaches. Traveling Salesman Problem, Theory and Applications, InTech.
Maredia, A. (2010). History, Analysis, and Implementation of Traveling Salesman Problem (TSP) and Related Problems. Doctoral dissertation, University of Houston.
Miller, C. E., Tucker, A. W., & Zemlin, R. A. (1960). Integer programming formulation of travelling salesman problem. Journal of the ACM (JACM), 7(4), pp. 326-329.
Mishra, E. A., Das, M. N. & Panda, T. C. (2013). Swarm Intelligence Optimization: Editorial Survey. International Journal of Emerging Technology and Advanced Engineering, 3(1), pp. 207-230.
Nagata, Y., & Soler, D. (2012). A new genetic algorithm for the asymmetric traveling salesman problem. Expert Systems with Applications, 39(10), pp. 8947-8953.
Nallusamy, R. (2013). A new approach to solve multiple traveling salesman problems. Ph.D. Thesis. Anna University, Chennai, India.
Negnevitsky, M. (2011). Artificial intelligence: a guide to intelligent systems. Pearson Education.
Noraini, M. R., & Geraghty, J. (2011). Genetic algorithm performance with different selection strategies in solving TSP. Proceedings of the World Congress on Engineering, Vol. 2.
Ouaarab, A., Ahiod, B., & Yang, X. S. (2014). Discrete cuckoo search algorithm for the travelling salesman problem. Neural Computing and Applications, 24(7-8), pp. 1659-1669.
Panwar, P., & Gupta, S. (2013). Brief Survey of Soft Computing Techniques Used for Optimization of TSP. International Journal of Computer Science, 3(6), pp. 376-380.
Parsopoulos, K. E. & Vrahatis, M. N. (2010). Particle swarm optimization and intelligence: advances and applications. pp. 1-328.
Patrascu, M. (2015). Genetically enhanced modal controller design for seismic vibration in nonlinear multi-damper configuration. Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering, 229(2), pp. 158-168.
Pedro, O., Saldanha, R., & Camargo, R. (2013). A tabu search approach for the prize collecting traveling salesman problem. Electronic Notes in Discrete Mathematics, 41, pp. 261-268.
Pelikan, M., Goldberg, D. E. & Cantu-Paz, E. (2000). Bayesian optimization algorithm, population sizing, and time to convergence. Illinois Genetic Algorithms Laboratory, University of Illinois, Tech.
Pham, D., & Karaboga, D. (2012). Intelligent optimisation techniques: genetic algorithms, tabu search, simulated annealing and neural networks. Springer Science & Business Media.
Philip, A., Taofiki, A. A., & Kehinde, O. (2011). A genetic algorithm for solving travelling salesman problem. International Journal of Advanced Computer Science and Applications, 2(1).
Piszcz, A. & Soule, T. (2006). Genetic programming: Optimal population sizes for varying complexity problems. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 953-954.
Pizlo, Z., Stefanov, E., Saalweachter, J., Li, Z., Haxhimusa, Y., & Kropatsch, W. G. (2006). Traveling salesman problem: A foveating model. Journal of Problem Solving, 1, pp. 83-101.
Roeva, O., Fidanova, S., & Paprzycki, M. (2013). Influence of the population size on the genetic algorithm performance in case of cultivation process modelling. Computer Science and Information Systems (FedCSIS), 2013 Federated Conference, IEEE, pp. 371-376.
Royo, B., Sicilia, J. A., Oliveros, M. J., & Larrodé, E. (2015). Solving a Long-Distance Routing Problem using Ant Colony Optimization. Appl. Math, 9(2L), pp. 415-421.
Rylander, S. G. B. (2002). Optimal population size and the genetic algorithm. Population, 100, No. 400, pp. 900.
Saleh, A. H. (2014). Constraint reasoning with local search for continuous optimization. Master's Thesis, The European Master's Program in Computational Logic, Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa.
Saleh, H. A. & Chelouah, R. (2004). The design of the global navigation satellite system surveying networks using genetic algorithms. Engineering Applications of Artificial Intelligence, Vol. 17, pp. 111-122.
Sallabi, O. M., & El-Haddad, Y. (2009). An Improved Genetic Algorithm to Solve the Traveling Salesman Problem. Proceedings of World Academy of Science: Engineering & Technology, 52.
Schrijver, A. (2003). Combinatorial optimization: Polyhedra and efficiency. Vol. 24. Springer Verlag.
Senaratna, N. I. (2005). Genetic algorithms: The crossover-mutation debate. Degree of Bachelor of Computer Science of the University of Colombo.
Šeda, M. (2015). Computational Geometry and Heuristic Approaches for Location Problems. International Conference of Electrical, Automation and Mechanical Engineering (EAME 2015).
Singh, S., & Lodhi, E. A. (2013). Study of Variation in TSP using Genetic Algorithm and Its Operator Comparison. International Journal of Soft Computing and Engineering (IJSCE), ISSN 2231-2307.
Singh, A., & Singh, R. (2014). Exploring Travelling Salesman Problem using Genetic Algorithm. International Journal of Engineering Research & Technology, 3(2), ISSN 2278-0181.
Sivanandam, S. N., & Deepa, S. N. (2007). Principle of Soft Computing. John Wiley & Sons.
Sivaraj, R., & Ravichandran, T. (2011). A review of selection methods in genetic algorithm. International Journal of Engineering Science and Technology, 3(5), pp. 3792-3797.
Sur, C., Sharma, S., & Shukla, A. (2013). Solving Travelling Salesman Problem Using Egyptian Vulture Optimization Algorithm - A New Approach. Language Processing and Intelligent Information Systems, 7912, pp. 254-267. Springer Berlin Heidelberg.
Tang, L., Liu, J., Rong, A., & Yang, Z. (2000). A multiple traveling salesman problem model for hot rolling scheduling in Shanghai Baoshan Iron & Steel Complex. European Journal of Operational Research, 124(2), pp. 267-282.
Thibert-Plante, X., & Charbonneau, P. (2007). Crossover and evolutionary stability in the Prisoner's Dilemma. Evolutionary Computation, 15(3), pp. 321-344.
To, C. C., & Vohradsky, J. (2007). A parallel genetic algorithm for single class pattern classification and its application for gene expression profiling in Streptomyces coelicolor. BMC Genomics, 8(1), pp. 49.
Toth, P., & Vigo, D. (2014). Vehicle Routing: Problems, Methods, and Applications. Vol. 18. SIAM.
Vashisht, V. (2013). Open Loop Travelling Salesman Problem using Genetic Algorithm. International Journal of Innovative Research in Computer and Communication Engineering, 1(1), pp. 112-116.
Wakabayashi, K., Watanabe, A., Toyotani, J., Suzuki, K., Murata, K., & Sala-ngam, S. (2014). A Study on the Optimum Location of the Central Post Office in Bangkok: Applying the Travelling Salesman Problem. In Logistics Operations, Supply Chain Management and Sustainability, pp. 525-537. Springer International Publishing.
Wang, X., Liu, D., & Hou, M. (2013). A novel method for multiple depot and open paths, Multiple Traveling Salesmen Problem. Applied Machine Intelligence and Informatics (SAMI), 2013 IEEE 11th International Symposium, pp. 187-192.
Wang, Y., Tian, D., & Li, Y. (2013). An improved simulated annealing algorithm for traveling salesman problem. Proceedings of the 2012 International Conference on Information Technology and Software Engineering. Springer Berlin Heidelberg, pp. 525-532.
Yan, X., Zhang, C., Luo, W., Li, W., Chen, W., & Liu, H. (2012). Solve traveling salesman problem using particle swarm optimization algorithm. International Journal of Computer Science, 9(2012), pp. 264-271.
Yang, X. S. (2008). Introduction to mathematical optimization: From linear programming to metaheuristic. Cambridge International Science Publishing.
Yang, X. S. (2009). Harmony search as a metaheuristic algorithm. In Music-inspired harmony search algorithm. Springer Berlin Heidelberg.
Yang, X. S. (2010). A new metaheuristic bat-inspired algorithm. Nature Inspired Cooperative Strategies for Optimization 2010. Springer Berlin Heidelberg, pp. 65-74.
Yang, X. S. (2010). Firefly algorithm, Levy flights and global optimization. Research and Development in Intelligent Systems XXVI. Springer London, pp. 209-218.
Yang, X. S. (2012). Nature-Inspired Metaheuristic Algorithms: Success and New Challenges. arXiv preprint arXiv:1211.6658.
Yang, X. S. & Rafael Parpinelli (Ed.) (2012). Swarm-Based Metaheuristic Algorithms and No-Free-Lunch Theorems. ISBN: 978-953-51-0364-6, InTech. Retrieved on June 01, 2013 from http://www.intechopen.com/books/theory-and-new-applications-of-swarm-intelligence/swarm-based-metaheuristic-algorithms-and-no-free-lunch-theorems
Yang, L., Lin, K., Lin, S., Gao, X., & Ye, S. (2013). Application of Intelligence Algorithm on TSP. International Conference on Advanced Computer Science and Electronics Information.
Yao, W. Q. (2014). Genetic Quantum Particle Swarm Optimization Algorithm for Solving Traveling Salesman Problems. Fuzzy Information & Engineering and Operations Research & Management, 21(1), pp. 67-74. Springer Berlin Heidelberg.
Yeh, W. C., & Liu, S. L. (2008). A Discrete Particle Swarm Optimization for Evaluating the Multiple Multi-Level Redundancy Allocation Problem. 5th International Conference on Information Technology and Applications.
Yeh, W. C. (2009). A two-stage discrete particle swarm optimization for the problem of multiple multi-level redundancy allocation in series systems. Expert Systems with Applications, 36(5), pp. 9192-9200.
Yeh, W. C., Chang, W. W., & Chiu, C. W. (2011). A simplified swarm optimization for discovering the classification rule using microarray data of breast cancer. International Journal of Innovative Computing, Information and Control, 7(5), pp. 2235-2246.
Yong, S., Zenglu, L., Wenwei, L., Zhongkai, Y., Guangyun, L., & Jirong, X. (2012). The research and application on improved intelligence optimization algorithm based on knowledge base. Computer Science and Electronics Engineering (ICCSEE), 2012 International Conference, 3, pp. 661-665.
Zhang, J. (2009). Natural computation for the traveling salesman problem. Intelligent Computation Technology and Automation, 2009. ICICTA'09. Second International Conference on, pp. 366-369. IEEE.
Zhang, W. G., & Lu, T. Y. (2012). The research of genetic ant colony algorithm and its application. Procedia Engineering, 37, pp. 101-106.
Zheng, F., Simpson, A. R., & Zecchin, A. C. (2010). A method of assessing the performance of genetic algorithm optimization water distribution design. Proceedings of the 12th Water Distribution System Analysis Symposium, Tucson, Arizona.
Appendix A
Computational Results of GSSA, GA-XX and GA-1X
Appendix B
Pre-processing results for the predetermined MaxGen on n=10, 20, 30 and 40
Parameter settings: p=1000, MaxGen=1000, m=50.

n = 10
Run       Break Iteration   Optimal Distance
1         26                113.9851
2         10                99.8721
3         23                107.1206
4         17                113.6920
5         13                109.9806
6         14                106.9810
7         27                97.9614
8         15                113.7892
9         21                114.9823
10        18                119.6341
Average   18                109.7998

n = 20
Run       Break Iteration   Optimal Distance
1         146               187.5936
2         90                167.6426
3         88                158.6810
4         123               174.4582
5         145               154.3942
6         96                167.6010
7         102               156.8137
8         75                171.7060
9         80                166.6798
10        89                167.8263
Average   103               167.3405

n = 30
Run       Break Iteration   Optimal Distance
1         204               220.4562
2         248               197.6542
3         233               217.5830
4         215               223.8246
5         206               220.4826
6         245               207.8063
7         240               211.8961
8         214               209.6489
9         208               221.0350
10        170               201.6845
Average   218               213.2071

n = 40
Run       Break Iteration   Optimal Distance
1         482               263.1973
2         496               261.8246
3         390               253.6974
4         443               264.8090
5         356               243.8921
6         468               249.6319
7         409               257.1526
8         390               260.8681
9         440               250.3120
10        482               250.9458
Average   436               255.6331
VITA
The author was born on June 28, 1989 in Bintulu, Sarawak. His early education began at Sekolah Rendah Jenis Kebangsaan Chung Hwa in 1995 for primary school. In 2001 he continued his secondary education at Sekolah Menengah Kebangsaan Limbang. After completing secondary school in 2005, he attended Sekolah Menengah Kebangsaan Kubong for his pre-university studies from 2006 to 2008. Later, in June 2009, he pursued his first degree, the Bachelor of Information Technology and Multimedia, at Universiti Tun Hussein Onn Malaysia. After receiving his bachelor's degree in 2012, he began the Master of Information Technology at Universiti Tun Hussein Onn Malaysia in March 2013.