An Algorithm Development Environment for Problem-Solving - Software Review

Xianshun Chen

Intelligent Systems Centre, 50 Nanyang Drive, Research Techno Plaza Level 7, BorderX Block, Singapore 637553

[email protected]

Abstract

ADEP is a development platform catering to the needs of designing and exploring computationally viable configurations of metaheuristic algorithms. It is motivated by the lack of tools capable of capitalizing on the richness of memetic computing techniques that have surfaced in recent years. This software review article introduces the functional features of ADEP and describes the various utility modules within the ADEP metaheuristics framework, in particular the LVRP tree data structure, the configuration and simulation visualization tools, and automated configuration via the problem-driven learning engine.

Keywords: memetic algorithm, metaheuristics, algorithm design, automated algorithm configuration, combinatorial optimization, continuous optimization.

1 Introduction

Metaheuristics as a whole cogently support the overall scope and objectives of memetic computation [1,2]. Metaheuristics are commonly classified as population-based or trajectory-based. Examples of population-based metaheuristics include techniques and approaches based on Evolutionary Computation (such as the Genetic Algorithm (GA), Evolutionary Strategies (ES), and Evolutionary Programming (EP)) and the various algorithms based on Swarm Intelligence (such as Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and the Bees algorithm). Examples of trajectory-based metaheuristics include Tabu Search (TS) and Simulated Annealing (SA). The two categories have complementary properties: population-based metaheuristics allow for better exploration of the search space, while trajectory-based metaheuristics allow for greater intensification of the most promising search areas. Hybrids such as Memetic Algorithms (MAs) attempt to take advantage of the complementary nature of these two categories of search techniques. Metaheuristics are known for their efficiency and effectiveness in solving a wide range of NP-hard optimization problems. However, they tend to be problem-dependent and often require significant recoding to cater to different problems. Several of the metaheuristic approaches that have been most successful in handling difficult real-world combinatorial problems, such as GA, TS, and MA, rely upon the appropriate coordination of low-level heuristics that incorporate domain-specific knowledge (e.g., local search operators, constructive methods, and genetic operators). Their performance tends to vary significantly from one problem instance to another. Admittedly, for any given set of problems or problem instances whose characteristics are not known a priori, there is usually no easy way to ascertain which metaheuristics configuration is going to be the most efficient.
As such, configuring and tuning an effective search algorithm is often a trial-and-error endeavor. In order to fully capitalize on the potential of metaheuristics for solving real-life problems, profound knowledge of algorithm design and coding, coupled with domain knowledge specific to the problems being addressed, is an essential prerequisite [3]. Moreover, users may pose differing requirements and expectations on the performance of the algorithm, particularly for problems derived from real-life scenarios. These inadvertently lead to significant differences in the configurations of metaheuristics. Clearly, without the
necessary expertise in algorithm design, it is unlikely that a metaheuristics-based search technique with satisfactory performance can be achieved without expending significant time and computational resources to explore the myriad of possibilities. There have been many attempts at constructing software toolkits dedicated to designing and configuring various metaheuristics, as well as tools to support the various stages and aspects of systems development based on these techniques. These toolkits include libraries or frameworks that assist users in manually crafting metaheuristics. For example, iOpt [4] and EasyLocal++ [5] allow users to interactively configure metaheuristics. Other examples include HOTFRAME [6], Templar [7], LOCAL++ [8], and the work of [9]. Recent versions of Constraint Programming (CP) toolkits such as ECLiPSe and ILOG Solver have also integrated certain basic heuristic search methods such as Tabu Search and Simulated Annealing [10]. Most of these frameworks are implemented in either C/C++ or Java. On the other hand, toolkits such as Easy Specification of Evolutionary Algorithms (EASEA) [11] and the Evolutionary Algorithm Modeling Language (EAML) [12] allow scripting that specifies the retrieval of various EA procedures and routines from well-known EA libraries such as GAlib [13] and Evolving Objects [14]. In addition, there are simulation-based optimization environments such as the Hierarchical Evolutionary Engineering Design System (HEEDS) [15] that provide user-friendly interfaces for exploring viable optimization engines. In terms of problem modeling, some toolkits provide templates for users to define the problem model, while others such as iOpt provide a constraint-based problem modeling paradigm.
There are also domain-specific frameworks for problem modeling; for example, in the area of scheduling, frameworks such as DITOPS/OZONE [16], ASPEN, and ILOG Schedule have been developed with varying degrees of modeling capability. Although manually crafted metaheuristics (and hybrids) built using toolkits such as iOpt have demonstrated considerable success in a wide range of real-world applications, these algorithms have been crafted for the specific problem at hand. Furthermore, the process of manually designing and configuring tailor-made metaheuristics is tedious and time-consuming. High development cost aside, the source code is often not directly reusable for other problems. Naturally, many steps toward automating the design and configuration of metaheuristics have been taken in machine learning, adaptive algorithms, automatic parameter tuning, and related fields, in order to build systems that automatically evolve and discover optimal metaheuristics configurations with minimal human intervention. Examples include [17], in which a racing algorithm was proposed for configuring the parameters of metaheuristics, and NERO [18], which encodes neural networks as artificial chromosomes and uses a genetic algorithm to evolve the parameters and structure of the neural network. Genetic Programming [19], Hyper-Heuristics [20], and Meta-Lamarckian Learning [21] are among the most general approaches for automatically discovering metaheuristics configurations. Genetic Programming allows a solution method (e.g., a metaheuristics configuration) to be encoded as a tree-based artificial chromosome and then adopts genetic algorithm operations to evolve the chromosomes (i.e., the solution method programs). Genetic Programming optimizes the structure of the solution method for the target optimization problem.
On the other hand, Hyper-Heuristics (HH) and Meta-Lamarckian Learning (ML) explore the space of problem solvers, choosing from a range of low-level heuristics provided by the user. These techniques can be seen as forms of meta-systems with various metaheuristics control and reproduction mechanisms for solving the target optimization problem. The Algorithm Development Environment for Problem Solving (ADEP) attempts to address these metaheuristics design and configuration issues, providing a wide range of useful features, as discussed above, for both manual and automated configuration of metaheuristics. One motivating factor for the development of ADEP is the lack of appropriate tools that adequately support the various stages of optimization algorithm development for real-world applications. ADEP has an underlying metaheuristics framework and a novel LVRP tree data structure that allow the configuration of a variety of population-based and trajectory-based metaheuristics, including Genetic Algorithms, Tabu Search, Simulated Annealing, Particle Swarm Optimization, Ant Colony Optimization, Memetic Algorithms, and various local search heuristics, as well as the generation and compilation of optimized C/C++ and Java
code, or code in other optional "plugged-in" languages. In addition, ADEP provides a visually intuitive metaheuristics design and simulation environment for both combinatorial and continuous optimization problems, incorporating an interactive graphical interface for visualizing the metaheuristics configuration and run-time behavior. Furthermore, ADEP incorporates a problem-driven learning module that implements an automated metaheuristics configuration technique, generating specialized (meta)heuristics instances from generic (meta)heuristics templates and thereby enabling the automated configuration of efficient metaheuristics for the target optimization problem.

2 Metaheuristics Framework

The ADEP metaheuristics framework serves as a platform for the structured modularization of solutions for both combinatorial and continuous optimization problems. It covers trajectory-based metaheuristics such as SA and TS, population-based metaheuristics such as GA, PSO, and ACO, as well as hybrids such as MAs. In another sense, it also serves as a platform for crafting new variants of techniques and neighborhood structures. The object class hierarchy in the current version of the ADEP code library is depicted in Fig. 1 in UML notation.

Figure 1: ADEP framework main classes

The functionality of a typical metaheuristics algorithm can be broken down into components, from which component categories are defined and coded as interdependent classes that handle different aspects of metaheuristics search. Component categories are represented by interfaces such as Operator, Problem, Chromosome, and Algorithm, while specific components are defined as derived classes of these interfaces. A search algorithm can be assembled from these components, either built-in or derived extensions of the framework. The framework handles all the relationships between classes, as well as their interactions through mutual method invocations. The framework allows a high degree of flexibility in terms of data structures and domain variables through interfaces and templates, but imposes a fixed structure for controlling the execution flow. With ADEP's structured modularity, once the basic data structures and operations are defined and "plugged in", the system readily facilitates the direct implementation of all standard metaheuristics techniques and their variants. The classes in the framework are split into four categories:
Solution Representation: The classes Population, Individual, Chromosome, and the derived classes of Chromosome define the solution representation of an optimization problem handled within the framework. A Population object is an aggregate of one (for trajectory-based heuristics) or more (for population-based heuristics) Individual objects, each representing a candidate solution. An Individual object is an aggregate of one (for single-chromosome solution representations) or more (for multi-chromosome solution representations) Chromosome objects, which together represent the solution. The derived Chromosome classes enable solution representation in different problem domains. For example, Permutation Chromosome represents solutions in order-based combinatorial optimization problems; Continuous Value Chromosome represents solutions in continuous optimization problems; Binary Chromosome enables solution representation in both combinatorial and continuous optimization domains. The framework does not limit solution representation to fixed-length vectors of values or symbols. For example, a derived Chromosome object can be a variable-length vector (for Hyper-Heuristics) or a tree structure (for Genetic Programming).

Problem Modeling: The classes BaseProblem, Problem, and their derived classes define the problem modeling within the framework. BaseProblem serves as an abstract interface for an optimization problem. This interface allows the same metaheuristics to be used for a family of optimization problems without instituting separate functionality. The interface has two main overridable methods: readInput() for loading data from a file stream or other data source, and evaluate() for modeling the objective function of the optimization problem. BaseProblem also incorporates overridable interfaces for domain-specific implementations of various local search heuristics.
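A minimal sketch of how these classes might fit together. The class and method names (Chromosome, Individual, Population, BaseProblem, readInput(), evaluate()) follow the text, but the bodies, the TourProblem example, and the hard-coded instance are illustrative assumptions, not ADEP's actual implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Solution representation: a Population aggregates Individuals,
// and an Individual aggregates one or more Chromosomes.
abstract class Chromosome { }

// Derived chromosome for order-based combinatorial problems.
class PermutationChromosome extends Chromosome {
    final int[] order;
    PermutationChromosome(int[] order) { this.order = order; }
}

class Individual {
    final List<Chromosome> chromosomes = new ArrayList<>();
    double fitness;
}

class Population {
    final List<Individual> individuals = new ArrayList<>();
}

// Problem modeling: BaseProblem exposes overridable hooks for data
// loading and objective evaluation.
abstract class BaseProblem {
    abstract void readInput(String source);          // load instance data
    abstract double evaluate(Individual candidate);  // objective function
}

// Hypothetical user-defined problem: minimize the total tour length of
// a permutation over a distance matrix (a TSP-like model).
class TourProblem extends BaseProblem {
    double[][] dist;

    @Override void readInput(String source) {
        // A real application would parse a file or XML document here;
        // a tiny 3-city instance is hard-coded for illustration.
        dist = new double[][] {
            {0, 2, 9},
            {2, 0, 4},
            {9, 4, 0}
        };
    }

    @Override double evaluate(Individual candidate) {
        int[] tour = ((PermutationChromosome) candidate.chromosomes.get(0)).order;
        double total = 0;
        for (int i = 0; i < tour.length; i++) {
            total += dist[tour[i]][tour[(i + 1) % tour.length]];
        }
        return total;
    }
}

class Demo {
    public static void main(String[] args) {
        TourProblem p = new TourProblem();
        p.readInput(null);
        Individual ind = new Individual();
        ind.chromosomes.add(new PermutationChromosome(new int[]{0, 1, 2}));
        System.out.println("tour length = " + p.evaluate(ind)); // 2 + 4 + 9
    }
}
```

In this arrangement the framework-side control flow would call readInput() once and evaluate() for every candidate, while the user supplies only the problem-specific pieces.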
The target optimization problem can be modeled by coding derived classes of either BaseProblem or Problem. Such user-defined classes contain only problem-specific descriptions, and no control information for the algorithm. Classes such as XMLReader, XMLWriter, ExcelReader, and ExcelWriter are available inside the framework to allow manipulation of various data sources such as XML and Excel documents.

Procedural Components: The Operator class and its derived classes form the procedural components, which are the basic entities for building a metaheuristics algorithm. Often, such an Operator object represents a basic concept adopted by a metaheuristics technique, such as a neighborhood search heuristic. Each Operator object is implemented as a finite state machine (FSM) with its state specified by the mState member variable. The Operator object is also able to write data to the Algorithm object using its mSignal member variable, so that other Operator and Algorithm objects can be informed of changes initiated by the current Operator object. In the current version of ADEP, the search components group together the basic procedural methods required for single-solution heuristic searches, such as Solution Initialization, Acceptance Strategy, Neighborhood Search function, Tabu Search based Aspiration Criterion, and Simulated Annealing based Monte Carlo Perturbation, while for population-based heuristic searches they include procedural methods such as Initial Population Generation, Crossover, Mutation, Selection, and Population Restart. Most of the components are generic; some are specific to a particular solution representation, to cater for problems which may require them. The framework also allows users to generate and experiment with new combinations of search features in a graphically intuitive environment coupled with fast and efficient runtime prototyping capability. For example, in Fig.
1, the operator Simulated Annealing Local Search is a local search heuristic operator that encapsulates the Simulated Annealing mechanism, which can then be "plugged into" the execution flow of another metaheuristics technique such as Particle Swarm Optimization or a Memetic Algorithm.

Algorithm: The Algorithm class and its derived classes define metaheuristics within the framework. Many popular metaheuristics such as SA, TS, MA, GA, ACO, and PSO are
implemented in the framework as derived classes of Algorithm. A complete Algorithm is a valid tree of Operator objects, whereby the execution flow traces a path through the tree. The Algorithm object contains one (for trajectory-based metaheuristics and some population-based metaheuristics such as ACO and PSO) or more (for population-based metaheuristics such as GA or MA) Population objects. These are acted upon by the Operator procedural components of the Algorithm object, in a manner specified by the Problem object. The metaheuristics framework also supports the advanced Memetic Algorithms described in [22][23]. These algorithms can be classified based on the principles of universal Darwinism [24] for memetic systems, namely meme transmission, variation, and selection. The categorization of these algorithms in the framework is as follows:

1st Generation MAs: These are hybrid Genetic Algorithms, a marriage between a population-based global search (often in the form of an evolutionary algorithm) and a cultural evolutionary stage.

2nd Generation MAs: These MAs mimic the mechanisms of memetic transmission and meme selection in their design. Examples include Meta-Lamarckian [21], Hyper-Heuristics [20], and Multi-Meme [25] algorithms.

3rd Generation MAs: These MAs include co-evolution and self-generating MAs [26][27][28][29] that satisfy all three principles of universal Darwinism for a basic memetic evolution system.

The 1st generation MAs are the default implementation within the metaheuristics framework. The 2nd and 3rd generation MAs are supported in the framework via the Gadget interface of the Individual class. The whole framework is fully coded in the Java and C/C++ languages, with all the acknowledged benefits associated with these languages. In addition, ADEP allows other languages to be optionally "plugged in".
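The Operator-as-FSM design and the Algorithm-as-operator-tree structure described above might be sketched as follows. Only mState and mSignal are named in the text; the run/apply methods, the signal collection, and the LocalSearchOperator example are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// Each Operator is a small finite state machine: mState tracks its own
// progress, and mSignal lets it post messages that the Algorithm and
// sibling Operators can observe.
abstract class Operator {
    protected int mState = 0;
    protected String mSignal = "";
    final List<Operator> children = new ArrayList<>();

    // Apply this operator, then let the execution flow trace a path
    // down through the operator tree.
    void run(Algorithm algo) {
        apply(algo);
        algo.collectSignal(mSignal);
        for (Operator child : children) child.run(algo);
    }

    abstract void apply(Algorithm algo);
}

// A complete Algorithm is a valid tree of Operator objects acting on
// one or more Population objects (elided here).
class Algorithm {
    Operator root;
    final List<String> signals = new ArrayList<>();

    void collectSignal(String s) { if (!s.isEmpty()) signals.add(s); }

    void execute() { root.run(this); }
}

// A hypothetical improvement operator that flags progress via mSignal.
class LocalSearchOperator extends Operator {
    @Override void apply(Algorithm algo) {
        mState = 1;            // e.g., "search completed"
        mSignal = "improved";  // inform the rest of the tree
    }
}
```

A Simulated Annealing local search packaged this way could then be attached as a child node inside, say, a PSO or MA operator tree, which is how the "plug-in" hybridization described above works.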
Although the framework conveniently caters for extensions, it also maximizes code reuse to reduce the need for additional source code to be written for each new application. For example, many code modules are shared between different metaheuristics paradigms, such as TS and MA. Furthermore, the application development environment is enhanced cumulatively, in the sense that the framework is enriched with modules from each new application, subsequently further reducing the effort of developing similar applications. Being an extensively built framework, in most cases users are only required to provide the problem modeling, by coding the derived classes of either the BaseProblem or Problem class, and to make use of utilities within ADEP to complete the entire process of configuring metaheuristics, including code generation and compilation. These ADEP features are further discussed in the next section.

3 Metaheuristics Configuration

The process of deriving a set of metaheuristics configurations optimized for a specific real-world problem requires thorough understanding of and familiarity with the class of algorithms available. Profound experience, cumulative expertise, and sometimes intuition acquired through years of working in the related area help make the development process less daunting. This is especially true when dealing with state-of-the-art metaheuristics, which over the years have become more complex. To ease this configuration process, ADEP uses a novel LVRP tree structure for storing and manipulating metaheuristics configurations, coupled with an intuitive graphical configuration and simulation environment and an automated metaheuristics configuration engine.

3.1 LVRP Trees

During the process of configuring a search algorithm, the environment employs, for each procedural component, a tree data structure to represent the options available for that component. The data
structure is a Left Variation - Right Property (LVRP) tree of nodes representing the procedural choices associated with the search algorithm. Conceptually, the metaheuristics framework can be modeled as Φ, a set of LVRP trees: Φ = {T1, T2, ..., TL}, where L denotes the number of procedural components used in the metaheuristics. Each LVRP tree Tk is a representation of the options for configuring the kth procedural component, with nodes representing the variations and properties of an operator. Variations are mutually exclusive, meaning only one option may be selected; for example, in Fig. 4, only one among the 1-point, Uniform, or Order-1 leaf nodes will be selected as the Crossover. Properties, on the other hand, are mandatory operators or algorithmic parameters. Figures 2 to 7 illustrate the LVRP trees provided in the MA metaheuristics framework. The different shapes highlight the characteristics of a tree node; e.g., the star-shaped "ACO" node in Fig. 2 represents a constructive approach, which is differentiated from the normal elliptical representation of the other improvement-based initialization procedures under the "Population Initialization" variations.
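One way an LVRP node could be realized, keeping variation children mutually exclusive and property children mandatory. The class and field names here are assumptions based on the description above, and the Crossover example mirrors Fig. 4.

```java
import java.util.ArrayList;
import java.util.List;

// A Left Variation - Right Property (LVRP) tree node: variation children
// are mutually exclusive alternatives, while property children are
// mandatory sub-operators or algorithmic parameters.
class LvrpNode {
    final String name;
    final List<LvrpNode> variations = new ArrayList<>(); // pick exactly one
    final List<LvrpNode> properties = new ArrayList<>(); // always configured
    LvrpNode selected;   // the currently chosen variation, if any
    Object value;        // parameter value for leaf property nodes

    LvrpNode(String name) { this.name = name; }

    LvrpNode addVariation(LvrpNode child) { variations.add(child); return child; }
    LvrpNode addProperty(LvrpNode child)  { properties.add(child); return child; }

    // Selecting a variation replaces any previous choice,
    // enforcing mutual exclusivity.
    void select(LvrpNode choice) {
        if (!variations.contains(choice))
            throw new IllegalArgumentException("not a variation of " + name);
        selected = choice;
    }
}

class LvrpDemo {
    public static void main(String[] args) {
        // The Crossover sub-tree of Fig. 4: three variation leaves and a
        // mandatory Crossover Rate property (the 0.9 value is hypothetical).
        LvrpNode crossover = new LvrpNode("Crossover");
        LvrpNode onePoint = crossover.addVariation(new LvrpNode("1-point"));
        crossover.addVariation(new LvrpNode("Uniform"));
        crossover.addVariation(new LvrpNode("Order-1"));
        LvrpNode rate = crossover.addProperty(new LvrpNode("Crossover Rate"));
        rate.value = 0.9;
        crossover.select(onePoint); // only one variation may be active
    }
}
```

A configuration traversal then walks such trees node by node, recording the selected variation and the values of all property leaves along the path.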

Figure 2: LVRP of Population Initialization operator

Figure 3: LVRP of Population Evaluation operator

Figure 4: LVRP of Offspring Producer operator

Figure 5: LVRP of Individual Learning operator

Figure 6: LVRP of Population Update operator

Figure 7: LVRP for Termination Condition

During configuration, a user is guided through the tree structure node-by-node by a traversal algorithm. At each node, a selection is made from the available options; certain options determine the subsequent traversal path through the tree structure. This traversal process is illustrated in Fig. 8. This makes possible a generic software system with an open and friendly interface, enabling users to quickly configure a metaheuristics solution methodology. With LVRP trees, even complex hybrid metaheuristics can be modeled by combining LVRP sub-trees from two or more metaheuristics paradigms.
Figure 8: Traversal flow of a sample configuration for the Offspring Producer procedural component

3.2 Interactive Visualization

Besides sound algorithmic frameworks and effective configuration mechanisms to unlock the potential of optimizers, intuitive visualization provides useful feedback to guide the user in tuning and exploring viable alternatives. To address this, ADEP incorporates visualization components that allow users to graphically configure the structure and behavior of the metaheuristics. In addition, simulation visualization components facilitate the development of interactive optimization systems, allowing what-if scenario testing along with guided manual intervention.

3.2.1 Configuration Visualizer

To visualize a configuration, ADEP incorporates a special GUI that gives users a more intuitive understanding of the flow and control within the metaheuristics configuration. The utilities provided
include visualization of the structural tree of components for the search algorithm, along with information on the parameters associated with these components. The basic layout of the GUI is shown in Fig. 9. On the top left panel, the work flow of the algorithm is displayed with the various procedural components organized into functional blocks, each of which is accessible and displayed as an LVRP tree in the bottom left panel when activated. When a node is selected from the LVRP tree, its information is immediately displayed in the bottom right panel. The user can traverse the LVRP tree of the procedural components for any search heuristic method built using ADEP and fine-tune the configuration by updating the nodes. With the GUI, the configuration process illustrated in Fig. 8 can be performed without the user being aware of the underlying tree traversal process. Metaheuristics search algorithms designed and configured by users are saved in XML format and can be recalled in the ADEP IDE at a later time for further refinement and experimentation.

Figure 9: GUI for visualizing the metaheuristics configuration.

3.2.2 Simulation Visualizer

ADEP also makes it easy to simulate the various configured metaheuristics. Users can modify the structure and parameters of any of the metaheuristics as discussed in Section 3.2.1, observing and monitoring their search behavior using the metaheuristics simulation visualization tool. The visualization tool is accessible via the Run button in the ADEP IDE of Fig. 9. This button activates a facility to monitor the performance of the currently configured algorithm on a user-specified problem instance. The steps involved are summarized below:

1. Generate the C/C++ or Java code of the configured metaheuristics in the framework;
2. Compile the generated code into an executable program;
3. Run the executable program;
4. Generate and display performance statistics detailing the run-time behavior of the configured metaheuristics on the problem instance.

This process is transparent to the user, making it convenient to explore alternatives in configuring
the metaheuristics and immediately observe the effect of the change on performance. The visualization of a configured MA simulated on a Vehicle Routing Problem (VRP) instance is illustrated in Fig. 10. The interactive simulation environment allows users the convenience of evaluating and comparing the performance of different versions of an algorithm. Users can conveniently see the effect of the different components on the search process by conducting simulation runs for the configured algorithms without leaving the ADEP IDE. This is particularly useful when users are trying to determine the best variant of a method based on a particular search philosophy.

Figure 10: GUI for visualizing a metaheuristics simulation.

3.3 Automated Configuration

The manual configuration process can be tedious because the configurations of metaheuristics, as defined by parameter settings, structure, procedures, procedure coupling, compatibility of methods in hybrids, etc., can result in widely varying algorithm performance. Coupled with the fact that users may impose performance requirements and expectations on the desired algorithm in terms of efficiency, robustness, stability, etc., manually configuring an appropriate metaheuristics technique can be time-consuming. With these considerations, ADEP incorporates an intelligent learning engine (activated by the Learning button on the upper right panel of the ADEP IDE in Fig. 9) to automatically optimize the metaheuristics configuration. The learning module evolves possible metaheuristics for a given sample of problem instances and evaluates the performance of these metaheuristics at each generation, until a good metaheuristics configuration is found or one of the stopping conditions is reached. Instead of the user making selections at each of the nodes during the traversal process, the learning module automatically decides on the choices in a probabilistic manner. Multiple probabilistic traversals of the tree structures are performed, with probabilistic selection based on a set of one or more numerical trace values associated with the branches of the path which constitutes a search algorithm. A numerical index indicative of the quality of the resultant candidate search algorithm is produced. The quality index is obtained from simulated runs on a sample of problem instances. The evaluated quality of the solution(s) may optionally take into account the desired performance, such as choosing parameters of search algorithms that favor efficiency, effectiveness, robustness, or stability.
These steps are carried out many times, in the process trying out different choices based on the set of trace values. For each set of trace values, the quality indices of one or more of the derived candidate search algorithms are evaluated to produce an aggregated quality index.
The aggregated index is indicative of the overall quality of the set of trace values for generating search algorithms. For users without profound background knowledge of and experience in metaheuristics algorithm design, using the intelligent learning engine to automatically configure an effective search algorithm is an attractive alternative. Based on the automatically generated algorithm, further refinements can be undertaken to improve its performance.
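The probabilistic traversal described above resembles a pheromone-style mechanism: each branch carries a trace value, branches are sampled in proportion to their traces, and the traces of branches that produced good algorithms are reinforced. A hedged sketch of such a selector for one LVRP node; the class name, the roulette-wheel sampling, and the additive update rule are illustrative assumptions, not ADEP's documented scheme.

```java
import java.util.Arrays;
import java.util.Random;

// Trace-guided selection among the variation branches of one LVRP node.
// Each branch keeps a numerical trace value; the learning engine samples
// branches with probability proportional to their traces and reinforces
// the traces of branches that led to high-quality algorithms.
class TraceSelector {
    final double[] traces;
    final Random rng;

    TraceSelector(int branches, long seed) {
        traces = new double[branches];
        Arrays.fill(traces, 1.0); // uniform initial traces
        rng = new Random(seed);
    }

    // Roulette-wheel sampling over the trace values.
    int sample() {
        double total = 0;
        for (double t : traces) total += t;
        double r = rng.nextDouble() * total;
        for (int i = 0; i < traces.length; i++) {
            r -= traces[i];
            if (r <= 0) return i;
        }
        return traces.length - 1;
    }

    // Reinforce a branch by the (aggregated) quality index of the
    // candidate algorithms it helped produce.
    void reinforce(int branch, double qualityIndex) {
        traces[branch] += qualityIndex;
    }
}
```

The learning loop would then repeat: sample a full configuration, evaluate it on the sample of problem instances to obtain a quality index, and reinforce every branch on the sampled path, so that good configuration choices become increasingly likely.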

4 Case Studies

We present two case studies to illustrate the whole process of configuring metaheuristics search algorithms. The code was generated from ADEP in C++, and all simulations were performed on a Pentium IV 2.66 GHz machine with 512 MB of RAM.

4.1 UAV Path-Planning

Here we present the development of a Memetic Algorithm (MA) for solving the Unmanned Aerial Vehicle (UAV) Path-Planning Problem [30]. For UAV path-planning, it has been proposed that the waypoints for reconnaissance purposes can be modeled as a sparse graph G(V, E) [31], where V and E correspond to the sets of vertices and edges, respectively. The objective is to find a UAV flight path starting and ending at the depot v0 and visiting as many vertices vi ∈ V as possible, such that each vertex vi is visited once. This is equivalent to finding a Hamiltonian cycle s on G, or finding the longest path cycle containing v0 if G is non-Hamiltonian. We manually configured a search algorithm in ADEP, resulting in the configuration shown in Fig. 11, and tested it on a commonly used benchmark dataset comprising 14 sparse graphs with edge densities, Ed, ranging from 0.0398 to 0.1947. The algorithm uses crossover and mutation rates of pc = 1.0 and pm = 0.1, respectively, and applies the k-exchange local search for the site visitation sequencing problem. Of the 14 graphs, 7 are Hamiltonian while the remaining 7 are non-Hamiltonian. For details of the benchmark dataset, the reader is referred to [30].

Table 1: Simulation results for the UAV Path Finding Problem

Graph          Avg. CPU     Avg. longest    Longest path        Max. known path
               time (secs)  path (edges)    obtained (edges)    length (edges)
G(15,20)       0.08         14.0            14                  14
G(20,30)*      0.11         20.0            20                  20
G(20,37)*      0.10         20.0            20                  20
G(46,69)       1.43         45.0            46                  46
G(46,71)*      1.15         45.3            46                  46
G(46,93)*      1.53         45.7            46                  46
G(49,96)       0.98         48.0            48                  48
G(49,142)      0.96         48.0            48                  48
G(100,197)     4.29         95.5            98                  99
G(100,197)*    5.29         97.8            100                 100
G(100,203)*    4.17         96.6            99                  100
G(100,207)     6.88         97.4            98                  99
G(100,217)     9.75         97.9            99                  99
G(100,217)*    6.23         98.2            99                  100

In Table 1, the instances marked with "*" are Hamiltonian graphs. For problems of size 49 or less, the configured algorithm was able to uncover the longest known path or the Hamiltonian cycle within 1.6 seconds of CPU time. For the larger problems with 100 nodes, the resulting search algorithm showed respectable performance, uncovering solutions very close to the longest path or Hamiltonian cycle in less than 10 seconds of computation time. For all the 14 graphs
tested, the solutions with maximal length or maximal length minus 1 were found.

[Figure: flowchart of the configured search procedure. Population initialization and evaluation; until the termination condition is met, for each child in the population: parent selection; crossover with probability Pc (otherwise the child is a copy of parent 1); mutation with probability Pm (otherwise the child is unchanged); local search on the child; then increment the generation counter. On termination, the result is reported.]
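The procedure in Figure 11 is a standard generational MA loop. The sketch below mirrors its structure; the Candidate type and the operator hooks (select, crossover, mutate, local_search) are illustrative assumptions, not ADEP's actual interfaces:

```cpp
#include <algorithm>
#include <functional>
#include <random>
#include <vector>

// Hedged sketch of a generational MA loop with crossover probability pc,
// mutation probability pm, and local search applied to every child.
struct Candidate {
  std::vector<int> genes;
  double fitness = 0.0;
};

struct MAOps {
  std::function<double(const Candidate&)> evaluate;
  std::function<const Candidate&(const std::vector<Candidate>&, std::mt19937&)> select;
  std::function<Candidate(const Candidate&, const Candidate&, std::mt19937&)> crossover;
  std::function<void(Candidate&, std::mt19937&)> mutate;
  std::function<void(Candidate&)> local_search;
};

Candidate run_ma(std::vector<Candidate> pop, int generations,
                 double pc, double pm, const MAOps& ops, std::mt19937& rng) {
  std::uniform_real_distribution<double> coin(0.0, 1.0);
  for (auto& c : pop) c.fitness = ops.evaluate(c);            // initial evaluation
  for (int g = 0; g < generations; ++g) {                     // termination condition
    std::vector<Candidate> next;
    next.reserve(pop.size());
    for (int i = 0; i < static_cast<int>(pop.size()); ++i) {
      const Candidate& p1 = ops.select(pop, rng);             // parent selection
      const Candidate& p2 = ops.select(pop, rng);
      Candidate child = (coin(rng) < pc) ? ops.crossover(p1, p2, rng)
                                         : p1;                // else child = parent 1
      if (coin(rng) < pm) ops.mutate(child, rng);             // else child unchanged
      ops.local_search(child);                                // memetic refinement
      child.fitness = ops.evaluate(child);
      next.push_back(std::move(child));
    }
    pop = std::move(next);                                    // increment generation
  }
  return *std::max_element(pop.begin(), pop.end(),
                           [](const Candidate& a, const Candidate& b) {
                             return a.fitness < b.fitness;
                           });
}
```

The hooks make the skeleton reusable: the same loop serves the UAV problem with a k-exchange local search or the QAP configurations of the next subsection with a different operator set.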

Figure 11: Resulting search procedure for UAV Path Planning

4.2 Quadratic Assignment Problem

As another example, we use the Quadratic Assignment Problem (QAP) to validate the automatic configuration aspect, i.e., the intelligent learning module of ADEP. The QAP can be described as follows. A problem of size n can be represented by two n × n matrices A = [aij] and B = [bij] for connectivity and weights respectively. The objective is to find a permutation π of the set M = {1, 2, 3, ..., n} that minimizes the cost C(π), where

C(π) = ∑_{l=1}^{n} ∑_{t=1}^{n} a_{lt} b_{π(l)π(t)}

There are many real-world QAP applications, including backboard wiring, ranking of archaeological data, hospital layout planning, and others [32][33][34]. From ADEP, two memetic algorithms, algo_FF1 and algo_FF2, were configured for two different quality indices (QIs):

QI.1: f(algo_FF1) = 1 / (avg_gap + є)          (1)
QI.2: f(algo_FF2) = 1 / (avg_gap × T + є)      (2)


where avg_gap measures the solution quality produced by the MA, є denotes a very small number, and T is a measure of the computational cycles spent by the MA. QI.1 assumes that the user is primarily interested in the MA's effectiveness, whereas QI.2 places emphasis on both the effectiveness and the efficiency of the MA. Fig. 12 illustrates the two resulting algorithms after training by the learning module. The differences in the configurations of the two algorithms are highlighted in bold rectangular blocks.

[Figure: two flowcharts, one per algorithm. Algo_FF1: random population initialization (popsize = 100); evolutionary loop of random parent selection, template crossover (Pc = 1.0), scramble mutation (Pm = 0.1), robust tabu search, one child per iteration, and population update 3. Algo_FF2: random population initialization (popsize = 50); evolutionary loop of tournament parent selection, template crossover (Pc = 1.0), scramble mutation (Pm = 0.1), first-improvement 2-exchange local search, one child per iteration, and population update 3.]

Figure 12: Resulting MAs based on different quality indices

For the two MAs, we carried out simulation runs on 14 QAP instances taken from QAPLIB [35]. The results are summarized in Table 2, in which avg_gap measures the percentage deviation from the best known solution, time measures the average simulation time, and Rs measures the percentage of runs in which the best known solution is obtained. Table 2 shows that, through the training process based on QI.1, the intelligent learning module is able to produce an MA that generates very good solutions, finding the best-known solutions for 10 of the 14 instances tested. It is also observed that the solution quality of algo_FF2 shows no significant compromise compared to that of algo_FF1. With QI.2, the learner is able to derive an algorithm that achieves a good balance between the two performance metrics.
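For reference, the QAP cost C(π) and the two quality indices of Eqs. (1) and (2) can be computed directly. The dense matrix layout below is an assumption made for illustration:

```cpp
#include <vector>

// Hedged sketch of the QAP objective: C(pi) = sum over l, t of
// a[l][t] * b[pi(l)][pi(t)], with matrices stored as dense vectors
// of vectors (an assumed layout).
double qap_cost(const std::vector<std::vector<double>>& a,
                const std::vector<std::vector<double>>& b,
                const std::vector<int>& pi) {
  const int n = static_cast<int>(pi.size());
  double cost = 0.0;
  for (int l = 0; l < n; ++l)
    for (int t = 0; t < n; ++t)
      cost += a[l][t] * b[pi[l]][pi[t]];
  return cost;
}

// Quality indices (1) and (2): eps guards against division by zero when
// avg_gap is zero, i.e., when the best-known solution is always found.
double qi1(double avg_gap, double eps) { return 1.0 / (avg_gap + eps); }
double qi2(double avg_gap, double T, double eps) { return 1.0 / (avg_gap * T + eps); }
```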


Table 2: Simulation results on QAP benchmark problems

problem   problem   Algo_FF1                          Algo_FF2
name      size      avg_gap    time(secs)   Rs       avg_gap    time(secs)   Rs
tai60a    60        0.5977%    338.3        0.00%    0.683%     185.68       0.00%
sko72     72        0.00%      530.8        100%     0.017%     276.39       45%
tai80a    80        0.4156%    833          0.00%    0.769%     399.589      0.00%
sko81     81        0.008%     747          80%      0.017%     348.38       20%
sko90     90        0.0048%    1043.8       45%      0.012%     528.53       20%
sko100a   100       0.007%     1495.9       75%      0.014%     796.36       30%
sko100b   100       0.00%      1534.6       100%     0.003%     739.69       75%
sko100c   100       0.00%      1486.3       100%     0.005%     753.87       60%
sko100d   100       0.007%     1489.4       70%      0.017%     803.74       30%
sko100e   100       0.00%      1496.7       100%     0.002%     716.43       80%
sko100f   100       0.012%     1567.5       35%      0.02%      815.83       30%
wil100    100       0.002%     1486.4       35%      0.004%     719.59       20%
tai100a   100       0.5813%    1543.9       0.00%    0.0672%    849.56       0.00%
tho150    150       0.056%     2869.4       0.00%    0.078%     1639.58      0.00%
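The first-improvement 2-exchange local search used in algo_FF2 scans facility swaps and accepts the first one that lowers the cost, repeating until no improving swap remains. A minimal sketch follows; for clarity it recomputes the cost from scratch, whereas production implementations would use O(n) delta evaluation:

```cpp
#include <utility>
#include <vector>

// QAP cost, as defined earlier, repeated here so the sketch is self-contained.
double qap_cost(const std::vector<std::vector<double>>& a,
                const std::vector<std::vector<double>>& b,
                const std::vector<int>& pi) {
  const int n = static_cast<int>(pi.size());
  double cost = 0.0;
  for (int l = 0; l < n; ++l)
    for (int t = 0; t < n; ++t)
      cost += a[l][t] * b[pi[l]][pi[t]];
  return cost;
}

// Hedged sketch of first-improvement 2-exchange local search: accept the
// first cost-reducing swap of two assignments and rescan until none exists.
void two_exchange_first_improvement(const std::vector<std::vector<double>>& a,
                                    const std::vector<std::vector<double>>& b,
                                    std::vector<int>& pi) {
  const int n = static_cast<int>(pi.size());
  bool improved = true;
  while (improved) {
    improved = false;
    double current = qap_cost(a, b, pi);
    for (int i = 0; i < n && !improved; ++i) {
      for (int j = i + 1; j < n && !improved; ++j) {
        std::swap(pi[i], pi[j]);
        if (qap_cost(a, b, pi) < current) {
          improved = true;              // keep the first improving swap
        } else {
          std::swap(pi[i], pi[j]);      // undo and keep scanning
        }
      }
    }
  }
}
```

In algo_FF1 this step is replaced by a robust tabu search, which explores the same swap neighborhood but uses a tabu list to escape local optima at additional computational cost.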

5 Conclusions

ADEP integrates a number of technologies, offering a complete set of tools for developing metaheuristics that serve as the optimization engines of applications. We examined the features and aspects of ADEP that make it suitable as a problem-solving environment. In conclusion, the main points highlighted in this article can be summarized as follows:
• A framework for synthesizing metaheuristics through modularized procedural components extracted from various metaheuristic search algorithms.
• An LVRP tree data structure integrating the various algorithmic modules in the metaheuristics framework for ease of configuring metaheuristic structure, parameters, and heuristic search components.
• Visualization components for configuring and simulating metaheuristics, giving users the flexibility to visually configure metaheuristics based on the LVRP tree and carry out comparative evaluations of alternative search algorithms.
• An automated metaheuristics configuration technique, enabling ADEP to automatically uncover effective metaheuristics that take into account the desired user-specified performance requirements.

References

[1] X.S. Chen, Y.S. Ong, M.H. Lim and K.C. Tan, A multi-facet survey of memetic computing, IEEE Transactions on Evolutionary Computation, vol. 15, no. 5, pp. 591–607, Oct 2011.
[2] M.H. Lim, N. Krasnogor, Y.S. Ong and S. Gustafson, Editorial, Memetic Computing, vol. 4, no. 1, March 2012.
[3] M.H. Lim, S. Gustafson, N. Krasnogor and Y.S. Ong, Editorial, Memetic Computing, vol. 1, no. 3, pp. 173–174, 2009.
[4] C. Voudouris, R. Dorne, D. Lesaint and A. Liret, iOpt: A software toolkit for heuristic search methods, in CP'01, Paphos, pp. 716–729, Springer Verlag, 2001.
[5] L. Di Gaspero and A. Schaerf, EasyLocal++: An object-oriented framework for the flexible design of local search algorithms, Software: Practice and Experience, vol. 33, no. 8, pp. 733–765, 2003.
[6] A. Fink and S. Voß, HotFrame: A heuristic optimization framework, in Optimization Software Class Libraries, OR/CS Interfaces Series, pp. 81–154, Kluwer Academic Publishers, Boston, 2002.
[7] M. Jones, G. McKeown and V. Rayward-Smith, Templar: An object-oriented framework for distributed combinatorial optimization, in Proceedings of the UNICOM Seminar on Modern Heuristics for Decision Support, UNICOM Ltd, Brunel University, UK, 1998.
[8] A. Schaerf, M. Lenzerini and M. Cadoli, Local++: A C++ framework for local search algorithms, in Proceedings of Technology of Object-Oriented Languages and Systems, pp. 152–161, Jul 1999.
[9] A.A. Andreatta, S.E.R. Carvalho and C.C. Ribeiro, An object-oriented framework for local search heuristics, in Proceedings of Technology of Object-Oriented Languages and Systems, 26th, 1998.
[10] P. Shaw, B.D. Backer and V. Furnon, Improved local search for CP toolkits, Annals of Operations Research, vol. 115, no. 1, pp. 31–50, 2002.
[11] P. Collet, E. Lutton, M. Schoenauer and J. Louchet, Take it EASEA, in Lecture Notes in Computer Science, vol. 1917, Proceedings of the 6th International Conference on Parallel Problem Solving from Nature, Sept 2000.
[12] C. Veenhuis, K. Franke and M. Koppen, A semantic model for evolutionary computation, in Proceedings of the 6th International Conference on Soft Computing, 2000.
[13] M. Wall, GAlib: A C++ library of genetic algorithm components, Mechanical Engineering Department, Massachusetts Institute of Technology, 1996.
[14] M. Keijzer, J.J. Merelo, G. Romero and M. Schoenauer, Evolving Objects: A general purpose evolutionary computation library, in EA-01, Evolution Artificielle, 5th International Conference in Evolutionary Algorithms, pp. 231–244, 2001.
[15] HEEDS (Hierarchical Evolutionary Engineering Design System), Getting Started Manual, 2002.
[16] O. Lassila, Ozone temporal constraint propagator, Technical Report CMU-RI-TR-96-12, Robotics Institute, Pittsburgh, PA, March 1996.
[17] M. Birattari, T. Stützle, L. Paquete and K. Varrentrapp, A racing algorithm for configuring metaheuristics, in Proceedings of the Genetic and Evolutionary Computation Conference, pp. 11–18, Morgan Kaufmann, 2002.
[18] K.O. Stanley, B.D. Bryant and R. Miikkulainen, Real-time neuroevolution in the NERO video game, IEEE Transactions on Evolutionary Computation, vol. 9, no. 6, pp. 653–668, 2005.
[19] J.R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection, Complex Adaptive Systems, The MIT Press, December 1992.
[20] E. Burke, G. Kendall, J. Newall, E. Hart, P. Ross and S. Schulenburg, Hyper-heuristics: An emerging direction in modern search technology, 2003.
[21] Y.S. Ong and A.J. Keane, Meta-Lamarckian learning in memetic algorithms, IEEE Transactions on Evolutionary Computation, vol. 8, no. 2, pp. 99–110, April 2004.
[22] Y.S. Ong, M.H. Lim, N. Zhu and K.W. Wong, Classification of adaptive memetic algorithms: A comparative study, IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 36, no. 1, pp. 141–152, Feb 2006.
[23] Q.H. Nguyen, Y.S. Ong and M.H. Lim, Non-genetic transmission of memes by diffusion, in GECCO '08: Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation, pp. 1017–1024, New York, NY, USA, 2008, ACM.
[24] R. Dawkins, The Selfish Gene, Clarendon Press, Oxford, 1989.
[25] N. Krasnogor and J. Smith, Emergence of profitable search strategies based on a simple inheritance mechanism, in Proceedings of the Genetic and Evolutionary Computation Conference, pp. 432–439, San Francisco, USA, 2001.
[26] J. Smith, Co-evolving memetic algorithms: Initial investigations, in Parallel Problem Solving from Nature – PPSN VII, pp. 537–548, 2002.
[27] N. Krasnogor, Co-evolution of genes and memes in memetic algorithms, in Proceedings of the 1999 Genetic and Evolutionary Computation Conference Workshop Program, p. 371, 1999.
[28] N. Krasnogor and S. Gustafson, Toward truly "memetic" memetic algorithms: Discussion and proof of concepts, in Advances in Nature-Inspired Computation: The PPSN VII Workshops, pp. 21–22, 2002.
[29] N. Krasnogor and S. Gustafson, A study on the use of "self-generation" in memetic algorithms, Natural Computing, vol. 3, no. 1, pp. 53–76, 2004.
[30] K.K. Lim, Y.S. Ong, M.H. Lim, X.S. Chen and A. Agarwal, Hybrid ant colony algorithm for path planning in sparse graphs, Soft Computing Journal, pp. 981–994, Nov 2007.
[31] A. Agarwal, Y.L. Xu and M.H. Lim, Graph compression via compaction of degree-2 nodes for UAV visitation sequencing, Unmanned Vehicle System Technologies, 2003.
[32] J. Tang, M.H. Lim and Y.S. Ong, Diversity-adaptive parallel memetic algorithm for solving large scale combinatorial optimization problems, Soft Computing Journal, vol. 11, no. 9, pp. 873–888, July 2007.
[33] J. Tang, M.H. Lim and Y.S. Ong, Parallel memetic algorithm with selective local search for large scale quadratic assignment problems, International Journal of Innovative Computing, Information and Control, vol. 2, no. 6, pp. 1399–1416, Dec 2006.
[34] J. Tang, M.H. Lim and Y.S. Ong, Adaptation for parallel memetic algorithm based on population entropy, in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation (GECCO), Seattle, Washington, USA, pp. 575–582, 2006. (Best Paper Award nomination)
[35] R.E. Burkard, S.E. Karisch and F. Rendl, QAPLIB – A quadratic assignment problem library, Journal of Global Optimization, vol. 10, no. 4, pp. 391–403, 1997.

