Simplified Swarm Optimization with Differential Evolution Mutation Strategy for Parameter Search

Po-Chun Chang
Advanced Analytics Institute, University of Technology Sydney
Mobile: +61 02 0431521689
[email protected]

Wei-Chang Yeh
Integration & Collaboration Laboratory, Advanced Analytics Institute, School of Software, University of Technology Sydney
Department of Industrial Engineering and Engineering Management, National Tsing Hua University, Hsinchu 300, Taiwan
[email protected]

ABSTRACT
Solving dynamic optimization problems is a challenging field in practical applications. In recent decades, optimization approaches have had to deal not only with unimodal functions but also with multimodal ones. Moreover, the performance of optimization algorithms is affected by problem dimensionality: some algorithms show excellent search ability on low-dimensional problems but become inadequate in high-dimensional spaces, and the opposite may also be true. This paper proposes a robust global optimization algorithm, SSODE: SSO (Simplified Swarm Optimization) with a DE (Differential Evolution) mutation strategy. SSO was initially proposed to overcome the shortcoming of PSO (Particle Swarm Optimization) in discrete data spaces, while DE is a meta-heuristic evolutionary algorithm used for optimizing multi-dimensional real-valued functions. We performed two experiments with the SSODE algorithm and compared it with the original DE and SSO: one searched for the parameter values of an SVM (Support Vector Machine) with an RBF (Radial Basis Function) kernel, and the other used five common benchmark functions.
Categories and Subject Descriptors Intelligent Information Processing
General Terms Algorithm, Performance, Theory
Keywords Simplified Swarm Optimization, Differential Evolution, global optimization, parameter search
1. INTRODUCTION
Machine learning is an important research area in computer science and has been applied in many fields. In data mining, for example, many learning algorithms and techniques have been proposed, such as naive Bayes, the C4.5 decision tree, and SVM (Support Vector Machine). However, the performance of an algorithm depends not only on the input data but also on its control parameters. Therefore, finding suitable parameters is important, and this process is known as global optimization. Parameter search usually requires a time-consuming trial-and-error process, and the overall performance of the system improves when an efficient algorithm is used for the search. In this paper, a new global optimization algorithm, SSO (Simplified Swarm Optimization) with a DE (Differential Evolution) mutation strategy, named SSODE, is proposed.

In recent decades, SI (Swarm Intelligence) algorithms have become a powerful approach to solving optimization problems. Each particle in the swarm is considered one candidate solution, and a set of particles is defined as a population. Particles follow predefined rules and formulas to generate new particles in every iteration. In addition, researchers have drawn on biologically inspired concepts, e.g., natural processes and creatures, and hybridized them with swarm intelligence. One typical algorithm is SSO, proposed by Yeh [1]. It was proposed for solving discrete optimization problems, which are a shortcoming of the PSO (Particle Swarm Optimization) algorithm [2]: PSO is an efficient optimization method for problems within continuous spaces, but it easily suffers from local minima in discrete spaces. SSO was originally named discrete PSO, and it solves the optimization problem by applying a particle-swarm update strategy. The concept is inspired by the crossover technique of EAs (Evolutionary Algorithms): for each particle, a new candidate is synthesized from either the best solution found over all particles, the best particle of the current generation, or random variables.

DE was proposed by Storn and Price [3]. It is a type of swarm intelligence hybridized with Evolutionary Algorithms, inspired by the natural evolution of species. Although DE has been successfully employed to solve global optimization problems in many fields, one challenge is choosing an appropriate strategy with associated parameter values. DE performance is highly dependent on its control parameters, such as the crossover rate, the choice of mutation strategy, and the scaling factor. There are at least seven frequently used mutation strategies in the DE algorithm [4]. For a specific optimization problem, trial-and-error search for the appropriate strategy is not only time consuming but also increases the computational cost.

The proposed algorithm SSODE is inspired by the crossover equation from SSO and the mutation strategies from DE. The aim of SSODE is to remove the difficulty of choosing the mutation strategy and scaling factor in DE: it dynamically chooses a suitable mutation strategy and scaling factor according to the quality of the current individual candidate solutions.
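As a rough illustration of the SSO-style update described above, the sketch below builds a new candidate by drawing each dimension from the best solution over all particles, the particle's generation-best solution, or a fresh random value. This is a minimal sketch under assumptions: the function name, the probability thresholds cg and cp, and the bound handling are illustrative and are not the exact formulation used in this paper.

```python
import numpy as np

def sso_style_update(particle, gbest, genbest, lower, upper, cg=0.4, cp=0.8, rng=None):
    """Illustrative SSO-style crossover update (assumed thresholds cg < cp).

    Each dimension of the new candidate comes from the overall best solution
    (gbest), the generation-best solution (genbest), or a new random value.
    """
    rng = rng or np.random.default_rng()
    new = particle.copy()
    for d in range(len(particle)):
        r = rng.random()
        if r < cg:
            new[d] = gbest[d]          # copy from the best solution over all particles
        elif r < cp:
            new[d] = genbest[d]        # copy from the best particle of this generation
        else:
            new[d] = rng.uniform(lower[d], upper[d])  # re-sample this dimension at random
    return new
```

For the parameter-search application, the fitness of a candidate (for instance, a (C, γ) pair for an RBF-kernel SVM) could plausibly be measured as cross-validation accuracy; this is only an illustration of how a candidate solution maps to a fitness value, not the exact objective used in the experiments.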
This paper is organized as follows. Section 2 reviews the SSO and DE algorithms. Section 3 presents the proposed algorithm SSODE. Section 4 presents the experimental results, and Section 5 concludes the paper.
2. ALGORITHMS REVIEW
Based on the concept of SI (Swarm Intelligence), the initial population (a set of candidate solutions) is randomly generated and then iteratively improved in every generation. The new population is generated by an evolutionary algorithm that involves mutation, crossover, and selection operations to reproduce the population for the next generation; this reproduction follows predefined rules or algorithms [5]. SSO and DE are two different algorithms that evolve the candidate solutions from the initial population toward the global optimum. Both SSO and DE initialize a population of NP random solutions

P^g = { p_1^g, ..., p_{NP}^g },  p_i^g ∈ R^D,  g = 0, 1, ...,

where each candidate solution p_i^g is a D-dimensional vector and g is the generation index, i.e., the number of times the algorithm has adjusted the candidate solutions (g = 0 denotes the initial random solutions). Moreover, SSO and DE use the same greedy selection rule for preserving the current best solutions: a newly generated candidate replaces p_i^g only if it achieves an equal or better objective value.

The DE mutation strategies referred to in this paper are listed below, where F and K are scaling factors, p_best^g is the best solution of generation g, and r1, ..., r5 are mutually different random indices (an illustrative implementation is sketched after the list):

DE/rand/1 [6]:          v_i^g = p_{r1,i}^g + F · (p_{r2,i}^g − p_{r3,i}^g)
DE/rand-to-best/1 [6]:  v_i^g = p_i^g + K · (p_best^g − p_i^g) + F · (p_{r1,i}^g − p_{r2,i}^g)
DE/best/2 [6]:          v_i^g = p_best^g + F · (p_{r1,i}^g − p_{r2,i}^g) + F · (p_{r3,i}^g − p_{r4,i}^g)
DE/rand/2 [7]:          v_i^g = p_{r1,i}^g + F · (p_{r2,i}^g − p_{r3,i}^g) + F · (p_{r4,i}^g − p_{r5,i}^g)
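To make these strategies concrete, the following sketch implements the four mutation formulas over a NumPy population matrix, together with a greedy selection step in the spirit of the rule above. The function names, the default values of F and K, and the omission of crossover and bound handling are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np

def mutate(pop, best, i, strategy, F=0.5, K=0.5, rng=None):
    """Generate a DE mutant vector v_i for target index i with the named strategy.

    pop  : (NP, D) array of current candidate solutions p^g (NP >= 6 assumed)
    best : (D,) array, best solution of the current generation
    """
    rng = rng or np.random.default_rng()
    # draw five mutually different random indices, all different from i
    choices = [j for j in range(len(pop)) if j != i]
    r1, r2, r3, r4, r5 = rng.choice(choices, size=5, replace=False)

    if strategy == "DE/rand/1":
        return pop[r1] + F * (pop[r2] - pop[r3])
    if strategy == "DE/rand-to-best/1":
        return pop[i] + K * (best - pop[i]) + F * (pop[r1] - pop[r2])
    if strategy == "DE/best/2":
        return best + F * (pop[r1] - pop[r2]) + F * (pop[r3] - pop[r4])
    if strategy == "DE/rand/2":
        return pop[r1] + F * (pop[r2] - pop[r3]) + F * (pop[r4] - pop[r5])
    raise ValueError(f"unknown strategy: {strategy}")

def greedy_select(current, trial, f):
    """Greedy selection: keep the trial only if it is at least as good (minimization)."""
    return trial if f(trial) <= f(current) else current
```

In a full DE iteration, the mutant vector is usually recombined with its parent through a crossover operation (governed by the crossover rate mentioned in Section 1) before selection; that step is omitted here for brevity.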