An Adaptive Brain Storm Optimization Algorithm for Multiobjective Optimization Problems

Xiaoping Guo1, Yali Wu1(B), Lixia Xie1, Shi Cheng2, and Jing Xin1

1 Xi'an University of Technology, Xi'an, Shaanxi, China
[email protected]
2 Division of Computer Science, University of Nottingham Ningbo, Ningbo, China
[email protected]

Abstract. Brain Storm Optimization (BSO) is a new swarm intelligence method that arises from the human problem-solving process. It has been well validated and applied to single objective problems. To extend the range of applications of the BSO algorithm, a modified Self-adaptive Multiobjective Brain Storm Optimization (SMOBSO) algorithm is proposed in this paper. Instead of the k-means clustering of the traditional algorithm, it adopts a simple clustering operation to increase the search speed. At the same time, an open probability is introduced to keep the algorithm from being trapped in local optima, and an adaptive mutation method is used to improve the distribution of solutions. The proposed algorithm is tested on five benchmark functions, and the simulation results show that the modified algorithm successfully improves both diversity and convergence. It can be concluded that the SMOBSO algorithm is an effective BSO variant for multiobjective optimization problems.

Keywords: Brain storm optimization · Multiobjective optimization · Clustering operation · Mutation method

1 Introduction

Optimization techniques have been significant and successful tools for solving various problems. Multiobjective optimization problems (MOPs) have gained much attention in recent years. Unlike single objective problems, the objectives of a MOP usually conflict with each other [7]. Because of this characteristic, the optimum of a MOP is usually not unique but consists of a set of candidate solutions, among which no single solution is better than the others with regard to all objectives.

This work is partially supported by National Natural Science Foundation of China under Grant Numbers 61203345 and 61273367, and by the Ningbo Science & Technology Bureau (Science and Technology Project Number 2012B10055).

© Springer International Publishing Switzerland 2015. Y. Tan et al. (Eds.): ICSI-CCI 2015, Part I, LNCS 9140, pp. 365–372, 2015. DOI: 10.1007/978-3-319-20466-6_39


Among multiobjective optimization algorithms, the Non-dominated Sorting Genetic Algorithm (NSGA) is the most widely used. It introduced the idea of non-dominated sorting into the genetic algorithm, transforming the computation of the objective functions into multiple virtual fitness values. NSGA-II is the improved NSGA-based non-dominated sorting genetic algorithm [3]. It uses fast non-dominated sorting, elitism, and a parameter-free niching operator, which overcome the shortcomings of the traditional NSGA: high computational complexity, the lack of an elitism strategy, and the need to specify a sharing radius. Speed-constrained multiobjective particle swarm optimization (SMPSO) uses the crowding distance of NSGA-II to maintain its external archive [4]. SMPSO also introduces binomial mutation in the population space, which handles multimodal problems well. To select the global best, two solutions are randomly drawn from the archive and the one with the larger crowding distance is chosen [4].

The Brain Storm Optimization (BSO) algorithm, inspired by the human idea generation process, was originally proposed for solving single objective optimization problems [5]. As a new swarm intelligence algorithm, BSO has received much attention since it was proposed in 2011. Current BSO research can be categorized into three classes: 1) analysis of the BSO algorithm, such as solution clustering analysis [1] and population diversity maintenance [2]; 2) new variants of the BSO algorithm, e.g., BSO for multiobjective optimization problems [6,7]; 3) applications of the BSO algorithm. A BSO algorithm using k-means clustering and Gaussian/Cauchy mutation has already been applied to multiobjective problems [6,7]. In this paper, a self-adaptive multiobjective BSO (SMOBSO) algorithm with a new clustering strategy and mutation operator is proposed to solve multiobjective optimization problems.
Instead of the k-means method, the algorithm adopts a simple grouping method in the clustering operator to reduce the computational burden. Moreover, a parameter named the open probability is introduced into the mutation operator; it changes dynamically as the iterations increase. The proposed algorithm is tested on the ZDT benchmark functions with different dimensions [3]. The simulation results show that SMOBSO is a promising algorithm for solving multiobjective optimization problems.

The remainder of the paper is organized as follows. The proposed algorithm is described in Section 2. The parameter settings and results are given in Section 3. Finally, the conclusions and directions for further research are given in Section 4.

2 Self-adaptive Brain Storm Optimization for Multiobjective Optimization Problems

Compared with the original multiobjective BSO algorithm [7], this paper makes improvements to the clustering, mutation, and global archive operations. The three modifications are: a simple clustering operator that replaces the original k-means clustering, an open probability in the mutation operation, and a new updating strategy for the archive set. Together, these three parts improve the performance of the proposed algorithm. The SMOBSO algorithm is described in Algorithm 1.

Algorithm 1. Procedure of the SMOBSO algorithm
1. Initialize the population with size Np, the maximum iteration number, the probability parameters p1, p2, p3, p4, the archive size max_A, and the number of clusters and cluster centers;
2. Initialize the individuals of the population and calculate their fitness values. Set an empty archive Rep;
3. while the maximum iteration has not been reached do
4.   Find the non-dominated solutions, store them in Rep, and compute the crowding distance of each individual in Rep;
5.   Cluster the individuals using the clustering strategy of Section 2.1;
6.   Divide the clusters into elite and general clusters according to whether they contain non-dominated individuals;
7.   Select the individuals that will be mutated according to Section 2.2;
8.   Update them to generate new individuals;
9.   Put the non-dominated individuals into the archive one by one and update the archive as in Section 2.3;
10. Output the archive;

2.1 Grouping Operator

In the original BSO, k-means clustering was used as the grouping operator. However, there is no strict requirement on the grouping operator in the BSO algorithm; the individuals only need to be divided into different classes. Although the k-means clustering method is accurate, it is computationally expensive. In this paper, a new simple clustering method is proposed that randomly selects M (M is the number of clusters) different individuals across the search area as centers, making the algorithm simpler. Each individual is assigned to its nearest class after calculating the Euclidean distance between the individual and the centers of all classes. The detailed steps of the clustering strategy are given in Algorithm 2.

Algorithm 2. The clustering strategy in the grouping operator
1. Randomly select M different ideas from the current generation as the centers of the M groups, denoted Sj, 1 ≤ j ≤ M;
2. while the clustering of all individuals has not been completed do
3.   Calculate the Euclidean distance between each individual xi (1 ≤ i ≤ Np) in the current generation and every center;
4.   For each individual, compare the M distance values and assign the individual to the cluster with the smallest distance. During this process, the cluster centers do not change;
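As an illustration, the grouping pass of Algorithm 2 can be sketched in a few lines of NumPy. This is a hypothetical sketch, not the authors' MATLAB implementation; the function name and array layout are our own choices.

```python
import numpy as np

def simple_cluster(population, m, rng):
    """Group individuals by nearest of M randomly chosen centers.

    population: (Np, d) array of ideas; m: number of clusters.
    Returns a list of index arrays, one per cluster. The centers
    are fixed for the whole grouping pass (never recomputed).
    """
    n = population.shape[0]
    # Randomly pick M distinct individuals as cluster centers S_j.
    center_idx = rng.choice(n, size=m, replace=False)
    centers = population[center_idx]
    # Euclidean distance from every individual to every center.
    dist = np.linalg.norm(population[:, None, :] - centers[None, :, :], axis=2)
    labels = dist.argmin(axis=1)  # nearest center wins
    return [np.flatnonzero(labels == j) for j in range(m)]
```

Because no iterative center update is performed, the cost is a single distance computation per generation, which is where the claimed saving over k-means comes from.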

2.2 Mutation Operator

Individuals are mutated after xselect has been selected (xselect is chosen according to [6]). Gaussian mutation was used in the original BSO algorithm. However, it has two drawbacks: 1) since the BSO search is a random process, there is no feedback through the complex S-shaped transfer function, so good search information cannot be obtained; this defect becomes more obvious when dealing with different optimization problems; 2) the logsig() function and the random value rand() both lie in the range (0, 1), and ξ is multiplied by a Gaussian random value; such random noise may have little effect on the global search when the search range is large.

The open probability is introduced to keep the algorithm from being trapped in local optima, and an adaptive mutation method is used to improve the distribution of solutions. To ensure convergence, the open probability pr should be small at the beginning of the search; as the iterations increase, the effect of a small pr diminishes. To generate new ideas and avoid falling into local optima, pr should therefore increase with the iterations. Based on this principle, the mutation equation is as follows:

    x_new,d = L_d + (H_d − L_d) × rand(),              if rand() < pr
    x_new,d = x_select,d + (x_1,d − x_2,d) × rand(),   otherwise        (1)

where Ld and Hd are the lower and upper bounds of the dth dimension, and x1,d and x2,d are the dth dimension values of two different selected individuals in the population.
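Equation (1) can be sketched as follows. This is an illustrative NumPy version, not the paper's code; the vectorized per-dimension form and the final bound clipping are our assumptions.

```python
import numpy as np

def adaptive_mutate(x_select, x1, x2, low, high, p_r, rng):
    """Generate a new idea per Eq. (1), dimension by dimension.

    With probability p_r a dimension is reinitialised uniformly in
    [low_d, high_d]; otherwise it moves along the difference of two
    other selected individuals x1 and x2. p_r is assumed to grow
    with the iteration count (e.g. p_r = iteration / max_iteration).
    """
    reinit = rng.random(x_select.shape) < p_r
    uniform = low + (high - low) * rng.random(x_select.shape)
    diff = x_select + (x1 - x2) * rng.random(x_select.shape)
    x_new = np.where(reinit, uniform, diff)
    # Keep the result inside the search bounds (our addition).
    return np.clip(x_new, low, high)
```

The uniform-reinitialisation branch is what injects new ideas late in the run, when p_r is large, while the difference branch exploits the current population early on.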

2.3 Global Archive

Circular crowding distance is used to maintain the global archive. The crowding distance estimates the density of a solution relative to the solutions around it. First, the crowding distance of every individual is set to 0, and the objective values of the individuals in the archive are calculated. Each objective is then sorted in ascending order. The crowding distances of the first and last individuals are set to infinity, and the remaining crowding distances are calculated by the formula:

    distance(i) = distance(i) + (f_m(i+1) − f_m(i−1)) / (f_m^max − f_m^min)    (2)

where f_m(i) is the function value of the ith individual on the mth objective, and f_m^max and f_m^min denote, respectively, the maximum and minimum of the mth objective over all individuals. This accumulates the average side length of the cuboid formed by the neighboring solutions i+1 and i−1.

The archive set is updated as follows: 1) The non-dominated individuals in the population are inserted into the archive one by one. If an individual is dominated by individuals already in the external archive, it is discarded; otherwise it joins the archive. 2) If the number of individuals in the archive is below the maximum, nothing is deleted; otherwise the crowding distance of every individual in the current archive is calculated and the individual with the smallest crowding distance is deleted, keeping the number of individuals at or below the maximum capacity. This differs from NSGA-II, which sorts the crowding distances of all newly generated non-dominated individuals together with those already in the archive and then selects the individuals with the largest crowding distances for the next generation. The proposed method helps the individuals distribute more evenly along the Pareto front.
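The crowding-distance computation of Eq. (2), including the infinite boundary values, can be sketched as below. This is an illustrative NumPy version, not the authors' code; skipping an objective whose range is zero is our assumption.

```python
import numpy as np

def crowding_distance(objs):
    """Crowding distance per Eq. (2) for an archive of solutions.

    objs: (n, m) array of objective values. Boundary solutions on
    each objective get infinite distance; interior ones accumulate
    the normalised gap between their two neighbours.
    """
    n, m = objs.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(objs[:, k])  # ascending sort on objective k
        f = objs[order, k]
        span = f[-1] - f[0]             # f_k^max - f_k^min
        if span == 0:
            continue                    # degenerate objective: skip (assumption)
        dist[order[0]] = dist[order[-1]] = np.inf
        # Eq. (2): d(i) += (f_k(i+1) - f_k(i-1)) / (f_k^max - f_k^min)
        dist[order[1:-1]] += (f[2:] - f[:-2]) / span
    return dist
```

Pruning the archive then amounts to repeatedly deleting `dist.argmin()` until the archive fits its maximum capacity, which matches step 2) above.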

3 Results and Discussions

3.1 Parameter Settings

During the tests, the population size is set to 200 and the maximum size of the Pareto set is fixed at 100. The pre-determined probability values p1, p2, and p3 are set to 0.8, 0.8, and 0.2, respectively. All algorithms are implemented in MATLAB using a real-number representation for the decision variables. For each experiment, 30 independent runs were conducted to collect statistical results. The number of dimensions is set to 5, 10, 20, and 30 for each test problem.

3.2 Results

The ZDT benchmark functions [3] are used to evaluate the performance of the SMOBSO algorithm. Among them, ZDT1 and ZDT3 have convex Pareto fronts, and the front of ZDT3 is discontinuous. The boxplots of the high-dimensional space ratio HR, obtained over 30 runs, are shown in Figures 1 and 2. Figure 1 compares different BSO variants: SMOBSO, MMBSO, MBSO-G, and MBSO-C. Figure 2 compares different swarm intelligence algorithms: SMOBSO, NSGA-II [3], and SMPSO [4].

The results in Figure 1 show that, on all test functions, SMOBSO has the most concentrated distribution, with clearly fewer outliers than the other algorithms. By the definition of HR, the closer the HR value is to 1, the better the algorithm performs and the closer and more uniformly distributed its solutions are to the true front. The HR values of SMOBSO are almost all around 1, so SMOBSO is the best among the compared BSO variants.

To compare the SMOBSO algorithm with other multiobjective optimization algorithms, SMPSO and NSGA-II are used as baselines in this paper. The HR comparisons among SMOBSO, NSGA-II, and SMPSO on the ZDT benchmark problems are shown in Figure 2. From the figure, SMOBSO is slightly worse than the other two algorithms on ZDT1, ZDT2, and ZDT3. On ZDT4 and ZDT6, as the number of dimensions increases, the outliers of NSGA-II grow significantly, while the results of SMOBSO and SMPSO remain similar. In conclusion, the proposed algorithm shows strong potential for multiobjective optimization.

[Figure 1 boxplots: panels (a)–(t) show HR for ZDT1, ZDT2, ZDT3, ZDT4, and ZDT6 at 5, 10, 20, and 30 dimensions.]

Fig. 1. Comparison of the high-dimensional space ratio HR among SMOBSO, MMBSO, MBSO-G, and MBSO-C on the ZDT benchmark problems. The boxplots are derived from 30 independent runs.

[Figure 2 boxplots: panels (a)–(t) show HR for ZDT1, ZDT2, ZDT3, ZDT4, and ZDT6 at 5, 10, 20, and 30 dimensions.]

Fig. 2. Comparison of the high-dimensional space ratio HR among SMOBSO, NSGA-II, and SMPSO on the ZDT benchmark problems. The boxplots are derived from 30 independent runs.

4 Conclusions

In this paper, we proposed a self-adaptive brain storm optimization algorithm for multiobjective problems. The algorithm adopts a simple clustering operation to increase the search speed. An open probability is introduced to keep the algorithm from being trapped in local optima, and an adaptive mutation method is used to improve the distribution of solutions. Five benchmark functions were simulated to validate the performance of the proposed algorithm. Compared with similar BSO algorithms, the results show that the proposed algorithm improves substantially on the algorithm of [7]. Moreover, compared with the other algorithms, SMOBSO shows better robustness on the ZDT benchmark functions, though on some problems its performance is slightly worse than that of the other two algorithms. In general, the SMOBSO algorithm is an effective modified BSO algorithm for multiobjective optimization; it approaches the true fronts of the test functions with a uniform distribution. As a new swarm intelligence algorithm, there is plenty of room to improve its generality and applicability.

References

1. Cheng, S., Shi, Y., Qin, Q., Gao, S.: Solution clustering analysis in brain storm optimization algorithm. In: Proceedings of the 2013 IEEE Symposium on Swarm Intelligence (SIS 2013), pp. 111–118. IEEE, Singapore (2013)
2. Cheng, S., Shi, Y., Qin, Q., Zhang, Q., Bai, R.: Population diversity maintenance in brain storm optimization algorithm. Journal of Artificial Intelligence and Soft Computing Research (JAISCR) 4(2), 83–97 (2014)
3. Deb, K., Agrawal, S., Pratap, A., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6(2), 182–197 (2002)
4. Nebro, A.J., Durillo, J.J., García-Nieto, J., Coello Coello, C.A., Luna, F., Alba, E.: SMPSO: a new PSO-based metaheuristic for multi-objective optimization. In: IEEE Symposium on Computational Intelligence in Multi-Criteria Decision-Making (MCDM 2009), pp. 66–73 (2009)
5. Shi, Y.: Brain storm optimization algorithm. In: Tan, Y., Shi, Y., Chai, Y., Wang, G. (eds.) ICSI 2011, Part I. LNCS, vol. 6728, pp. 303–309. Springer, Heidelberg (2011)
6. Shi, Y., Xue, J., Wu, Y.: Multi-objective optimization based on brain storm optimization algorithm. International Journal of Swarm Intelligence Research (IJSIR) 43(3), 1–21 (2013)
7. Xue, J., Wu, Y., Shi, Y., Cheng, S.: Brain storm optimization algorithm for multiobjective optimization problems. In: Tan, Y., Shi, Y., Ji, Z. (eds.) ICSI 2012, Part I. LNCS, vol. 7331, pp. 513–519. Springer, Heidelberg (2012)