Derivative Based Hybrid Genetic Algorithm: Preliminary Experimental Results

Muhammad Asim*, Wali Khan Mashwani*, Muhammad Asif Jan* and Javed Iqbal**

*Department of Mathematics, Kohat University of Science & Technology, KPK, Pakistan.

**Department of Business Administration, Bahauddin Zakariya University, Multan, Pakistan.

Email: [email protected]

Abstract: Global optimization has become an important branch of mathematical and numerical analysis in recent years. Practical examples of optimization problems include the design and optimization of electrical circuits in electrical engineering, object packing problems, minimization of the Gibbs free energy in chemical engineering, and protein structure prediction. The genetic algorithm (GA) is one of the most popular population-based, stochastic techniques in the field of evolutionary computation (EC). GA mimics the process of natural evolution and, unlike traditional optimization methods, provides the maximum or minimum objective function value in a single simulation run. This paradigm is very good at efficiently locating the region in which the global optimum of a test problem lies. However, it sometimes has difficulties, and spends much time, finding the exact local optimum in the search space of the given test suites and of complicated real-world optimization problems. In such situations, local search (LS) techniques are effective tools for handling these issues when incorporated into the framework of evolutionary algorithms to further improve their global search process. In this paper, we incorporate the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method as a local search optimizer in the GA framework, with the aim of alleviating the optimality and convergence issues of the original GA. The performance of the suggested hybrid GA (HGA) has been examined on nine test problems selected from widely used benchmark functions. Implemented in a MATLAB 2013 environment, the suggested HGA has shown promising results on most of the test problems compared to the simple GA.

Key words: Global Optimization, Evolutionary Computation, Evolutionary Algorithms, Genetic Algorithm, Hybridization, BFGS.

1. Introduction

Optimization problems are of great importance in both mathematical analysis and evolutionary computation (EC), and have wide applications in various disciplines of science and engineering. In general, optimization can be divided into two categories depending on whether the variables are continuous or discrete. Problems with discrete variables are called combinatorial optimization problems, while continuous optimization problems have floating-point variables. Combinatorial optimization problems have wide application in areas such as airline scheduling, production planning, location and distribution management, internet routing, and many others, whereas continuous optimization problems arise in almost all engineering disciplines. A general optimization problem can be expressed, without loss of generality, as a minimization problem as follows:

\[
\begin{aligned}
\text{minimize} \quad & f(x) = \big(f_1(x), f_2(x), \ldots, f_m(x)\big) \\
\text{subject to} \quad & g_j(x) \le 0, \quad j = 1, 2, \ldots, p, \\
& h_i(x) = 0, \quad i = 1, 2, \ldots, q,
\end{aligned}
\tag{1}
\]

where $x = [x_1, x_2, \ldots, x_n]^T \in \mathbb{R}^n$ is an $n$-dimensional vector of optimization (decision) variables, $p$ is the number of inequality constraints, and $q$ is the number of equality constraints. Moreover, $L_i \le x_i \le U_i$, $i = 1, 2, \ldots, n$, where $L_i$ and $U_i$ are the lower and upper bounds of the parametric space $S$, and the function $f(x)$ is called the objective (fitness) function. A solution that optimizes this objective function, at least approximately, is called an optimal solution of problem (1). If $m = 1$, problem (1) is called a single-objective optimization problem (SOP), in which we focus on the decision space, i.e., on the convergence of solutions towards an optimal solution. If $m \ge 2$, problem (1) is called a multi-objective optimization problem (MOP). In multi-objective optimization, we focus on both the decision space and the objective space, whereas in single-objective optimization our focus is only on the decision space.
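As a concrete illustration (ours, not the paper's), the sphere function with box constraints is an instance of problem (1) with $m = 1$ and no functional constraints:

\[
\text{minimize } f(x) = \sum_{i=1}^{n} x_i^2, \qquad -100 \le x_i \le 100, \quad i = 1, \ldots, n,
\]

whose global minimum $f(x^*) = 0$ is attained at $x^* = (0, \ldots, 0)$.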

Figure 1. Scheme of EAs [2].

Evolutionary algorithms (EAs) are well-established and effective stochastic methods that automatically solve the given problem (1) without requiring the user to know or specify the form or structure of the problem in advance. Since their inception, EAs have attracted the attention of many researchers around the globe, and they have successfully tackled various optimization and search problems [1,2,15-17]. Mutation, crossover, and selection are the basic operators of EAs, as shown in Figure 1. These operators evolve the stochastically generated population of an EA.

In general, classical EAs can be divided into four paradigms, namely, Genetic Algorithms (GAs) [3], Evolution Strategies (ES) [4], Evolutionary Programming (EP) [5], and Genetic Programming (GP) [6,7]. GAs are well-known global search techniques. Unlike gradient-based optimization methods, they do not need any prior knowledge about the problem to be solved, and the paradigm is able to handle noisy functions and many other complexities. However, being global search optimizers, GAs sometimes get trapped in local optima while solving complicated optimization and search problems. These limitations and drawbacks of GA can be overcome by hybridizing it with inexpensive local search techniques [8]. Local search optimizers enhance the search process of the global search method when dealing with multimodal optimization and search problems.

1.1 Genetic Algorithm

The genetic algorithm (GA) was first proposed by John Holland in 1975 [3]. These algorithms are mainly based on the concepts of natural selection and variation. The technique applies ideas from biological evolution to evolve an optimal solution to a problem of the form (1). A GA operates on a set of chromosomes, or solutions, all of which are evaluated using the objective function of the problem. Three genetic operators, selection, crossover, and mutation, are applied to the population to form a new population for the next generation. This process continues until an optimal solution is found or the budget of function evaluations is exhausted, as outlined in Algorithm 1, and as sketched below.
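The following is a minimal MATLAB sketch of one GA generation. The operator choices here (binary tournament selection, arithmetic crossover, Gaussian mutation) and the helper name gaGeneration are illustrative assumptions; the paper does not specify which operators its GA uses.

```matlab
% Minimal sketch of one GA generation (function file gaGeneration.m).
% Operator choices are assumptions; the paper does not name them.
function Q = gaGeneration(P, f, L, U, pc, pm)
    [N, D] = size(P);              % population size x problem dimension
    fit = zeros(N, 1);
    for i = 1:N
        fit(i) = f(P(i, :));       % evaluate every chromosome
    end
    Q = zeros(N, D);               % next generation
    for i = 1:2:N
        p1 = tournament(P, fit);   % selection
        p2 = tournament(P, fit);
        if rand < pc               % arithmetic (blend) crossover
            a  = rand;
            c1 = a .* p1 + (1 - a) .* p2;
            c2 = a .* p2 + (1 - a) .* p1;
        else
            c1 = p1;  c2 = p2;
        end
        Q(i, :) = mutate(c1, L, U, pm);
        Q(min(i + 1, N), :) = mutate(c2, L, U, pm);
    end
end

function p = tournament(P, fit)
    % Binary tournament: pick two at random, keep the fitter one.
    i = randi(size(P, 1));  j = randi(size(P, 1));
    if fit(i) <= fit(j), p = P(i, :); else, p = P(j, :); end
end

function c = mutate(c, L, U, pm)
    % Gaussian mutation with per-gene probability pm, then box repair.
    for k = 1:numel(c)
        if rand < pm
            c(k) = c(k) + 0.1 * (U(k) - L(k)) * randn;
        end
    end
    c = min(max(c, L), U);
end
```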

2. Hybridization of GA with Local Search

In general, combining GAs with efficient local search methods is considered a good way to locate local optima with high accuracy while retaining a global view of the search space of complicated optimization and search problems. Due to its fast convergence behavior, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method [9] is a well-known and effective hill-climbing local search method. It was proposed independently by Broyden [11], Fletcher [12], Goldfarb [13], and Shanno [14], and has frequently been applied to unconstrained nonlinear global optimization problems. In this paper, BFGS [9] is employed as the local search optimizer in the GA framework, and the resulting hybrid algorithm, called HGA, is developed to tackle the test functions given in Table 1. The algorithmic structure of BFGS is explained in Algorithm 2.

BFGS is embedded in Step 5 of the HGA framework. BFGS is an iterative line-search method that does not require the Hessian matrix explicitly. Instead, it maintains a positive definite approximation of it (initialized as the identity matrix), which is updated at each iteration using the gradient information from the current and previous iterations.
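For reference, the standard BFGS update of the Hessian approximation $B_k$ (not reproduced in the paper) is

\[
B_{k+1} = B_k + \frac{y_k y_k^T}{y_k^T s_k} - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]

which keeps $B_{k+1}$ positive definite whenever the curvature condition $y_k^T s_k > 0$ holds, so that the search direction $d_k = -B_k^{-1} \nabla f(x_k)$ remains a descent direction.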

3. Tested Functions

In our experiments, we examine the performance of the suggested derivative-based hybrid genetic algorithm using nine single-objective optimization problems selected from the test suite of the 2005 IEEE Congress on Evolutionary Computation (CEC'05) [10]. These functions are listed in Table 1 and defined in Table 2.

4. Parameter Settings and Discussion

The hybrid genetic algorithm (HGA) has been applied to a set of nine benchmark functions of dimension D = 10, as shown in Table 1. All experiments were carried out using D*100 function evaluations, where D is the dimension of the search space of the test function, and a randomly generated population of size N = 100. To establish a fair comparison of HGA against GA, the same number of function evaluations and the same parameter settings were used to run both algorithms 25 times independently on each test problem formulated in Table 2. The solution quality, summarized in terms of the minimum, median, mean, standard deviation, and maximum of the objective values obtained by each algorithm, is given in Tables 3 and 4, respectively: Table 3 presents the numerical results of HGA, while Table 4 presents those of GA. Table 5 provides the best and average results of the simple GA and HGA. It indicates that HGA solved test functions f02-f07 and f09 more efficiently than the simple GA, as shown in the figures below. However, test functions f01 and f08 were tackled more efficiently by the simple GA, as can also be seen from those figures.
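As a rough illustration of this setup (not the authors' actual code), a budgeted HGA run could look as follows in MATLAB, reusing the gaGeneration sketch from Section 1.1 and MATLAB's quasi-Newton fminunc (which implements BFGS) for the local refinement; the sphere function stands in for a CEC'05 test function, and the box bounds are assumed:

```matlab
% Illustrative HGA run with the parameter settings above (not the
% authors' code): D = 10, N = 100, a budget of 100*D evaluations,
% and BFGS refinement of the best GA solution via fminunc.
D = 10;  N = 100;  maxFE = 100*D;
L = -100*ones(1, D);  U = 100*ones(1, D);   % assumed box constraints
f = @(x) sum(x.^2);                         % placeholder objective
P = repmat(L, N, 1) + rand(N, D) .* repmat(U - L, N, 1);
for g = 1:floor(maxFE / N)                  % global GA phase
    P = gaGeneration(P, f, L, U, 0.9, 1/D);
end
fit = zeros(N, 1);
for i = 1:N, fit(i) = f(P(i, :)); end
[~, b] = min(fit);                          % best chromosome found
opts = optimoptions('fminunc', 'Algorithm', 'quasi-newton');
xBest = fminunc(f, P(b, :), opts);          % BFGS local refinement
fprintf('f(xBest) = %g\n', f(xBest));
```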

5. Conclusion

The slow convergence and the search-direction issues of the simple GA can be addressed by employing different search techniques within the GA framework. The GA is good at finding the region of the global optimum while avoiding entrapment in local basins of attraction of the problem at hand, and local search methods are a good choice for tackling the trapping issue when it does occur. The BFGS method is a very efficient technique with fast convergence behavior; although it is quite sensitive to its starting point, it still performs well on nonconvex problems. In this paper, we proposed HGA, a new hybrid of GA and the BFGS strategy, for the optimization of single-objective functions, and used nine problems to examine its performance. The suggested hybrid GA offered the best results on most test problems compared to GA in terms of convergence speed. In the future, we intend to examine the behavior of the suggested algorithm on the latest test suites, which are regularly designed for the special sessions of the IEEE Congress on Evolutionary Computation (IEEE CEC) series [10].

References:

[1] D.E. Goldberg (1989), "Genetic Algorithms in Search, Optimization and Machine Learning", 1st ed., Addison-Wesley, Reading, Massachusetts.
[2] A.E. Eiben and J.E. Smith (2003), "Introduction to Evolutionary Computing", 1st ed., Springer-Verlag, Berlin, Germany.
[3] J.H. Holland (1973), "Genetic Algorithms and the Optimal Allocation of Trials", SIAM J. Comput., 2, 88-105.
[4] T. BΓ€ck, F. Hoffmeister and H.-P. Schwefel (1992), "A Survey of Evolution Strategies", Proceedings of the Fourth International Conference on Genetic Algorithms, San Mateo, CA: Morgan Kaufmann, 2-9.
[5] L.J. Fogel, A.J. Owens and M.J. Walsh (1966), "Artificial Intelligence through Simulated Evolution", 1st ed., John Wiley & Sons.
[6] J.R. Koza (1992), "Genetic Programming: On the Programming of Computers by Means of Natural Selection", 1st ed., MIT Press.
[7] J.R. Koza (1995), "Genetic Programming II: Automatic Discovery of Reusable Programs", 1st ed., MIT Press.
[8] Parry Gowher Majeed and Santosh Kumar (2014), "Genetic Algorithms in Intrusion Detection Systems: A Survey", International Journal of Innovation and Applied Studies, 5, 233-240.
[9] R. Battiti and F. Masulli (1990), "BFGS Optimization for Faster and Automated Supervised Learning", International Neural Network Conference (INNC 90), Paris, 757-760.
[10] P.N. Suganthan et al. (2005), "Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization", Kanpur Genetic Algorithms Laboratory, IIT Kanpur, KanGAL Report Number 2005005.
[11] C.G. Broyden (1970), "The Convergence of a Class of Double-rank Minimization Algorithms 2: The New Algorithm", J. Inst. Math. Appl., 6, 222-231.
[12] R. Fletcher (1970), "A New Approach to Variable Metric Algorithms", Computer J., 13, 317-322.
[13] D. Goldfarb (1970), "A Family of Variable Metric Methods Derived by Variational Means", Math. Comp., 24, 23-26.
[14] D.F. Shanno (1970), "Conditioning of Quasi-Newton Methods for Function Minimization", Math. Comp., 24, 647-650.
[15] Wali Khan Mashwani (2014), "Enhanced Versions of Differential Evolution: State-of-the-art Survey", International Journal of Computing Science and Mathematics (IJCSM), 5(2), 107-126.
[16] Tayyaba Shah, Muhammad Asif Jan, Wali Khan Mashwani and Hamza Wazir (2016), "Adaptive Differential Evolution for Constrained Optimization Problems", Sci. Int. (Lahore), 28(3), 2313-2320.
[17] Hamza Wazir et al. (2016), "A Penalty Function Based Differential Evolution Algorithm for Constrained Optimization", The Nucleus Journal, 53(1), 155-161.