
Evolutionary Computation: One Project, Many Directions

Zbigniew Michalewicz,¹,² Jing Xiao,¹ and Krzysztof Trojanowski²

¹ Department of Computer Science, University of North Carolina, Charlotte, NC 28223, USA
² Institute of Computer Science, Polish Academy of Sciences, ul. Ordona 21, 01-237 Warsaw, Poland

Abstract. The field of evolutionary computation has been growing rapidly over the last few years. Yet, there are still many gaps to be filled, many experiments to be done, many questions to be answered. In this paper we examine a few important directions in which we can expect a lot of activities and significant results; we discuss them from a general perspective and in the context of a particular project: the development of an evolutionary planner/navigator in a mobile robot environment.

1 Introduction

During the last two decades there has been a growing interest in algorithms which are based on the principle of evolution (survival of the fittest). A common term, accepted recently, refers to such techniques as evolutionary computation (EC) methods. The best known algorithms in this class include genetic algorithms, evolutionary programming, evolution strategies, and genetic programming. There are also many hybrid systems which incorporate various features of the above paradigms and consequently are hard to classify; we refer to them simply as EC methods.

The field of evolutionary computation has reached a stage of some maturity. There are several well-established international conferences that attract hundreds of participants (International Conferences on Genetic Algorithms, ICGA [30, 31, 52, 7, 24]; Parallel Problem Solving from Nature, PPSN [57, 38, 8]; Annual Conferences on Evolutionary Programming, EP [20, 21, 58, 39]); new annual conferences are getting started, e.g., the IEEE International Conferences on Evolutionary Computation [47, 48, 49]. Also, there are many workshops, special sessions, and local conferences every year, all around the world. A relatively new journal, Evolutionary Computation (MIT Press) [12], is devoted entirely to evolutionary computation techniques; many other journals have organized special issues on evolutionary computation (e.g., [17, 42]). Many excellent tutorial papers [5, 6, 62, 18] and technical reports provide more-or-less complete bibliographies of the field [1, 28, 51, 45]. There is also The Hitch-Hiker's Guide to Evolutionary Computation, prepared initially by Jorg Heitkotter and currently by David Beasley [32], available on the comp.ai.genetic interest group (Internet), and a new volume, Handbook of Evolutionary Computation, is currently being prepared [4]. There are also a few quite recent (i.e., 1995/96) texts available [3, 19, 41, 56].

Despite all these developments and activities, there are still many gaps to be filled, many experiments to be done, many questions to be answered. As Ken De Jong observed recently [13] in the context of genetic algorithms: "... the field had pushed the application of simple GAs well beyond our initial theories and understanding, creating a need to revisit and extend them." In this paper we discuss some of the major current trends in this field. The next section provides short introductory material on evolutionary algorithms. Section 3 discusses briefly one particular project: the development of an evolutionary planner/navigator in a mobile robot environment. Section 4 presents some current research directions and discusses them in the context of the project. Section 5 contains a few final remarks.

2 Evolutionary computation

In general, any abstract task to be accomplished can be thought of as solving a problem, which, in turn, can be perceived as a search through a space of potential solutions. Since usually we are after "the best" solution, we can view this task as an optimization process. For small spaces, classical exhaustive methods usually suffice; for larger spaces special artificial intelligence techniques must be employed. The methods of evolutionary computation are among such techniques; they are stochastic algorithms whose search methods model some natural phenomena: genetic inheritance and Darwinian strife for survival. As stated in [11]: "... the metaphor underlying genetic algorithms³ is that of natural evolution. In evolution, the problem each species faces is one of searching for beneficial adaptations to a complicated and changing environment. The 'knowledge' that each species has gained is embodied in the makeup of the chromosomes of its members." The best known techniques in the class of evolutionary computation methods are genetic algorithms, evolution strategies, evolutionary programming, and genetic programming. There are also many hybrid systems which incorporate various features of the above paradigms; however, the structure of any evolutionary computation algorithm is very much the same; a sample structure is shown in Figure 1. The evolutionary algorithm maintains a population of individuals, P(t) = {x_1^t, ..., x_n^t} for iteration t. Each individual represents a potential solution to the problem at hand, and is implemented as some data structure S. Each solution x_i^t is evaluated to give some measure of its "fitness". Then, a new population

³ The best known evolutionary computation techniques are genetic algorithms; very often the terms evolutionary computation methods and GA-based methods are used interchangeably.

procedure evolutionary algorithm
begin
   t := 0
   initialize P(t)
   evaluate P(t)
   while (not termination-condition) do
   begin
      t := t + 1
      select P(t) from P(t-1)
      alter P(t)
      evaluate P(t)
   end
end

Fig. 1. The structure of an evolutionary algorithm

(iteration t + 1) is formed by selecting the fitter individuals (select step). Some members of the new population undergo transformations (alter step) by means of "genetic" operators to form new solutions. There are unary transformations m_i (mutation type), which create new individuals by a small change in a single individual (m_i : S → S), and higher order transformations c_j (crossover type), which create new individuals by combining parts from several (two or more) individuals (c_j : S × ... × S → S).⁴ After some number of generations the algorithm converges; it is hoped that the best individual represents a near-optimum (reasonable) solution.

Despite powerful similarities between various evolutionary computation techniques there are also many differences between them (often hidden at a lower level of abstraction). EC techniques use different data structures S for their chromosomal representations; consequently, the "genetic" operators are different as well. For example, the original genetic algorithms (GAs), which were devised to model adaptation processes, mainly operated on binary strings and used a recombination operator with mutation as a background operator [33]. On the other hand, evolution strategies (ESs) were developed as a method to solve parameter optimization problems [55]; consequently, a chromosome represents an individual as a pair of float-valued vectors, and Gaussian mutation is the leading operator. The original evolutionary programming (EP) techniques [22] aimed at evolution of artificial intelligence in the sense of developing ability to predict changes in an environment; hence finite state machines were selected as a chromosomal representation of individuals (five mutation operators were proposed in connection with this representation). Another interesting approach was developed relatively recently by Koza [35, 36], who suggested that the desired program should evolve

itself during the evolution process. Koza developed a new methodology, named Genetic Programming (GP), which processes tree structured programs. It is important to note that many researchers have further modified evolutionary algorithms by 'adding' problem-specific knowledge to the algorithm. Several researchers have discussed initialization techniques, different representations, decoding techniques (mapping from genetic representations to 'phenotypic' representations), and the use of heuristics for genetic operators. Such hybrid/nonstandard systems enjoy a significant popularity in the evolutionary computation community. Very often these systems, extended by the problem-specific knowledge, outperform classical evolutionary methods as well as other standard techniques [40, 41]. However, such systems are quite hard to classify and it is convenient to refer to them just as evolutionary computation techniques. The next section discusses briefly such an evolutionary system: the Evolutionary Planner/Navigator for a mobile robot environment.

⁴ In most cases crossover involves just two parents; however, it need not be the case. In a recent study [14] the authors investigated the merits of 'orgies', where more than two parents are involved in the reproduction process. Also, scatter search techniques [25] proposed the use of multiple parents.
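The generic loop of Fig. 1 can be sketched as runnable code. The following is a minimal illustration only, assuming a toy task (maximizing f(x) = -(x - 3)²), real-valued single-gene individuals, binary tournament selection, and Gaussian mutation; none of these choices are prescribed by the figure itself.

```python
import random

def fitness(x):
    # Toy evaluation function: maximum at x = 3.
    return -(x - 3.0) ** 2

def evolve(pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    # initialize P(0): random individuals in [-10, 10]
    population = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        # select P(t) from P(t-1): binary tournament selection
        selected = [max(rng.sample(population, 2), key=fitness)
                    for _ in range(pop_size)]
        # alter P(t): Gaussian mutation as the single "genetic" operator
        population = [x + rng.gauss(0.0, 0.5) for x in selected]
    # evaluate P(t) and report the best individual of the final population
    return max(population, key=fitness)

best = evolve()
```

With a fixed seed the run is deterministic and the best individual settles near the optimum at 3.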

3 The Project: Evolutionary Planner/Navigator

The motion planning problem for mobile robots is typically formulated as follows [67]: given a robot and a description of an environment, plan a path of the robot between two specified locations, which is collision-free and satisfies certain optimization criteria. Traditionally there are two approaches to the problem: off-line planning, which assumes a perfectly known and stable environment, and on-line planning, which focuses on dealing with uncertainties when the robot traverses the environment. On-line planning is also referred to by many researchers as the navigation problem. A great deal of research has been done in motion planning and navigation (see [67] and [37] for surveys). However, different existing methods encounter one or many of the following difficulties:

– high computation expenses,
– inflexibility in responding to changes in the environment,
– inflexibility in responding to different optimization goals,
– inflexibility in responding to uncertainties,
– inability to combine advantages of global planning and reactive planning.

In order to address these difficulties, we initiated the study of an Evolutionary Planner/Navigator (EP/N) system; the inspiration to use evolutionary techniques was triggered by the following ideas/observations:

– randomized search can be the most effective in dealing with NP-hard problems and in escaping local minima,
– parallel search actions not only provide great speed but also provide ground for interactions among search actions to achieve even greater efficiency in optimization,
– creative application of the evolutionary computation concept rather than dogmatic imposition of a standard algorithm proves to be more effective in solving specific types of real problems,

– intelligent behavior is the result of a collection of simple reactions to a complex world,
– a planner can be greatly simplified, much more efficient and flexible, and increase the quality of search, if search is not confined to be within a specific map structure,
– it is more meaningful to equip a planner with the flexibility of changing the optimization goals than the ability of finding the absolutely optimum solution for a single, particular goal.

The EP/N embodies the above ideas by following the evolution program approach, i.e. combining the concept of evolutionary computation with problem-specific chromosome structures and genetic operators. With such an approach, the EP/N is pursuing all the advantages as described above. Less obvious, though, is that with the unique design of chromosome structure and genetic operators, the EP/N does not need a discretized map for search, which is usually required by other planners. Instead, the EP/N "searches" the original and continuous environment by generating paths by various evolutionary operators. The objects in the environment can simply be indicated as a collection of straight-line "walls". This representation accommodates both known objects as well as partial information of unknown objects obtained from sensing. Thus, there is little difference between off-line planning and on-line navigation for the EP/N. In fact, the EP/N unifies off-line planning and on-line navigation with the same evolutionary algorithm and chromosome structure. For more details on the current state of the project, see [65, 66].

4 Evolutionary Computation: Current Directions

As indicated in the Introduction, the field of evolutionary computation has been growing rapidly over the last few years. Yet, there are still many areas in which we can expect a lot of activities and significant results. This section discusses such directions and illustrates some of the points by referring to the EP/N project described in the previous section.

4.1 Function optimization

For many years, most evolutionary techniques have been evaluated and compared to each other in the domain of function optimization. It seems also that the domain of function optimization will remain the primary test-bed for many new comparisons and new features of various algorithms. In this context it is important to investigate properties of evolutionary algorithms on different landscapes, to experiment with various modifications of evolutionary algorithms (e.g., elitist strategy, non-random mating, etc.), to explore the role of operators (e.g., mutation versus crossover, different types of operators, etc.), to study the significance of infeasible individuals in the population, etc. In particular, studies on evaluation functions seem to be of great importance; after all, the evaluation function

serves as the main (sometimes the only) link between the evolutionary algorithm and the problem.

The EP/N project falls into the category of 'function optimization': a near-optimal path between the source and destination is being evolved. Clearly, the evaluation function of the EP/N is of utmost importance and accommodates different (often conflicting) optimization goals. Various experiments led us to focus on short, smooth, and clear (i.e., far from obstacles) paths; thus we faced the interesting and important issue of how to design the evaluation function to accommodate all of these requirements properly. Moreover, during our study of this problem, we discovered an even more interesting issue: how to design the evaluation function to evaluate paths which are not yet collision-free? Such infeasible paths are generated quite often, especially in early generations. If the system generates and discards infeasible individuals, then the evolutionary algorithm will spend most of its time evaluating such infeasible individuals. Moreover, the first feasible path found will likely trigger a premature convergence (i.e., other infeasible individuals will be driven out from the population), and the system will not be able to find better individuals. On the other hand, if we keep these individuals in the population, we also have to address yet another interesting issue: how to compare feasible and infeasible individuals in the population? We experimented with different evaluation functions [43] and analyzed their impacts. Although the EP/N generally worked well even under simple and crude evaluation functions, we found that better designs of the evaluation function greatly improved the quality of results and planning efficiency.
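As an illustration of the kind of multi-objective evaluation discussed here, the sketch below scores a path (a list of knot points) by length and smoothness, and penalizes collisions instead of discarding infeasible paths. The weights, the penalty scheme, and all names are assumptions for illustration, not the actual EP/N evaluation function of [43].

```python
import math

def path_length(path):
    # Total Euclidean length of the knot-point path.
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def path_smoothness(path):
    # Sum of turning angles at interior knot points (0 for straight paths).
    total = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(path, path[1:], path[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        total += abs(math.remainder(a2 - a1, math.tau))
    return total

def evaluate(path, num_collisions, w_len=1.0, w_smooth=0.5, penalty=100.0):
    # Lower is better; infeasible paths stay comparable to feasible ones
    # but carry a penalty proportional to how many segments collide.
    return (w_len * path_length(path)
            + w_smooth * path_smoothness(path)
            + penalty * num_collisions)

straight = [(0, 0), (5, 0), (10, 0)]
detour = [(0, 0), (5, 5), (10, 0)]
```

Because infeasible paths receive a finite penalty rather than being discarded, they remain in the same ordering as feasible ones and can still be improved by later operators.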
The project also provides additional aspects for study in the area of evolutionary techniques for function optimization, such as: (1) behavior of the system in various environments (e.g., with different numbers and densities of objects, different topology of the feasible part of the search space, etc.), (2) non-stationary environments, where the evaluation function changes with time (due to incomplete information about the environment or the movement of some obstacles), (3) a mixture of evolutionary techniques with memory-based systems (we briefly discuss issues (2) and (3) in section 4.6), (4) various methods for maintaining diversity in the population, which is essential in the EP/N project: the generation of alternative paths is important in dynamic environments (section 4.3), etc.

4.2 Representation, operators, and other search techniques

Traditionally, GAs work with binary strings, ESs with floating point vectors, and EP with finite state machines (represented as matrices), whereas GP techniques use trees as a structure for the individuals; each of these techniques has a standard set of operators. However, there is a need for systematic research on evolutionary algorithms which:

– incorporate the problem-specific knowledge by means of appropriate chromosomal data structures, non-standard operators, and additional local search algorithms,

– represent complex, non-linear objects of varying sizes, and, in particular, represent 'blueprints' of complex objects, and
– experiment with evolutionary operators for such objects at the genotype level.

These directions can be perceived as important steps towards building complex hybrid evolutionary systems. The EP/N project is very appropriate for such research. For example, there are many possibilities for various path representations: starting from a linked list of coordinates of all knot points of a path (this is the current representation, see [65]), through a set of rules which provide directions and distances to be traveled, to blueprints of complex rules, e.g., a development of a neural network to control the robot. Various evolutionary operators can be experimented with: e.g., knowledge-based operators, which adapt their actions with respect to the current state of search; Lamarckian operators, which improve an individual during its lifetime (consequently, the improved, "learned" characteristics of such an individual can be passed to the next generation); and multi-parent operators [14]. In addition, incorporation of problem-specific, local search algorithms seems to be worthwhile: such algorithms may enhance fine-grained local search capabilities of an evolutionary algorithm, may help in maintaining diversity of paths, and may be used in the repairing process (i.e., in converting an infeasible path into a feasible one). The EP/N project also reveals the importance of studying the tradeoff between the amount of problem-specific knowledge being incorporated into an algorithm and the efficiency of the algorithm, which is essential for various on-line systems.
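The knot-point representation can be illustrated with a small sketch: a chromosome is a list of (x, y) coordinates from source to destination, and one simple mutation operator perturbs a single interior knot. The operator and its parameters are hypothetical, chosen only to show the idea.

```python
import random

def mutate_knot(path, rng, sigma=1.0):
    # Endpoints (source and destination) are fixed; only an interior
    # knot point may move, by a Gaussian perturbation of its coordinates.
    if len(path) <= 2:
        return list(path)
    new_path = list(path)
    i = rng.randrange(1, len(path) - 1)
    x, y = new_path[i]
    new_path[i] = (x + rng.gauss(0.0, sigma), y + rng.gauss(0.0, sigma))
    return new_path

rng = random.Random(42)
path = [(0.0, 0.0), (3.0, 4.0), (6.0, 1.0), (10.0, 0.0)]
mutated = mutate_knot(path, rng)
```

Note that the operator works directly on the continuous coordinates; no discretized map is required, which matches the EP/N design philosophy described in Section 3.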

4.3 Non-random mating

Most current techniques which incorporate a crossover operator use random mating, i.e., mating where individuals are paired randomly. It seems that with the trend of movement from simple to complex systems, the issue of non-random mating will be of growing importance. There are many possibilities to explore; these include introduction of sex or "family" relationships between individuals, or establishing some preferences (e.g., seduction [50]). Some simple schemes were already investigated by several researchers (e.g., Eshelman's incest prevention technique [16]); however, the ultimate goal seems to be to evolve rules for non-random mating. A few possibilities (in the context of multimodal optimization) were already explored; these include sharing functions, which permit a formation of stable subpopulations, and tagging, where individuals are assigned labels. However, very little was done in this direction for complex chromosomal structures.

In the EP/N project the aspect of non-random mating is quite important. Since the robot's environment is only partially known, it is essential to maintain diversity of paths in the population. Different groups of paths (i.e., subpopulations or species) can explore different areas of the environment; it is important to identify such groups and to maintain their identity. As mentioned earlier, the generation of alternative paths is essential in dynamic and partially-unknown environments: due to unknown obstacles (in the on-line phase) it might be necessary to change the current path.
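One of the simple schemes mentioned above, incest prevention on binary strings, can be sketched as follows: a randomly drawn pair may mate only if their Hamming distance exceeds a threshold, which keeps near-identical individuals from recombining. The threshold, the retry scheme, and all names are illustrative assumptions, not the exact mechanism of [16].

```python
import random

def hamming(a, b):
    # Number of positions at which two equal-length strings differ.
    return sum(x != y for x, y in zip(a, b))

def pick_mates(population, threshold, rng, max_tries=100):
    # Retry random pairings until a sufficiently dissimilar pair is found;
    # give up if the population has converged below the threshold.
    for _ in range(max_tries):
        p1, p2 = rng.sample(population, 2)
        if hamming(p1, p2) >= threshold:
            return p1, p2
    return None

rng = random.Random(1)
pop = ["0000", "0001", "1111", "1110"]
pair = pick_mates(pop, threshold=3, rng=rng)
```

A scheme like this slows the loss of diversity; in the EP/N setting the same idea would apply with a path-distance measure in place of the Hamming distance.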

4.4 Self-adapting systems

Since evolutionary algorithms implement the idea of evolution, it is more than natural to expect a development of some self-adapting characteristics of these techniques. Apart from evolution strategies, which incorporate some of their control parameters in the solution vectors, most other techniques use fixed representations, operators, and control parameters. It seems that the most promising research areas are based on inclusion of self-adapting mechanisms within the system for:

– Representation of individuals (e.g., adaptive strategies proposed by Shaefer [59], the Dynamic Parameter Encoding technique [54], and messy genetic algorithms [27]).
– Operators. It is clear that different operators play different roles at different stages of the evolutionary process. The operators should adapt to the current stage of the search and to the current topology of the landscape being searched (e.g., adaptive crossover [53, 60]). This is especially important for time-varying fitness landscapes.
– Control parameters. There were already experiments aimed at these issues: adaptive population sizes [2] or adaptive probabilities of operators [10, 34, 61]. However, much more remains to be done.

It seems that this is one of the most promising directions of research; after all, the power of evolutionary algorithms lies in their adaptiveness. The adaptiveness is also the key issue in the EP/N project. For example, the current version of the system incorporates 8 operators which play different roles in various environments and at various stages of the evolution process. Recently, a new adaptive procedure was developed [66], which adapts the frequencies of these operators on the basis of their efficiency (i.e., usefulness and the operational cost). Note that different environments require different subsets of these operators; the adaptive procedure is responsible for tuning all frequencies with respect to the current state of search.
Additional work would concentrate on a development of adaptive representations (there is an interesting possibility of changing the representation of individuals, e.g., from a linked list into rules and vice versa), additional adaptive operators which change their scope and mechanism on the basis of the current state of search, and adaptation of the population size and the population structure (e.g., division of the population into sub-populations).
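The idea of adapting operator frequencies by observed usefulness can be sketched as follows. The credit-assignment rule below (a score proportional to the improvements an operator produced, mixed with a uniform floor so that no operator dies out) is an assumption for illustration, not the actual EP/N procedure of [66].

```python
class OperatorScheduler:
    def __init__(self, names, floor=0.05):
        self.names = list(names)
        self.scores = {n: 1.0 for n in self.names}  # optimistic start
        self.floor = floor

    def probabilities(self):
        # Scores normalized to probabilities, mixed with a uniform floor
        # so every operator keeps some minimal chance of being applied.
        total = sum(self.scores.values())
        k = len(self.names)
        return {n: self.floor / k + (1 - self.floor) * (s / total)
                for n, s in self.scores.items()}

    def reward(self, name, improvement):
        # Credit an operator whenever it improved its parent.
        if improvement > 0:
            self.scores[name] += improvement

# Hypothetical operator names; after one large reward, "repair" becomes
# the most frequently chosen operator.
sched = OperatorScheduler(["mutate_knot", "smooth", "repair"])
sched.reward("repair", 5.0)
```

In a full system the rewards would be accumulated per generation (and possibly discounted) so that the frequencies track the current state of the search rather than its entire history.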

4.5 Co-evolutionary systems

There is a growing interest in co-evolutionary systems, where more than one evolution process takes place: usually there are different populations (e.g., additional populations of parasites or predators) which interact with each other.

In such systems the evaluation function for one population may depend on the state of the evolution processes in the other population(s). This is an important topic for modeling artificial life, some business applications, etc. Co-evolutionary systems might also be important for approaching large-scale problems [46]: a (large) problem can be decomposed into smaller subproblems and there can be a separate evolutionary process for each of the subproblems. These evolutionary processes are connected with each other: evaluation of individuals in one population depends also on developments in other populations. The EP/N project seems ideal for studying some aspects of co-evolutionary systems. First of all, the path planning problem can be decomposed into several sub-problems with different sources and destinations. In complex environments such an approach may lead to increased efficiency of the system. Secondly, the co-evolutionary models would allow us to extend the EP/N project to study some processes of cooperation, where several robots explore an environment and exchange the evolved information. This line of research leads to investigation of intelligent multi-agent scenarios.

4.6 Diploid/polyploid versus haploid structures

Diploidy (or polyploidy) can be viewed as a way to incorporate memory into the individual's structure. Instead of a single chromosome (haploid structure) representing precise information about an individual, a diploid structure is made up of a pair of chromosomes: the choice between two values is made by some dominance function. The diploid (polyploid) structures are of particular significance in non-stationary or partially unknown environments (i.e., for time-varying objective functions) and for modeling complex systems (possibly using co-evolution models). However, there is no theory to support the incorporation of a dominance function into the system; there are also quite limited experimental data in this area. It might be worthwhile to explore the addition of an "experience-related" behavior to the EP/N system. The current version of the adaptive navigator is memoryless: the population adapts to the current evaluation function, which changes over time on the basis of sensed information. However, the system "does not remember" explicitly the past experiences. Consequently, the system does not learn from past experiences. It seems that some sort of "memory" (in addition to the object map of the environment) would enhance further the capabilities of the system. One possibility would be to develop an adaptive algorithm based on multi-chromosome structures with a dominance function. The EP/N project is very appropriate for studying polyploid structures with a dominance function: the environment is not fully known; besides, some obstacles need not stay in the same place all the time. In such an algorithm, each individual consists of more than one chromosome (each chromosome still represents a single solution, i.e., a path). The dominance function determines one chromosome as the current representative of the individual, which then participates in the evaluation process. The other chromosomes in the structure are "inactive" and play the role of memory: past

experiences of the robot. They will, nevertheless, evolve with the currently active chromosome in the same way. Another possible approach would be based on an additional data structure, where past "footsteps" and other "reasonable" paths discovered earlier during the navigation process are stored. Such a memory would be suitable for a static or relatively stable environment. As the "short memory" for the current navigation task, it may further enhance the ability of the system to escape from dead-end situations. Since the "memory" of the system stores feasible paths from various points of the environment to the goal, it might be possible for the robot, if trapped, to return to some earlier visited knot point and to continue along a different path, thus avoiding the dead-end situation.
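The diploid/polyploid idea can be sketched in a few lines: an individual carries several chromosomes (paths), and a dominance function selects the one that is best under the current evaluation function as the active representative, keeping the rest as memory. The dominance rule here (lower current cost wins) and the cost function are illustrative assumptions.

```python
import math

def current_cost(path):
    # Path cost under the *current* (possibly changed) environment;
    # here simply the Euclidean length of the knot-point path.
    return sum(math.dist(p, q) for p, q in zip(path, path[1:]))

def dominance(chromosomes, cost):
    # Return (active, memory): the chromosome with the lowest current
    # cost represents the individual; the others stay as inactive memory.
    ranked = sorted(chromosomes, key=cost)
    return ranked[0], ranked[1:]

# Suppose the environment changed so that path_b, remembered from an
# earlier phase, is now cheaper than the recently active path_a.
path_a = [(0, 0), (5, 5), (10, 0)]
path_b = [(0, 0), (5, 0), (10, 0)]
active, memory = dominance([path_a, path_b], current_cost)
```

When the evaluation function changes again, the same dominance call may reactivate a remembered chromosome, which is exactly the "memory" effect the text describes.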

4.7 Parallel models

Parallelism promises to put within our reach solutions to problems intractable before; clearly, it is one of the most important areas of computer science. Evolutionary algorithms are very suitable for parallel implementations; as Goldberg [26] observed: "In a world where serial algorithms are usually made parallel through countless tricks and contortions, it is no small irony that genetic algorithms (highly parallel algorithms) are made serial through equally unnatural tricks and turns." This is an important direction which will be investigated in connection with the EP/N project. Note that parallel models can provide a natural embedding for other paradigms of evolutionary computation, like non-random mating, some aspects of self-adaptation, and co-evolutionary systems. Also, apart from increased efficiency of the system, it would be possible to investigate various paradigms of parallel processing (e.g., island models, massively parallel models, or parallel hybrid GAs [41]) and various migration strategies.
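An island model can be sketched as follows: several sub-populations evolve independently on a toy task (maximizing f(x) = -(x - 3)²) and periodically pass their best individual to the next island in a ring. The task, the migration interval, and the ring topology are illustrative choices; a real implementation would run the islands as concurrent processes.

```python
import random

def fitness(x):
    return -(x - 3.0) ** 2

def evolve_island(pop, rng):
    # One generation: binary tournament selection + Gaussian mutation.
    selected = [max(rng.sample(pop, 2), key=fitness) for _ in pop]
    return [x + rng.gauss(0.0, 0.5) for x in selected]

def island_model(n_islands=4, pop_size=10, generations=40,
                 migrate_every=5, seed=0):
    rng = random.Random(seed)
    islands = [[rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
               for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        islands = [evolve_island(pop, rng) for pop in islands]
        if gen % migrate_every == 0:
            # Ring migration: each island's best replaces a random
            # individual on the next island.
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[rng.randrange(pop_size)] = bests[i - 1]
    return max((x for pop in islands for x in pop), key=fitness)

best = island_model()
```

The infrequent migration keeps the islands partly isolated, so each can follow its own trajectory through the search space, which is one way to maintain the path diversity that the EP/N project needs.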

5 Final Remarks

It is worthwhile to note that there are many other approaches to learning, optimization, and problem solving which are based on other natural metaphors; the best known examples include neural networks and simulated annealing. There is a growing interest in all these areas; the most fruitful and challenging direction seems to be a "recombination" of some ideas at present scattered in different fields. Moreover, it seems that the whole field of artificial intelligence should lean towards evolutionary techniques; as Lawrence Fogel stated [23] in his plenary talk during the World Congress on Computational Intelligence (Orlando, 27 June – 2 July 1994): "If the aim is to generate artificial intelligence, that is, to solve new problems in new ways, then it is inappropriate to use any fixed set of rules. The rules required for solving each problem should simply evolve..."

References

1. Alander, J.T., An Indexed Bibliography of Genetic Algorithms: Years 1957–1993, Department of Information Technology and Production Economics, University of Vaasa, Finland, Report Series No. 94-1, 1994.
2. Arabas, J., Michalewicz, Z., and Mulawka, J., GAVaPS – a Genetic Algorithm with Varying Population Size, in [47].
3. Back, T., Evolutionary Algorithms in Theory and Practice, Oxford University Press, 1995.
4. Back, T., Fogel, D., and Michalewicz, Z. (Editors), Handbook of Evolutionary Computation, Oxford University Press, New York, 1996.
5. Beasley, D., Bull, D.R., and Martin, R.R., An Overview of Genetic Algorithms: Part 1, Foundations, University Computing, Vol.15, No.2, pp.58–69, 1993.
6. Beasley, D., Bull, D.R., and Martin, R.R., An Overview of Genetic Algorithms: Part 2, Research Topics, University Computing, Vol.15, No.4, pp.170–181, 1993.
7. Belew, R. and Booker, L. (Editors), Proceedings of the Fourth International Conference on Genetic Algorithms, Morgan Kaufmann Publishers, Los Altos, CA, 1991.
8. Davidor, Y., Schwefel, H.-P., and Manner, R. (Editors), Proceedings of the Third International Conference on Parallel Problem Solving from Nature (PPSN), Springer-Verlag, New York, 1994.
9. Davis, L. (Editor), Genetic Algorithms and Simulated Annealing, Morgan Kaufmann Publishers, Los Altos, CA, 1987.
10. Davis, L., Adapting Operator Probabilities in Genetic Algorithms, in [52], pp.61–69.
11. Davis, L. and Steenstrup, M., Genetic Algorithms and Simulated Annealing: An Overview, in [9], pp.1–11.
12. De Jong, K.A. (Editor), Evolutionary Computation, MIT Press, 1993.
13. De Jong, K., Genetic Algorithms: A 25 Year Perspective, in [68], pp.125–134.
14. Eiben, A.E., Raue, P.-E., and Ruttkay, Zs., Genetic Algorithms with Multi-parent Recombination, in [8], pp.78–87.
15. Eshelman, L.J. (Editor), Proceedings of the Sixth International Conference on Genetic Algorithms, Morgan Kaufmann, San Mateo, CA, 1995.
16. Eshelman, L.J. and Schaffer, J.D., Preventing Premature Convergence in Genetic Algorithms by Preventing Incest, in [7], pp.115–122.
17. Fogel, D.B. (Editor), IEEE Transactions on Neural Networks, special issue on Evolutionary Computation, Vol.5, No.1, 1994.
18. Fogel, D.B., An Introduction to Simulated Evolutionary Optimization, IEEE Transactions on Neural Networks, special issue on Evolutionary Computation, Vol.5, No.1, 1994.
19. Fogel, D.B., Evolutionary Computation: Toward a New Philosophy of Machine Intelligence, IEEE Press, Piscataway, NJ, 1995.
20. Fogel, D.B. and Atmar, W., Proceedings of the First Annual Conference on Evolutionary Programming, La Jolla, CA, 1992, Evolutionary Programming Society.
21. Fogel, D.B. and Atmar, W., Proceedings of the Second Annual Conference on Evolutionary Programming, La Jolla, CA, 1993, Evolutionary Programming Society.
22. Fogel, L.J., Owens, A.J., and Walsh, M.J., Artificial Intelligence Through Simulated Evolution, John Wiley, Chichester, UK, 1966.
23. Fogel, L.J., Evolutionary Programming in Perspective: The Top-Down View, in [68], pp.135–146.
24. Forrest, S. (Editor), Proceedings of the Fifth International Conference on Genetic Algorithms, Morgan Kaufmann Publishers, Los Altos, CA, 1993.

25. Glover, F., Heuristics for Integer Programming Using Surrogate Constraints, Decision Sciences, Vol.8, No.1, pp.156–166, 1977.
26. Goldberg, D.E., Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA, 1989.
27. Goldberg, D.E., Deb, K., and Korb, B., Do not Worry, Be Messy, in [7], pp.24–30.
28. Goldberg, D.E., Milman, K., and Tidd, C., Genetic Algorithms: A Bibliography, IlliGAL Technical Report 92008, 1992.
29. Gorges-Schleuter, M., ASPARAGOS: An Asynchronous Parallel Genetic Optimization Strategy, in [52], pp.422–427.
30. Grefenstette, J.J. (Editor), Proceedings of the First International Conference on Genetic Algorithms, Lawrence Erlbaum Associates, Hillsdale, NJ, 1985.
31. Grefenstette, J.J. (Editor), Proceedings of the Second International Conference on Genetic Algorithms, Lawrence Erlbaum Associates, Hillsdale, NJ, 1987.
32. Heitkotter, J. (Editor), The Hitch-Hiker's Guide to Evolutionary Computation, FAQ in comp.ai.genetic, issue 1.10, 20 December 1993.
33. Holland, J.H., Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.
34. Julstrom, B.A., What Have You Done for Me Lately? Adapting Operator Probabilities in a Steady-State Genetic Algorithm, in [15], pp.81–87.
35. Koza, J.R., Genetic Programming, MIT Press, Cambridge, MA, 1992.
36. Koza, J.R., Genetic Programming – 2, MIT Press, Cambridge, MA, 1994.
37. Latombe, J.C., Robot Motion Planning, Kluwer Academic Publishers, 1991.
38. Manner, R. and Manderick, B. (Editors), Proceedings of the Second International Conference on Parallel Problem Solving from Nature (PPSN), North-Holland, Elsevier Science Publishers, Amsterdam, 1992.
39. McDonnell, J.R., Reynolds, R.G., and Fogel, D.B. (Editors), Proceedings of the Fourth Annual Conference on Evolutionary Programming, The MIT Press, 1995.
40. Michalewicz, Z., A Hierarchy of Evolution Programs: An Experimental Study, Evolutionary Computation, Vol.1, No.1, 1993, pp.51–76.
41. Michalewicz, Z., Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, 3rd edition, 1996.
42. Michalewicz, Z. (Editor), Statistics & Computing, special issue on evolutionary computation, Vol.4, No.2, 1994.
43. Michalewicz, Z. and Xiao, J., Evaluation of Paths in Evolutionary Planner/Navigator, Proceedings of the 1995 International Workshop on Biologically Inspired Evolutionary Systems, Tokyo, Japan, May 30–31, 1995, pp.45–52.
44. Muhlenbein, H., Parallel Genetic Algorithms, Population Genetics and Combinatorial Optimization, in [52], pp.416–421.
45. Nissen, V., Evolutionary Algorithms in Management Science: An Overview and List of References, European Study Group for Evolutionary Economics, 1993.
46. Potter, M. and De Jong, K., A Cooperative Coevolutionary Approach to Function Optimization, George Mason University, 1994.
47. Proceedings of the First IEEE International Conference on Evolutionary Computation, Orlando, 26 June – 2 July, 1994.
48. Proceedings of the Second IEEE International Conference on Evolutionary Computation, Perth, 29 November – 1 December, 1995.
49. Proceedings of the Third IEEE International Conference on Evolutionary Computation, Nagoya, 20–22 May, 1996.
50. Ronald, E., When Selection Meets Seduction, in [15], pp.167–173.

51. Saravanan, N. and Fogel, D.B., A Bibliography of Evolutionary Computation & Applications, Department of Mechanical Engineering, Florida Atlantic University, Technical Report No. FAU-ME-93-100, 1993. 52. Scha er, J., (Editor), Proceedings of the Third International Conference on Genetic Algorithms, Morgan Kaufmann Publishers, Los Altos, CA, 1989. 53. Scha er, J.D. and Morishima, A., An Adaptive Crossover Distribution Mechanism for Genetic Algorithms, in [31], pp.36{40. 54. Schraudolph, N. and Belew, R., Dynamic Parameter Encoding for Genetic Algorithms, CSE Technical Report #CS90{175, University of San Diego, La Jolla, 1990. 55. Schwefel, H.-P., On the Evolution of Evolutionary Computation, in [68], pp.116{ 124. 56. Schwefel, H.-P., Evolution and Optimum Seeking, John Wiley, Chichester, UK, 1995. 57. Schwefel, H.-P. and Manner, R. (Editors), Proceedings of the First International Conference on Parallel Problem Solving from Nature (PPSN), Springer-Verlag, Lecture Notes in Computer Science, Vol.496, 1991. 58. Sebald, A.V. and Fogel, L.J., Proceedings of the Third Annual Conference on Evolutionary Programming, San Diego, CA, 1994, World Scienti c. 59. Shaefer, C.G., The ARGOT Strategy: Adaptive Representation Genetic Optimizer Technique, in [31], pp.50{55. 60. Spears, W.M., Adapting Crossover in Evolutionary Algorithms, in [39], pp.367{384. 61. Srinivas, M. and Patnaik, L.M., Adaptive Probabilities of Crossover and Mutation in Genetic Algorithms, IEEE Transactions on Systems, Man, and Cybernetics, Vol.24, No.4, 1994, pp.17{26. 62. Whitley, D., Genetic Algorithms: A Tutorial, in [42], pp.65{85. 63. Whitley, D., GENITOR II: A Distributed Genetic Algorithm, Journal of Experimental and Theoretical Arti cial Intelligence, Vol.2, pp.189{214. 64. Whitley, D. (Editor), Foundations of Genetic Algorithms{2, Second Workshop on the Foundations of Genetic Algorithms and Classi er Systems, Morgan Kaufmann Publishers, San Mateo, CA, 1993. 65. 
Xiao, J., Evolutionary Planner/Navigator, Handbook of Evolutionary Computation, Oxford University Press, 1996. 66. Xiao, J., Michalewicz, Z, and Zhang, L., Operator Performance of Evolutionary Planner/Navigator, Proceedings of the 3rd IEEE ICEC, Nagoya, 20{22 May 1996. 67. Yap, C.-K., \Algorithmic Motion Planning", Advances in Robotics, Vol.1: Algorithmic and Geometric Aspects of Robotics, J.T. Schwartz and C.-K. Yap Ed., Lawrence Erlbaum Associates, 1987, pp. 95-143. 68. Zurada, J., Marks, R., and Robinson, C. (Editors), Computational Intelligence: Imitating Life, IEEE Press, 1994.

This article was processed using the LaTeX macro package with LLNCS style
