Evolutionary Algorithms (EAs), such as Genetic Algorithms (GAs), Evolution Strategies (ESs), and Evolutionary Programming (EP), are a class of stochastic search and optimization methods inspired by the model of organic evolution. EAs differ from traditional search algorithms mainly in two respects: population-based search and information exchange between individuals. This dissertation studies the application of EAs to numerical optimization problems. Its main contributions are summarized as follows.

1. The frequently used crossover and mutation operators of real-coded GAs are discussed, and their search abilities are compared. Analysis and experiments show that the search ability of an operator may vary greatly at different evolutionary stages, and may also vary greatly when the operator is applied to different problems. When little is known about the problem, hybridizing several operators is a feasible way to extend the application range and improve the performance of EAs.

2. Because of their population-based search, GAs can be used to solve multimodal function optimization problems. The variation of the selection probability is studied when a fitness sharing scheme is added to the genetic algorithm for multimodal function optimization. Analysis and simulation results show that the genetic algorithm with fitness sharing can maintain population diversity to a certain extent, and that the range of the fitness function greatly influences the performance of the algorithm. A multi-level Genetic Algorithm and Tabu Search algorithm (MGA-TS) is proposed for multimodal function optimization: the main GA performs global search, while a microgenetic algorithm (MGA) exploits the neighborhood of the current solution provided by the main GA, and TS is introduced to prevent cyclic search of the same area.
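The fitness sharing scheme mentioned in item 2 can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: each individual's raw fitness is divided by its niche count, so that individuals crowded into the same region are penalized and the population spreads across several optima. The triangular sharing function and the default values of `sigma_share` and `alpha` are illustrative assumptions.

```python
import numpy as np

def shared_fitness(population, raw_fitness, sigma_share=0.5, alpha=1.0):
    """Apply fitness sharing to a real-coded population.

    sigma_share (niche radius) and alpha (sharing exponent) are
    illustrative defaults, not values from the dissertation.
    """
    pop = np.asarray(population, dtype=float)
    fit = np.asarray(raw_fitness, dtype=float)
    # Pairwise Euclidean distances between all individuals.
    dist = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
    # Triangular sharing function: sh(d) = 1 - (d/sigma)^alpha for d < sigma, else 0.
    sh = np.where(dist < sigma_share, 1.0 - (dist / sigma_share) ** alpha, 0.0)
    # Niche count of each individual (includes its self-contribution sh(0) = 1).
    niche_count = sh.sum(axis=1)
    # Crowded individuals get their fitness divided by a larger niche count.
    return fit / niche_count
```

With equal raw fitness, an individual sitting alone in its niche keeps its full fitness, while members of a dense cluster have theirs reduced, which is what maintains diversity under proportional selection.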
Simulation results on benchmark multimodal functions show that it is superior to the compared algorithms.

3. A new mutation operator for EP is proposed. If the offspring is better than its parent, an extension operation is performed in order to make full use of the good mutation direction; otherwise, a Gaussian or Cauchy perturbation is superimposed on the mutation vector according to the parent's performance. In this way, the fine-tuning search ability of the Gaussian mutation and the coarse-grained search ability of the Cauchy mutation are combined efficiently. Experimental results show that the improved algorithm outperforms classical EP on numerical benchmark problems.

4. The design of the architecture and weights of artificial neural networks is discussed. EP, Back Propagation (BP), and TS are hybridized to design feedforward neural networks. Experimental results show that, in comparison with BP, the algorithm produces compact neural networks with good generalization ability. And the algori
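The improved EP mutation of item 3 can be sketched as below. This is a simplified reading of the description above, not the dissertation's exact operator: the extension step reuses a successful mutation direction, and the 50/50 choice between Gaussian and Cauchy perturbations stands in for the performance-based rule the text mentions. All names and parameters are illustrative.

```python
import numpy as np

def ep_mutate(parent, last_direction, improved, step=0.1, rng=None):
    """One step of the improved EP mutation (illustrative sketch).

    parent         -- real-valued solution vector
    last_direction -- direction of the previous mutation, or None
    improved       -- whether the previous offspring beat its parent
    """
    if rng is None:
        rng = np.random.default_rng()
    parent = np.asarray(parent, dtype=float)
    if improved and last_direction is not None:
        # Extension operation: reuse the successful mutation direction.
        perturb = step * np.asarray(last_direction, dtype=float)
    elif rng.random() < 0.5:
        # Gaussian perturbation: light tails, fine-tuning local search.
        perturb = rng.normal(0.0, step, size=parent.shape)
    else:
        # Cauchy perturbation: heavy tails allow coarse, long-range jumps.
        perturb = step * rng.standard_cauchy(size=parent.shape)
    offspring = parent + perturb
    return offspring, perturb / step  # direction to reuse if this improves
```

Combining the two distributions in this way gives the operator both a local refinement mode and an escape mode, which is the efficiency claim made in the abstract.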