This dissertation begins with a brief review of the theoretical foundations of Genetic Algorithms (GAs). We then introduce, for the first time, complex-valued encodings for GAs: each diploid is denoted by one complex number, with each pair of alleles corresponding to one complex number, and the genetic operators are redefined accordingly. The independent variables of the objective function are determined by the moduli and angles of their corresponding complex numbers. Compared with conventional GAs based on real-valued or binary encodings, this representation greatly expands the dimensionality available for encoding. Computer simulation results confirm its effectiveness. Furthermore, the algorithm can help ensure global optimization when training artificial neural networks with complex numbers.

Character recognition is an important field of application for GAs. Exploiting the excellent global sampling ability of genetic algorithms, we apply them to printed character recognition and handwritten numeral recognition. For printed character recognition, we propose a new, effective algorithm that reduces computational complexity while improving the recognition rate. Using two suitably chosen threshold values, the algorithm transforms real-valued template vectors into (0, 1, *)-valued ones. In addition, because the four factors describing the pertinence between a template and an unknown sample carry different degrees of importance, we assign each factor a corresponding weight coefficient. All of these threshold values and weight coefficients are then determined by a Genetic Algorithm. For handwritten numeral recognition, we use a simple multilayer cluster neural network with five independent subnetworks as the classifier, combining GAs effectively with the backpropagation algorithm.
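The complex-valued decoding described above can be sketched as follows. The summary does not fix the exact mapping from a gene's modulus and angle to a real decision variable, so the rule below (the modulus sets the position within the variable's range, and a negative angle mirrors it about the interval's midpoint) is an illustrative assumption, as are the names `decode_gene`, `decode_chromosome`, and `rho_max`.

```python
import cmath
import random

def decode_gene(z, lo, hi, rho_max):
    """Decode one complex-valued gene (a pair of alleles) into a real
    variable in [lo, hi]. The mapping used here -- modulus sets the
    magnitude, a negative angle mirrors the value about the interval's
    midpoint -- is an illustrative assumption, not the dissertation's rule."""
    rho = min(abs(z), rho_max)           # clamp the modulus
    x = lo + (hi - lo) * rho / rho_max   # modulus -> position in [lo, hi]
    if cmath.phase(z) < 0:               # the angle contributes via its sign
        x = lo + hi - x                  # mirror about the midpoint
    return x

def decode_chromosome(genes, lo, hi, rho_max):
    """A chromosome is a sequence of complex genes, one per variable."""
    return [decode_gene(z, lo, hi, rho_max) for z in genes]

def random_chromosome(n, rho_max):
    """Sample n genes uniformly in modulus and angle."""
    return [cmath.rect(random.uniform(0, rho_max),
                       random.uniform(-cmath.pi, cmath.pi))
            for _ in range(n)]
```

Crossover and mutation would then act on the real and imaginary parts (or on modulus and angle) of each gene, which is what gives the encoding its extra dimensions compared with a single real value per variable.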
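The (0, 1, *) template transformation and the weighted matching can be sketched as below. The abstract does not spell out the four pertinence factors, so the four counts used here (matching ones, matching zeros, mismatches, don't-cares) and the names `t_low`, `t_high`, and `w` are assumptions for illustration; in the dissertation's scheme, the two thresholds and the four weight coefficients are exactly the quantities the GA would decide.

```python
def quantize_template(v, t_low, t_high):
    """Map each real template component to '0', '1', or '*' (don't-care)
    using two thresholds t_low < t_high. The threshold values themselves
    would be chosen by the GA in the scheme described above."""
    return ['0' if x <= t_low else '1' if x >= t_high else '*' for x in v]

def match_score(template, sample_bits, w):
    """Weighted pertinence between a (0,1,*) template and a binary sample.
    The four factors counted here are an assumed example standing in for
    the dissertation's four factors; the weights w are GA-tuned."""
    ones  = sum(t == '1' and s == 1 for t, s in zip(template, sample_bits))
    zeros = sum(t == '0' and s == 0 for t, s in zip(template, sample_bits))
    miss  = sum(t in '01' and str(s) != t for t, s in zip(template, sample_bits))
    stars = template.count('*')
    return w[0] * ones + w[1] * zeros - w[2] * miss - w[3] * stars
```

An unknown sample would be assigned to the template with the highest score; the GA's fitness function would measure recognition rate over a labelled set while it searches the threshold and weight space.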
The GA determines the initial weights of the neural network, and backpropagation then continues the search until the global optimum is finally reached. In this way we avoid the problem of becoming trapped in local minima when training the neural network with backpropagation alone, while still taking advantage of its effective local search.
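This two-stage scheme, a GA for the initial weights followed by backpropagation refinement, can be sketched on a toy problem. Everything concrete below is an illustrative assumption: the 2-2-1 sigmoid network learning XOR stands in for the multilayer cluster network, and the population size, truncation selection, one-point crossover, and mutation rate stand in for the dissertation's actual GA settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the real classifier: a 2-2-1 sigmoid network on XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    """Split a flat 9-vector into the network's weights and biases."""
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:9]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

# Stage 1: the GA searches globally for good initial weights.
pop = rng.normal(0.0, 2.0, size=(40, 9))
for _ in range(30):
    order = np.argsort([mse(w) for w in pop])
    parents = pop[order[:10]]                       # truncation selection
    children = []
    for _ in range(30):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 9)                    # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0.0, 0.3, 9) * (rng.random(9) < 0.1)  # mutation
        children.append(child)
    pop = np.vstack([parents, children])
best = min(pop, key=mse)

# Stage 2: backpropagation refines the GA's best individual locally.
w = best.copy()
for _ in range(5000):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out) / len(X)    # dMSE/d(pre-activation)
    d_h = (d_out @ W2.T) * h * (1 - h)
    grad = np.concatenate([(X.T @ d_h).ravel(), d_h.sum(0),
                           (h.T @ d_out).ravel(), d_out.sum(0)])
    w -= 2.0 * grad                                 # plain gradient descent
```

The division of labour is the point: the GA's population covers the weight space broadly so backpropagation starts in a good basin, and gradient descent then does the fine local search the GA is poor at.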