Because a convex quadratic program attains its global optimum, the study of convex quadratic programming is of great significance to quadratic programming in general. In this thesis, we discuss and study the support vector machine (SVM) and the minimum enclosing ball (MEB) within the framework of convex quadratic programming. Starting from the geometric interpretation of the SVM, many researchers have explored the margin hyperplane under different norms, and have claimed that the margin hyperplane obtained under the L1 norm requires less training time and is more robust to outliers than the one obtained under the conventional L2 norm.

The main work and contributions of this dissertation are as follows. 1) We present the dual equivalence of the MEB and the SVM under the L1 and L∞ norms, and point out that the dual problems of the margin-hyperplane optimization problems for the two-class soft-margin L1-MEB and the two-class hard-margin MEB under the L1 (L∞) norm are special cases of the dual problems for the two-class soft-margin L1-SVM and the two-class hard-margin SVM under the L1 (L∞) norm, respectively; this provides the theoretical foundation for the subsequent study. 2) We solve the MEB problem under the L1 and L∞ norms by linear programming, obtaining a training process that requires less time and is more robust; discussing the MEB under different norms is therefore of practical interest. 3) We propose an improved MEB algorithm based on an optimization toolbox, which adds the point farthest from the current center to the current core-set as a core vector. At the same time, we shrink the current core-set by a theoretically guaranteed dropping principle, which keeps the core-set as small as possible and thus reduces the computational cost of subsequent iterations.
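To illustrate contribution 2), the MEB under the L∞ norm can indeed be posed as a linear program: minimize the radius r subject to |x_i[j] - c[j]| ≤ r for every point x_i and coordinate j, which is linear in the center c and r. The following sketch (not the thesis's implementation; the solver choice and variable names are our own, assuming SciPy's `linprog`) shows one such formulation:

```python
import numpy as np
from scipy.optimize import linprog

def linf_meb(points):
    """Minimum enclosing ball in the L-infinity norm via linear programming.

    Decision variables: the center c (d values) followed by the radius r.
    Objective: minimize r.
    For each point x and coordinate j, |x[j] - c[j]| <= r becomes:
        -c[j] - r <= -x[j]   and   c[j] - r <= x[j]
    """
    points = np.asarray(points, dtype=float)
    n, d = points.shape
    cost = np.zeros(d + 1)
    cost[-1] = 1.0                          # minimize r only

    A, b = [], []
    for x in points:
        for j in range(d):
            row = np.zeros(d + 1)
            row[j], row[-1] = -1.0, -1.0    # -c[j] - r <= -x[j]
            A.append(row)
            b.append(-x[j])
            row = np.zeros(d + 1)
            row[j], row[-1] = 1.0, -1.0     #  c[j] - r <=  x[j]
            A.append(row)
            b.append(x[j])

    res = linprog(cost, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(None, None)] * d + [(0, None)])
    return res.x[:d], res.x[-1]             # center, radius
```

Because every constraint is linear, a standard LP solver suffices here, in contrast to the quadratically constrained program that arises under the L2 norm.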
We also provide a theoretical analysis of the space and time complexity of the proposed algorithm, and compare it with other MEB algorithms experimentally. 4) We propose an improved MEB algorithm that does not rely on an optimization toolbox: by relaxing the requirement on the kernel function, we obtain a simpler MEB algorithm in which the radius is expanded incrementally, so the proposed algorithm applies to a wider range of kernel methods. We also provide a proof of convergence and an analysis of the space and time complexity of the proposed algorithm, and conclude from experimental comparisons with other MEB algorithms that it is better suited to large data sets while achieving comparable accuracy. In short, this thesis makes a fruitful exploration of both the connection between the SVM and MEB problems and the improvement of MEB algorithms.
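The farthest-point core-set idea underlying contribution 3) can be sketched with the classical Badoiu–Clarkson iteration for the Euclidean MEB: repeatedly pull the center toward the point currently farthest from it, adding that point to the core-set. This is only the standard baseline iteration, not the thesis's improved algorithm; in particular, the thesis's dropping principle for shrinking the core-set is not reproduced here.

```python
import numpy as np

def coreset_meb(points, eps=1e-3):
    """Farthest-point core-set sketch for the Euclidean minimum enclosing ball
    (Badoiu-Clarkson style): about 1/eps**2 iterations give a
    (1 + eps)-approximate ball.
    """
    points = np.asarray(points, dtype=float)
    center = points[0].copy()
    core_set = [0]
    for k in range(1, int(np.ceil(1.0 / eps ** 2)) + 1):
        dists = np.linalg.norm(points - center, axis=1)
        far = int(np.argmax(dists))         # farthest point = new core vector
        if far not in core_set:
            core_set.append(far)
        center += (points[far] - center) / (k + 1)  # shrinking step toward it
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius, core_set
```

Each iteration touches every point once, so the cost per iteration is linear in the data size, which is why keeping the core-set small, as the thesis's dropping principle does, matters for the later, more expensive sub-problems solved on the core-set.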