THE USE OF PRECONDITIONING FOR TRAINING SUPPORT VECTOR MACHINES
Since the introduction of support vector machines (SVMs), much work has been done to make these machines more efficient at classification. In our work, we incorporated the preconditioned conjugate gradient method (PCG) into the adaptive constraint reduction method developed in 2007 [8, 9] to improve the efficiency of training an SVM with an interior-point method. As in [8, 9], we reduced the computational effort of assembling the matrix of the normal equations by excluding unnecessary constraints. By using PCG and refactoring the preconditioner only when necessary, we also reduced the time to solve the normal equations. We compared two methods for updating the preconditioner; both consider the two most recent diagonal matrices in the normal equations. The first method chooses the indices to be updated based on the difference between the corresponding diagonal elements, while the second chooses based on their ratio.
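The two selection rules above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the budget parameter `k`, and the use of the log of the ratio to measure multiplicative change are all assumptions.

```python
import numpy as np

def indices_by_difference(d_old, d_new, k):
    """Pick the k indices whose diagonal entries changed most
    in absolute difference between the two most recent IPM iterations."""
    return np.argsort(-np.abs(d_new - d_old))[:k]

def indices_by_ratio(d_old, d_new, k):
    """Pick the k indices whose diagonal entries changed most
    multiplicatively, i.e. whose ratio d_new/d_old is farthest from 1."""
    return np.argsort(-np.abs(np.log(d_new / d_old)))[:k]
```

The two rules can select different indices: an entry that moves from 100 to 110 has a larger absolute change, but an entry that moves from 1 to 5 has a larger relative change.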
Promising numerical results for dense matrix problems are reported.
A support vector machine (SVM) is a tool used to classify patterns. Several patterns with associated predetermined classification labels (positive or negative) are given as input. An SVM is trained by finding a hyperplane that separates the two classes of patterns. Once the training procedure is complete, the machine can be used to classify future patterns: a pattern is labeled as positive or negative depending on which side of the hyperplane it lies on. If there is a hyperplane that separates the positive patterns from the negative patterns, then a separating hyperplane can be found by solving a convex quadratic program (CQP).
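For a separable training set, the CQP in question is the standard hard-margin maximum-margin problem. A common statement (the notation $w$, $\gamma$ for the hyperplane parameters is an assumption here) is:

```latex
% Hard-margin SVM training problem: for patterns x_i with labels
% y_i = \pm 1, maximize the margin by minimizing \|w\| subject to
% every pattern lying on the correct side of the hyperplane.
\min_{w,\,\gamma} \quad \tfrac{1}{2}\,\|w\|_2^2
\qquad \text{subject to} \quad
y_i\,\bigl(w^{T} x_i - \gamma\bigr) \ge 1,
\quad i = 1, \dots, n.
```

The separating hyperplane is $\{x : w^T x = \gamma\}$, and the patterns whose constraints are active at the solution are the support vectors.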
Our work uses the interior-point method (IPM) with adaptive constraint reduction developed in [8, 9] to find the hyperplane. During the reduction process, patterns that are farthest from the separating hyperplane are eliminated, since they have no effect on training the SVM. We assume our problem can be modeled as a CQP and use primal-dual interior-point methods to find the best separating hyperplane. Our contribution is to use the preconditioned conjugate gradient method (PCG) to solve the normal equations at each step of the IPM. PCG solves a system of linear equations using a preconditioner matrix chosen so that convergence to the solution is rapid. The matrix of the normal equations is symmetric and positive definite; the normal equations determine the IPM step while involving fewer variables than the full system, which reduces the cost of each iteration. For PCG, we tested several preconditioners and found that using the Cholesky factorization of a previous normal-equations matrix as the preconditioner significantly reduced computational time. We also applied a low-rank update to the preconditioner to further improve efficiency.
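The idea of reusing a Cholesky factor from an earlier iteration as a PCG preconditioner can be sketched as follows. This is an illustrative sketch, not the paper's code: the PCG loop is the textbook algorithm, and the stopping tolerance and iteration cap are assumptions.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def pcg(A, b, M_factor, tol=1e-10, maxiter=200):
    """Preconditioned conjugate gradient for a symmetric positive
    definite matrix A. M_factor is a Cholesky factorization (from
    scipy.linalg.cho_factor) of the preconditioner, here the
    normal-equations matrix from an earlier IPM iteration."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = cho_solve(M_factor, r)    # apply preconditioner: M z = r
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = cho_solve(M_factor, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

Because successive IPM iterates change the normal-equations matrix only gradually, the old factorization remains a good approximation to the current matrix, so PCG converges in few iterations while the expensive Cholesky refactorization is deferred until the preconditioner degrades.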