Computing Reviews
Perturbation to enhance support vector machines for classification
To K., Lim C. Journal of Computational and Applied Mathematics 163(1): 233-239, 2004. Type: Article
Date Reviewed: May 19, 2004

Support vector machines (SVMs) are learning algorithms, frequently used for classification and regression tasks, that are based on linearly separating binary-labeled data. SVM techniques have been applied to a wide variety of pattern recognition problems, including isolated handwritten digit recognition, object recognition, speaker identification, face detection in images, and text categorization, as well as to regression estimation. Comparative studies have shown that the generalization performance of SVMs either matches, or is significantly better than, that of competing methods.

Briefly, the SVM algorithm determines which vectors, from a finite set of binary-labeled input patterns, support the hyperplane that gives the largest margin of separation between the two classes. The coefficients of the separating hyperplane are computed by solving a constrained quadratic programming problem.
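For context, the standard soft-margin formulation (not reproduced from the paper under review) determines the hyperplane coefficients by solving the quadratic program

\[
\min_{w,\,b,\,\xi}\ \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\xi_{i}
\quad\text{subject to}\quad
y_{i}\,(w^{\top}x_{i} + b) \ge 1 - \xi_{i},\qquad \xi_{i} \ge 0,\quad i = 1,\dots,n,
\]

where the support vectors are precisely the training points \(x_i\) whose margin constraints are active at the optimum.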

This paper introduces a perturbation method for obtaining a sensitivity measure for solutions trained, via SVM learning, from a set of input patterns. The authors seek to extend the SVM technique so that the quality of image features learned from samples drawn from a noisy background can be evaluated. The main contribution of the paper is a new method for altering the training data of the SVM algorithm in order to extract pixel-wise class information from a trained solution. The extraction of pixel features from a trained SVM problem is performed with the proposed inhibitory perturbation method, and, for image classification, two additional sensitivity measures are considered to provide a direct correspondence to each pixel.
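To make the general idea concrete, the following sketch shows one generic way a perturbation-based, pixel-wise sensitivity measure could be computed from a trained SVM. It is an illustrative assumption, not the authors' inhibitory perturbation method; the toy data, the eps step size, and the pixel_sensitivity helper are all hypothetical.

```python
# Generic sketch (not the paper's method): perturb one input dimension
# (pixel) at a time and record how much the trained SVM's decision
# function changes, giving a pixel-wise sensitivity map.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy data: 200 "images" of 8x8 = 64 pixels, with binary labels that
# depend only on the first 8 pixels.
X = rng.normal(size=(200, 64))
y = (X[:, :8].sum(axis=1) > 0).astype(int)

clf = SVC(kernel="linear").fit(X, y)

def pixel_sensitivity(clf, x, eps=0.1):
    """Absolute change in the decision value when each pixel is perturbed by eps."""
    base = clf.decision_function(x.reshape(1, -1))[0]
    sens = np.empty_like(x)
    for i in range(x.size):
        x_pert = x.copy()
        x_pert[i] += eps
        sens[i] = abs(clf.decision_function(x_pert.reshape(1, -1))[0] - base)
    return sens

sens_map = pixel_sensitivity(clf, X[0]).reshape(8, 8)
print(np.round(sens_map, 3))  # informative pixels should show larger values
```

In this toy setting, pixels that actually drive the label should receive larger sensitivity values; extracting that kind of pixel-wise information from a trained solution is what the authors' method is designed to do.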

Reviewer: L. State
Review #: CR129638 (0411-1380)
Categories:
Quadratic Programming Methods (G.1.6)
Classifier Design And Evaluation (I.5.2)
Parameter Learning (I.2.6)
Design Methodology (I.5.2)
Learning (I.2.6)
Other reviews under "Quadratic Programming Methods":
A method of trust region type for minimizing noisy functions. Elster C., Neumaier A. Computing 58(1): 31-46, 1997. Type: Article. Reviewed: Jun 1, 1998.
Minimizing quadratic functions subject to bound constraints with the rate of convergence and finite termination. Dostál Z., Schöberl J. Computational Optimization and Applications 30(1): 23-43, 2005. Type: Article. Reviewed: Aug 2, 2005.
Shape optimization with computational fluid dynamics. El-Sayed M., Sun T., Berry J. Advances in Engineering Software 36(9): 607-613, 2005. Type: Article. Reviewed: Jan 26, 2006.