
A regression approach to LS-SVM and sparse realization based on fast subset selection

Lookup NU author(s): Dr Wanqing Zhao (ORCiD)


Full text for this publication is not currently held within this repository. Alternative links are provided below where available.


The Least Squares Support Vector Machine (LS-SVM) is a modified SVM with a ridge regression cost function and equality constraints. It has been successfully applied to many classification problems. However, a common issue with the LS-SVM is that it lacks sparseness, which is a serious drawback in its applications. To tackle this problem, a fast approach is proposed in this paper for developing a sparse LS-SVM. First, a new regression solution is proposed for the LS-SVM which optimizes the same objective function as the conventional solution. Based on this, a new subset selection method is then adopted to realize the sparse approximation. Simulation results on several benchmark datasets (e.g. Checkerboard and two Gaussian datasets) show that the proposed solution can achieve a better objective value than the conventional LS-SVM, and that the proposed approach yields a sparser LS-SVM than the conventional one while providing comparable predictive classification accuracy. Additionally, the computational complexity is significantly decreased. © 2012 IEEE.
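For context, the conventional (non-sparse) LS-SVM classifier mentioned in the abstract reduces training to a single linear system over the KKT conditions, which is why every training point ends up as a support vector. The sketch below is a minimal illustration of that conventional formulation only (not the paper's proposed regression solution or subset-selection method, which are not detailed here); the function names, RBF kernel choice, and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between the rows of X1 and X2 (illustrative choice)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Conventional LS-SVM classifier: solve the KKT linear system for (alpha, b).

    y must be in {-1, +1}; gamma weights the ridge-regression error term.
    The system is  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    with Omega_ij = y_i * y_j * K(x_i, x_j).
    """
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha (one per training point -> no sparseness), bias b

def lssvm_predict(X_new, X, y, alpha, b, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    return np.sign(rbf_kernel(X_new, X, sigma) @ (alpha * y) + b)
```

Because the equality constraints make every alpha_i generally nonzero, the whole training set must be kept for prediction; sparse approaches such as the one proposed in this paper aim to retain only a small subset of these terms.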

Publication metadata

Author(s): Zhang J, Li K, Irwin GW, Zhao W

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: Proceedings of the 10th World Congress on Intelligent Control and Automation (WCICA)

Year of Conference: 2012

Pages: 612-617

Online publication date: 24/11/2012

Publisher: IEEE


DOI: 10.1109/WCICA.2012.6357952


ISBN: 9781467313988