ETRI Knowledge Sharing Platform
Fast Training of Structured SVM Using Fixed-Threshold Sequential Minimal Optimization
Cited 24 times in Scopus · Downloaded 2 times
Authors
Chang Ki Lee, Myung Gil Jang
Issue Date
2009-04
Citation
ETRI Journal, v.31, no.2, pp.121-128
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
Language
English
Type
Journal Article
DOI
https://dx.doi.org/10.4218/etrij.09.0108.0276
Abstract
In this paper, we describe a fixed-threshold sequential minimal optimization (FSMO) algorithm for structured SVM problems. FSMO is conceptually simple, easy to implement, and faster than standard support vector machine (SVM) training algorithms for structured SVM problems. Because the structured SVM formulation has no bias (that is, the threshold b is fixed at zero), FSMO breaks the quadratic programming (QP) problem of structured SVM into a series of smallest possible QP problems, each involving only one variable. Since each QP sub-problem involves only a single variable, no subset selection is needed. On various test sets, FSMO is as accurate as an existing structured SVM implementation (SVM-Struct) but is much faster on large data sets. The training time of FSMO empirically scales between O(n) and O(n^1.2), while SVM-Struct scales between O(n^1.5) and O(n^1.8).
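
To make the one-variable sub-problem idea concrete, the sketch below shows SMO-style coordinate updates for a plain binary kernel SVM trained without a bias term, which is the simplification FSMO exploits. This is only an illustrative analogue, not the structured-SVM training code from the paper; the function name fixed_threshold_smo and the parameters C, tol, and max_passes are assumptions introduced for this example.

import numpy as np

# Illustrative sketch: SMO-style one-variable updates for a binary kernel SVM
# with the threshold b fixed at zero. Without the bias term there is no
# equality constraint sum(alpha_i * y_i) = 0, so each dual variable can be
# optimized analytically on its own, which is the property FSMO exploits.
def fixed_threshold_smo(K, y, C=1.0, tol=1e-4, max_passes=50):
    """K: precomputed (n x n) kernel matrix; y: labels in {-1, +1}."""
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(max_passes):
        changed = 0
        for i in range(n):
            # Decision value f(x_i) with b = 0.
            f_i = np.dot(alpha * y, K[:, i])
            # Update only if the box-constraint KKT conditions are violated.
            if (y[i] * f_i < 1.0 - tol and alpha[i] < C) or \
               (y[i] * f_i > 1.0 + tol and alpha[i] > 0):
                # Analytic solution of the one-variable QP sub-problem,
                # clipped to the feasible interval [0, C].
                new_alpha = alpha[i] + (1.0 - y[i] * f_i) / K[i, i]
                new_alpha = min(C, max(0.0, new_alpha))
                if new_alpha != alpha[i]:
                    alpha[i] = new_alpha
                    changed += 1
        if changed == 0:  # every variable satisfies the KKT conditions
            break
    return alpha

The sketch only conveys why fixing b at zero removes the need for paired updates and subset selection; the paper applies the same fixed-threshold idea to the dual variables of the structured SVM formulation.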
KSP Keywords
Fast training, Large data sets, Subset selection, Support Vector Machine (SVM), Training algorithms, Training time, breaks down, quadratic programming, sequential minimal optimization, structured SVM