ETRI Knowledge Sharing Platform


Details

Journal Article
Fast Training of Structured SVM Using Fixed-Threshold Sequential Minimal Optimization
Cited 23 times in Scopus
Authors
이창기 (Changki Lee), 장명길 (Myung-Gil Jang)
Publication Date
April 2009
Source
ETRI Journal, v.31 no.2, pp.121-128
ISSN
1225-6463
Publisher
Electronics and Telecommunications Research Institute (ETRI)
DOI
https://dx.doi.org/10.4218/etrij.09.0108.0276
Funded Project
08MS3700, Web QA Technology Development, 장명길 (Myung-Gil Jang)
Abstract
In this paper, we describe a fixed-threshold sequential minimal optimization (FSMO) for structured SVM problems. FSMO is conceptually simple, easy to implement, and faster than the standard support vector machine (SVM) training algorithms for structured SVM problems. Because FSMO uses the fact that the formulation of structured SVM has no bias (that is, the threshold b is fixed at zero), FSMO breaks down the quadratic programming (QP) problems of structured SVM into a series of smallest QP problems, each involving only one variable. By involving only one variable, FSMO is advantageous in that each QP sub-problem does not need subset selection. For the various test sets, FSMO is as accurate as an existing structured SVM implementation (SVM-Struct) but is much faster on large data sets. The training time of FSMO empirically scales between O(n) and O(n^1.2), while SVM-Struct scales between O(n^1.5) and O(n^1.8).
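The key idea in the abstract is that with the threshold fixed at b = 0, each dual sub-problem involves a single variable and admits a closed-form clipped Newton update, so no working-set (subset) selection is required. The minimal sketch below illustrates that one-variable update on an ordinary binary SVM with a linear kernel; it is an illustration of the fixed-threshold principle, not the authors' structured-SVM implementation (the function name `fsmo_train` and the toy data are assumptions for this example).

```python
import numpy as np

def fsmo_train(X, y, C=1.0, epochs=50):
    """Fixed-threshold SMO sketch (illustrative, not the paper's code).

    Because there is no bias term (b fixed at 0), each QP sub-problem
    involves a single alpha_i and has a closed-form clipped update,
    so no working-set (subset) selection is needed.
    """
    n = X.shape[0]
    K = X @ X.T                      # linear kernel matrix
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            # decision value f(x_i) = sum_j alpha_j y_j K(x_j, x_i), with b = 0
            f_i = np.sum(alpha * y * K[:, i])
            # one-variable Newton step on the dual, clipped to the box [0, C]
            alpha[i] = np.clip(alpha[i] + (1.0 - y[i] * f_i) / K[i, i], 0.0, C)
    w = (alpha * y) @ X              # recover primal weights (linear kernel only)
    return alpha, w

# toy linearly separable data; the separator passes through the origin since b = 0
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, w = fsmo_train(X, y)
preds = np.sign(X @ w)
```

Each pass touches one alpha at a time, which is why no pair selection heuristic (as in standard SMO with a bias term) is needed; the structured-SVM case in the paper applies the same fixed-threshold update within a cutting-plane-style outer loop over structured outputs.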
Keywords
Fixed-threshold sequential minimal optimization, Structured SVM, Support vector machines
KSP-Suggested Keywords
Fast training, Large data sets, Subset selection, Support Vector Machine (SVM), Training algorithms, Training time, Quadratic programming, Sequential minimal optimization, Structured SVM