WCSE 2019 SUMMER ISBN:978-981-14-1684-2
DOI:10.18178/wcse.2019.06.011

Lost-Min Voting Strategies for Speeding up Multi-SVMs

Shinq-Jen Wu, Van-Hung Pham

Abstract— Support vector machines (SVMs) achieve good accuracy in big-data classification, but the computational cost of both the training and testing stages is a critical issue. The authors recently proposed a two-phase sequential minimal optimization that largely reduces the training cost (tested on datasets of 3,186 to 70,000 samples). The authors now focus on speeding up the testing stage of SVMs for multi-class classification. A lost-min strategy is proposed to accelerate the voting algorithm used in multi-SVMs: the number of binary classifiers evaluated is reduced from the order of n² to nearly n-1. The proposed lost-min voting strategy was tested on the DNA dataset (bioinformatics), the USPS dataset (handwritten digits), the Letter dataset (English alphabet) and the Satimage dataset (satellite imagery of Earth). For all of these datasets the time complexity approaches that of an (n-1)-classifier scheme while the accuracy is maintained.
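The minimal sketch below (not taken from the paper) illustrates how an elimination-style pairwise vote can cut the number of binary classifiers consulted from n(n-1)/2 down to n-1: each comparison discards the losing class, so one survivor remains after n-1 decisions. The function name eliminate_vote, the binary_clf mapping and its scikit-learn-like predict() interface are assumptions for illustration only; the exact lost-min rule is defined in the full paper.

    # Hypothetical sketch: elimination-style voting over one-vs-one SVMs.
    # binary_clf[(i, j)] (with i < j) is assumed to be a trained binary
    # classifier whose predict() returns either class i or class j.
    def eliminate_vote(sample, classes, binary_clf):
        # Keep one surviving candidate; every pairwise decision removes
        # the loser, so only len(classes) - 1 binary classifiers are
        # evaluated instead of all n(n-1)/2 of them.
        candidates = list(classes)
        winner = candidates.pop()
        while candidates:
            challenger = candidates.pop()
            pair = (min(winner, challenger), max(winner, challenger))
            winner = binary_clf[pair].predict([sample])[0]
        return winner

For a 10-class problem such as USPS digits, a call like eliminate_vote(x, range(10), binary_clf) consults only 9 of the 45 pairwise classifiers for each test sample.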

Index Terms— Support vector machines, multi-class classification, big data analysis, computational biology.

Shinq-Jen Wu
Da-Yeh University, Department of Electrical Engineering, TAIWAN.
Van-Hung Pham
Institute of Information Technology, Vietnam Academy of Science and Technology, VIETNAM



Cite: Shinq-Jen Wu, Van-Hung Pham, "Lost-Min Voting Strategies for Speeding up Multi-SVMs," Proceedings of the 2019 9th International Workshop on Computer Science and Engineering, pp. 65-71, Hong Kong, 15-17 June, 2019.