TY - JOUR
T1 - Joint sparse regularization based Sparse Semi-Supervised Extreme Learning Machine (S3ELM) for classification
AU - Luo, Xiaozhuo
AU - Liu, F.
AU - Yang, Shuyuan
AU - Wang, Xiaodong
AU - Zhou, Zhiguo
N1 - Funding Information:
The authors would like to thank the editor and reviewers, whose comments helped improve this work. This research was supported in part by the National Basic Research Program (973 Program) of China (No. 2013CB329402), in part by the National Natural Science Foundation of China (Nos. 61173090, 61173092, 61303119, 61271302, 61272282, 61072108), in part by the National Research Foundation for the Doctoral Program of Higher Education of China (No. 20110203110006), in part by the Fund for Foreign Scholars in University Research and Teaching Programs (the 111 Project) (Grant No. B07048), in part by the Program for Cheung Kong Scholars and Innovative Research Team in University (No. IRT1170), and in part by the Fundamental Research Funds for the Central Universities (Grant Nos. GK201302007, 7214507501).
Publisher Copyright:
© 2014 Elsevier B.V. All rights reserved.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - Extreme Learning Machine (ELM) has received increasing attention for its simple principle, low computational cost and excellent performance. However, a large number of labeled instances are often required, and the number of hidden nodes must be tuned manually, for ELM to learn and generalize well. In this paper, we propose a Sparse Semi-Supervised Extreme Learning Machine (S3ELM) based on joint sparse regularization for classification, which automatically prunes the model structure via joint sparse regularization to achieve more accurate, efficient and robust classification when only a small number of labeled training samples are available. Unlike most greedy-algorithm-based model selection approaches, S3ELM uses the ℓ2,1-norm to cast a joint sparse constraint on the ELM training model and formulates a convex program. Moreover, with a Laplacian regularizer, S3ELM makes full use of the information in both labeled and unlabeled samples. Experiments on several benchmark datasets show that S3ELM is computationally attractive and outperforms its counterparts.
AB - Extreme Learning Machine (ELM) has received increasing attention for its simple principle, low computational cost and excellent performance. However, a large number of labeled instances are often required, and the number of hidden nodes must be tuned manually, for ELM to learn and generalize well. In this paper, we propose a Sparse Semi-Supervised Extreme Learning Machine (S3ELM) based on joint sparse regularization for classification, which automatically prunes the model structure via joint sparse regularization to achieve more accurate, efficient and robust classification when only a small number of labeled training samples are available. Unlike most greedy-algorithm-based model selection approaches, S3ELM uses the ℓ2,1-norm to cast a joint sparse constraint on the ELM training model and formulates a convex program. Moreover, with a Laplacian regularizer, S3ELM makes full use of the information in both labeled and unlabeled samples. Experiments on several benchmark datasets show that S3ELM is computationally attractive and outperforms its counterparts.
KW - Extreme learning machine
KW - Joint sparse regularization
KW - Laplacian
KW - Sparse semi-supervised learning
KW - ℓ2,1-Norm
UR - http://www.scopus.com/inward/record.url?scp=84915822678&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84915822678&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2014.09.014
DO - 10.1016/j.knosys.2014.09.014
M3 - Article
AN - SCOPUS:84915822678
SN - 0950-7051
VL - 73
SP - 149
EP - 160
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
ER -
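
The abstract above describes S3ELM only at a high level: an ELM whose output weights are made row-sparse through an ℓ2,1-norm joint sparse constraint, plus a Laplacian term that lets unlabeled samples inform the solution. The following Python/NumPy sketch shows one plausible formulation consistent with that description, not the paper's actual algorithm: the objective min_B ||J(HB - T)||_F^2 + lam_sparse * ||B||_{2,1} + lam_lap * tr(B^T H^T L H B) is an assumption, it is solved here by iteratively reweighted least squares, and every name (s3elm_fit, lam_sparse, lam_lap, the k-NN Gaussian graph) is hypothetical.

import numpy as np

def knn_graph_laplacian(X, sigma=1.0, k=10):
    """Unnormalized graph Laplacian L = D - W from a k-NN Gaussian affinity graph (assumed construction)."""
    n = X.shape[0]
    sq = np.sum(X ** 2, axis=1, keepdims=True)
    dist2 = np.maximum(sq + sq.T - 2.0 * X @ X.T, 0.0)
    W = np.exp(-dist2 / (2.0 * sigma ** 2))
    # keep only the k nearest neighbours of each point (column 0 is the point itself)
    nn = np.argsort(dist2, axis=1)[:, 1:k + 1]
    mask = np.zeros((n, n), dtype=bool)
    mask[np.repeat(np.arange(n), k), nn.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def s3elm_fit(X_lab, y_lab, X_unl, n_hidden=200, lam_sparse=1e-2, lam_lap=1e-3,
              n_iter=20, eps=1e-8, seed=0):
    """Hypothetical S3ELM-style training sketch; returns (W_in, b_in, B)."""
    rng = np.random.default_rng(seed)
    X = np.vstack([X_lab, X_unl])
    n_lab, n_all = X_lab.shape[0], X.shape[0]

    # standard ELM hidden layer: random input weights and biases, sigmoid activation
    W_in = rng.normal(size=(X.shape[1], n_hidden))
    b_in = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W_in + b_in)))

    # one-hot targets for labelled rows; unlabelled rows carry zero weight via J
    n_cls = int(y_lab.max()) + 1
    T = np.zeros((n_all, n_cls))
    T[np.arange(n_lab), y_lab] = 1.0
    J = np.zeros(n_all)
    J[:n_lab] = 1.0

    L = knn_graph_laplacian(X)
    HtJH = H.T @ (J[:, None] * H)
    HtJT = H.T @ (J[:, None] * T)
    HtLH = H.T @ L @ H

    # warm start with a small ridge term, then IRLS iterations for the l2,1 penalty
    B = np.linalg.solve(HtJH + lam_lap * HtLH + 1e-3 * np.eye(n_hidden), HtJT)
    for _ in range(n_iter):
        row_norms = np.sqrt(np.sum(B ** 2, axis=1)) + eps   # ||B_i||_2 per hidden node
        D = np.diag(1.0 / (2.0 * row_norms))                # IRLS surrogate for ||B||_{2,1}
        B = np.linalg.solve(HtJH + lam_sparse * D + lam_lap * HtLH, HtJT)
    return W_in, b_in, B

def s3elm_predict(X, W_in, b_in, B):
    H = 1.0 / (1.0 + np.exp(-(X @ W_in + b_in)))
    return np.argmax(H @ B, axis=1)

Under this assumed formulation, the ℓ2,1 row-sparsity drives entire rows of B toward zero, so the corresponding hidden nodes drop out of the model, which is consistent with the abstract's claim of automatic structure pruning; the Laplacian term encourages nearby labeled and unlabeled samples to receive similar outputs.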