Journal Articles
1 article found
Dropout training for SVMs with data augmentation (cited by: 1)
Authors: Ning CHEN, Jun ZHU, Jianfei CHEN, Ting CHEN. Frontiers of Computer Science (SCIE, EI, CSCD), 2018, Issue 4, pp. 694-713 (20 pages).
Dropout and other feature noising schemes have shown promise in controlling over-fitting by artificially corrupting the training data. Though extensive studies have been performed for generalized linear models, little has been done for support vector machines (SVMs), one of the most successful approaches for supervised learning. This paper presents dropout training for both linear SVMs and the nonlinear extension with latent representation learning. For linear SVMs, to deal with the intractable expectation of the non-smooth hinge loss under corrupting distributions, we develop an iteratively re-weighted least square (IRLS) algorithm by exploring data augmentation techniques. Our algorithm iteratively minimizes the expectation of a re-weighted least square problem, where the re-weights are analytically updated. For nonlinear latent SVMs, we consider learning one layer of latent representations in SVMs and extend the data augmentation technique in conjunction with a first-order Taylor expansion to deal with the intractable expected hinge loss and the nonlinearity of latent representations. Finally, we apply similar data augmentation ideas to develop a new IRLS algorithm for the expected logistic loss under corrupting distributions, and we further develop a nonlinear extension of logistic regression by incorporating one layer of latent representations. Our algorithms offer insights on the connection and difference between the hinge loss and logistic loss in dropout training. Empirical results on several real datasets demonstrate the effectiveness of dropout training in significantly boosting the classification accuracy of both linear and nonlinear SVMs.
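The core quantity in the abstract is the expected hinge loss under a feature-corrupting (dropout) distribution. The paper makes this expectation tractable analytically via data augmentation and IRLS; the sketch below instead illustrates the underlying objective with a simple Monte Carlo estimate, which is a hedged stand-in, not the authors' algorithm. The function name `expected_hinge_loss_mc` and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_hinge_loss_mc(w, X, y, drop_rate=0.5, n_samples=100):
    """Monte Carlo estimate of E[hinge loss] under dropout corruption
    of the features. Illustrative only: the paper derives an analytic
    IRLS update via data augmentation instead of sampling."""
    total = 0.0
    # "Inverted dropout" scaling keeps the corrupted features unbiased:
    # E[mask * x / (1 - drop_rate)] = x.
    scale = 1.0 / (1.0 - drop_rate) if drop_rate < 1.0 else 0.0
    for _ in range(n_samples):
        mask = rng.random(X.shape) >= drop_rate  # keep each feature w.p. 1 - drop_rate
        Xc = X * mask * scale                    # one corrupted copy of the data
        margins = y * (Xc @ w)
        total += np.maximum(0.0, 1.0 - margins).mean()
    return total / n_samples

# Toy linearly separable data: plain hinge loss is zero at w = (1, 0),
# but dropout noise can zero out the informative feature, so the
# expected loss under corruption is strictly positive.
X = np.array([[2.0, 0.0], [-2.0, 0.0]])
y = np.array([1.0, -1.0])
w = np.array([1.0, 0.0])

loss = expected_hinge_loss_mc(w, X, y)
```

With `drop_rate=0.0` the estimate reduces to the ordinary hinge loss; the gap between the two is exactly the regularization effect that dropout training exploits.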
Keywords: dropout, SVMs, logistic regression, data augmentation, iteratively re-weighted least squares