Found 3 articles
Supervised topic models with weighted words: multi-label document classification (Cited: 1)
Authors: Yue-peng ZOU, Ji-hong OUYANG, Xi-ming LI. Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2018, No. 4, pp. 513-523 (11 pages)
Supervised topic modeling algorithms have been successfully applied to multi-label document classification tasks. Representative models include labeled latent Dirichlet allocation (L-LDA) and dependency-LDA. However, these models neglect the class frequency information of words (i.e., the number of classes in which a word has occurred in the training data), which is significant for classification. To address this, we propose a method, namely the class frequency weight (CF-weight), to weight words by considering class frequency knowledge. This CF-weight is based on the intuition that a word with higher (lower) class frequency will be less (more) discriminative. In this study, the CF-weight is used to improve L-LDA and dependency-LDA. A number of experiments have been conducted on real-world multi-label datasets. Experimental results demonstrate that CF-weight based algorithms are competitive with existing supervised topic models.
Keywords: supervised topic model; multi-label classification; class frequency; labeled latent Dirichlet allocation (L-LDA); dependency-LDA
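The class frequency described in the abstract (the number of classes in which a word occurs in the training data) can be sketched as follows. The log-inverse weighting function is an illustrative assumption; the paper's exact CF-weight formula may differ, but the sketch preserves the stated intuition that a word spread over many classes receives a lower weight.

```python
from collections import defaultdict
import math

def class_frequency_weights(docs, labels, num_classes):
    """Compute a per-word weight from class frequency: words that occur
    in many classes are less discriminative and get lower weights.
    The log-inverse form below is an assumption for illustration."""
    classes_with_word = defaultdict(set)
    for doc, doc_labels in zip(docs, labels):
        for word in doc:
            for c in doc_labels:
                classes_with_word[word].add(c)
    # weight grows as the word is confined to fewer classes
    return {w: math.log(num_classes / len(cs)) + 1.0
            for w, cs in classes_with_word.items()}
```

Such weights could then scale each word's contribution to the topic-word counts inside the L-LDA or dependency-LDA sampler.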
Tuning the Learning Rate for Stochastic Variational Inference
Authors: Xi-Ming LI, Ji-Hong OUYANG. Journal of Computer Science & Technology (SCIE, EI, CSCD), 2016, No. 2, pp. 428-436 (9 pages)
Stochastic variational inference (SVI) can learn topic models with very big corpora. It optimizes the variational objective using the stochastic natural gradient algorithm with a decreasing learning rate. This rate is crucial for SVI; however, it is often tuned by hand in real applications. To address this, we develop a novel algorithm which tunes the learning rate of each iteration adaptively. The proposed algorithm uses the Kullback-Leibler (KL) divergence to measure the similarity between the variational distribution with the noisy update and that with the batch update, and then optimizes the learning rates by minimizing the KL divergence. We apply our algorithm to two representative topic models: latent Dirichlet allocation and the hierarchical Dirichlet process. Experimental results indicate that our algorithm performs better and converges faster than commonly used learning rates.
Keywords: stochastic variational inference; online learning; adaptive learning rate; topic model
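The adaptive step-size idea above can be sketched in simplified form. The SVI update interpolates the current variational parameter toward a noisy (minibatch) estimate; the paper picks the rate that brings the result closest to the batch update in KL divergence. In the sketch below, squared Euclidean distance stands in for the KL divergence, which gives a closed-form projection coefficient; the paper's actual objective and solution differ.

```python
import numpy as np

def adaptive_rate(lam, lam_noisy, lam_batch):
    """Choose the step size rho for the SVI update
        lam_new = (1 - rho) * lam + rho * lam_noisy
    by minimizing || lam_new - lam_batch ||^2, a simplified stand-in
    for the KL-divergence objective described in the abstract."""
    d = lam_noisy - lam
    den = float(np.dot(d, d))
    if den == 0.0:
        return 0.0                       # noisy update agrees with current state
    rho = float(np.dot(lam_batch - lam, d)) / den
    return float(np.clip(rho, 0.0, 1.0)) # keep the rate a valid interpolation
```

Intuitively, when the minibatch estimate overshoots the batch update, the optimal rate shrinks, damping gradient noise without a hand-tuned decay schedule.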
Curve length estimation based on cubic spline interpolation in gray-scale images
Authors: Zhen-xin WANG, Ji-hong OUYANG. Journal of Zhejiang University-Science C (Computers and Electronics) (SCIE, EI), 2013, No. 10, pp. 777-784 (8 pages)
This paper deals with a novel local arc length estimator for curves in gray-scale images. The method first estimates a cubic spline curve fit for the boundary points using the gray-level information of the nearby pixels, and then computes the sum of the spline segments' lengths. In this model, the second derivatives and y coordinates at the knots are required in the computation; the spline polynomial coefficients need not be computed explicitly. We provide the algorithm pseudo code for estimation and preprocessing, both taking linear time. Implementation shows that the proposed model yields a smaller relative error than other state-of-the-art methods.
Keywords: arc length estimation; cubic spline interpolation; gray-scale image; local algorithm
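The spline-based length computation can be sketched as follows: solve for the second derivatives at the knots (the quantities the abstract says the computation is built on) with natural boundary conditions, then sum per-segment lengths. Dense chord summation over each segment is used here as a simple numerical stand-in for the paper's length formula, and the dense linear solve is an illustrative shortcut for the usual linear-time tridiagonal solve.

```python
import numpy as np

def natural_spline_second_derivs(x, y):
    """Solve for the second derivatives M_i of the natural cubic spline
    through (x_i, y_i), with M_0 = M_{n-1} = 0. A dense solve is used
    for clarity; a tridiagonal solver would run in linear time."""
    n = len(x)
    h = np.diff(x)
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0            # natural boundary conditions
    for i in range(1, n - 1):
        A[i, i - 1] = h[i - 1]
        A[i, i] = 2.0 * (h[i - 1] + h[i])
        A[i, i + 1] = h[i]
        b[i] = 6.0 * ((y[i + 1] - y[i]) / h[i] - (y[i] - y[i - 1]) / h[i - 1])
    return np.linalg.solve(A, b)

def spline_arc_length(x, y, samples_per_segment=32):
    """Fit a natural cubic spline to (x, y) and estimate the curve
    length as the sum of segment lengths, each approximated by chord
    lengths over a dense resampling of the segment."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    M = natural_spline_second_derivs(x, y)
    total = 0.0
    for i in range(len(x) - 1):
        h = x[i + 1] - x[i]
        t = np.linspace(x[i], x[i + 1], samples_per_segment + 1)
        # spline segment expressed via knot values and second derivatives,
        # so the polynomial coefficients are never formed explicitly
        s = (M[i] * (x[i + 1] - t) ** 3 + M[i + 1] * (t - x[i]) ** 3) / (6 * h) \
            + (y[i] / h - M[i] * h / 6) * (x[i + 1] - t) \
            + (y[i + 1] / h - M[i + 1] * h / 6) * (t - x[i])
        total += float(np.sum(np.hypot(np.diff(t), np.diff(s))))
    return total
```

Note how the segment formula uses only the knot y-values and second derivatives, matching the abstract's remark that explicit polynomial coefficients are unnecessary.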