
A Ranking Model Constructing Algorithm Based on Latent Variables

Cited by: 1
Abstract: The models obtained by existing ranking algorithms are derived entirely from training data. Because much of the information useful for building a model cannot be extracted from training data alone, the resulting models are often insufficiently accurate. To address this shortcoming, this paper proposes a ranking algorithm based on latent variables. The algorithm uses a structural support vector machine as the learning tool and introduces useful information beyond the training data into the algorithm's framework in the form of latent variables. On this basis, it defines an NDCG-oriented objective function. Because this objective function is non-convex and non-smooth, it is first approximated with the concave-convex procedure and then optimized with a proximal bundle method. Experimental results on benchmark datasets show that the model obtained by the proposed algorithm is more accurate than those produced by algorithms that rely on training data alone.
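The objective function described above is oriented toward NDCG (Normalized Discounted Cumulative Gain), the standard metric for graded-relevance ranking. As a reference point, the following is a minimal sketch of NDCG@k using the common exponential-gain formulation; it is not taken from the paper, and the relevance grades in the usage line are hypothetical.

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain of the top-k items: sum of (2^rel - 1) / log2(rank + 1)."""
    return sum((2 ** rel - 1) / math.log2(i + 2)
               for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(ranked_relevances, k):
    """NDCG@k: DCG of the predicted ranking normalized by the ideal (sorted) DCG."""
    ideal = dcg_at_k(sorted(ranked_relevances, reverse=True), k)
    return dcg_at_k(ranked_relevances, k) / ideal if ideal > 0 else 0.0

# Relevance grades of documents in predicted rank order (hypothetical data).
print(ndcg_at_k([3, 2, 3, 0, 1], k=5))
```

Because NDCG depends on the discrete rank positions induced by the model's scores, it is piecewise constant in the model parameters, which is why the paper must resort to a concave-convex approximation followed by bundle-method optimization rather than direct gradient descent.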
Source: Journal of East China University of Science and Technology (Natural Science Edition), CAS, CSCD, Peking University Core Journal, 2011, No. 6, pp. 739-744.
Funding: National Natural Science Foundation of China (61003131); Anhui Provincial Natural Science Foundation (11040606M141); Anhui Provincial Natural Science Foundation Youth Program (11040606Q07); Anhui Provincial Key Science and Technology Project (08010201002); Anhui University "211 Project" funding.
Keywords: ranking algorithm; latent variables; structural support vector machine; NDCG; concave-convex procedure; proximal bundle method
