
Incentive mechanism of crowdsourcing multi-task assignment against malicious bidding
Abstract: The rapid development of crowdsourcing has enriched workers' experience and skills, making them more aware of tasks and more inclined to complete multiple tasks at the same time. Therefore, assigning tasks according to workers' subjective preferences has become a common approach to task assignment. However, out of self-interest, workers may engage in malicious bidding to obtain higher utility, which is detrimental to the development of crowdsourcing platforms. To this end, an incentive mechanism for crowdsourcing multi-task assignment against malicious bidding, named GIMSM (Greedy Incentive Mechanism for Single-Minded), was proposed. First, the mechanism defines a linear ratio as the allocation basis. Then, following a greedy strategy, it selects workers in increasing order of this ratio and assigns their tasks. Finally, the workers selected by the allocation algorithm are paid according to the payment function, yielding the final task assignment. Experiments were conducted on the Taxi and Limousine Commission Trip Record Data dataset. Compared with TODA (Truthful Online Double Auction mechanism), TCAM (Truthful Combinatorial Auction Mechanism), and the FU method, GIMSM improved the average quality level of task results by 25.20, 13.20, and 4.40 percentage points respectively under different numbers of workers, and by 26.17, 16.17, and 9.67 percentage points respectively under different numbers of tasks. In addition, GIMSM satisfies individual rationality and incentive compatibility, and obtains task assignment results in linear time. The experimental results show that GIMSM has good resistance to malicious bidding and performs better on crowdsourcing platforms with large amounts of data.
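The allocate-then-pay pattern the abstract describes can be sketched in code. The abstract does not give GIMSM's exact linear ratio or payment function, so this is a minimal sketch under classic single-minded greedy-auction assumptions: rank workers by their bid-to-task-count ratio, select greedily under a total task-slot budget, and pay each winner their critical value (the highest bid at which they would still be selected). The function names, the capacity constraint, and the payment rule are illustrative assumptions, not the paper's definitions.

```python
# Hypothetical sketch of a greedy allocation plus critical-value payment,
# in the spirit of the mechanism the abstract outlines. All specifics
# (ratio, capacity constraint, payment rule) are assumptions.

def greedy_allocate(workers, capacity):
    """workers: list of (worker_id, bid, task_count) single-minded bundles.
    capacity: total number of task slots available (assumed constraint)."""
    # Sort by the linear ratio bid / task_count, increasing (cheapest first).
    order = sorted(workers, key=lambda w: w[1] / w[2])
    winners, used = [], 0
    for wid, bid, size in order:
        if used + size <= capacity:
            winners.append((wid, bid, size))
            used += size
    return winners, order

def critical_payments(winners, order, capacity):
    """Pay each winner the critical value: the ratio of the first worker
    rejected when the winner is removed, scaled by the winner's size."""
    payments = {}
    for wid, bid, size in winners:
        used, critical = 0, None
        # Re-run the greedy pass without this worker.
        for oid, obid, osize in order:
            if oid == wid:
                continue
            if used + osize <= capacity:
                used += osize
            elif critical is None:
                critical = obid / osize  # first rejected competitor's ratio
        # If no competitor would be displaced, fall back to the bid.
        payments[wid] = size * critical if critical is not None else bid
    return payments
```

Paying the critical value rather than the reported bid is what removes the incentive to misreport: a worker who overbids risks losing the bundle, and underbidding does not change the payment. This is the standard route to the incentive compatibility the abstract claims.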
Authors: ZHANG Peiyao; FU Xiaodong (School of Information Engineering and Automation, Kunming University of Science and Technology, Kunming, Yunnan 650500, China)
Source: Journal of Computer Applications (《计算机应用》, CSCD, Peking University Core Journal), 2024, Issue 1: 261-268 (8 pages)
Funding: National Natural Science Foundation of China (61962030); Yunnan Provincial Science and Technology Talents and Platform Program (202005AC160036); Yunnan Provincial Academician (Expert) Workstation Project (202105AF150013).
Keywords: crowdsourcing; task assignment; incentive mechanism; auction; malicious bidding