Incentives for truthful reporting in crowdsourcing

Authors: Eric Horvitz, Ece Kamar

DOI: 10.5555/2343896.2343988

Keywords: Incentive, Task (project management), Computer science, Quality (business), Knowledge management, Work (electrical), Overhead (business), Crowdsourcing, Payment

Abstract: A challenge with the programmatic access of human talent via crowdsourcing platforms is the specification of incentives and the checking of the quality of contributions. Methodologies for checking quality include providing a payment if the work is approved by the task owner and hiring additional workers to evaluate contributors' work. Both of these approaches place a burden on the people and organizations commissioning tasks, and may be susceptible to manipulation by task owners. Moreover, neither workers nor the market may know a task owner well enough to trust the owner's evaluations of worker reports. Methods have been proposed for incentivizing workers without external evaluation, by providing rewards based on agreement with peer reports or with the final output of the system. These approaches are vulnerable to strategic manipulations by workers. Recent experiments on Mechanical Turk have demonstrated the negative influence of such manipulations on task owners and reward systems [3]. We address this central challenge by introducing incentive mechanisms that promote truthful reporting in crowdsourcing and discourage manipulation without adding significant overhead.
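For illustration, the peer-agreement payment rule that the abstract flags as manipulable can be sketched in a few lines. The sketch below is a toy under stated assumptions, not the paper's mechanism: the payment amounts, labels, and function names are invented for illustration. A worker receives a bonus exactly when her report matches that of a randomly drawn peer, so a workforce that coordinates on one default label collects the maximum bonus without doing the task.

```python
import random

# Assumed payment values for illustration only (not from the paper).
BASE_PAY = 0.05   # flat payment per task
BONUS = 0.10      # bonus for agreeing with a randomly chosen peer

def agreement_payment(reports: list[str], worker: int, rng: random.Random) -> float:
    """Pay `worker` the bonus iff her report matches a random peer's report."""
    peers = [i for i in range(len(reports)) if i != worker]
    peer = rng.choice(peers)
    return BASE_PAY + (BONUS if reports[worker] == reports[peer] else 0.0)

if __name__ == "__main__":
    rng = random.Random(0)
    # Truthful but noisy reports on a binary labeling task...
    truthful = ["yes", "no", "yes", "yes"]
    # ...versus strategic workers who all report the same default label.
    strategic = ["yes", "yes", "yes", "yes"]
    for name, reports in [("truthful", truthful), ("strategic", strategic)]:
        total = sum(agreement_payment(reports, w, rng) for w in range(len(reports)))
        print(f"{name}: total payout = ${total:.2f}")
```

Running the script, the coordinated reporters collect the bonus on every comparison while the truthful, noisy reporters sometimes miss it; this is the kind of strategic manipulation the proposed incentive mechanisms are designed to discourage.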

References (4)
Eric Horvitz, Severin Hacker, Ece Kamar. Combining human and machine intelligence in large-scale crowdsourcing. Adaptive Agents and Multi-Agent Systems (AAMAS), pp. 467-474 (2012). DOI: 10.5555/2343576.2343643
Nolan Miller, Paul Resnick, Richard Zeckhauser. Eliciting Informative Feedback: The Peer-Prediction Method. Management Science, vol. 51, pp. 1359-1373 (2005). DOI: 10.1287/MNSC.1050.0379
Aniket Kittur, Ed H. Chi, Bongwon Suh. Crowdsourcing user studies with Mechanical Turk. Proceedings of the Twenty-Sixth Annual CHI Conference on Human Factors in Computing Systems (CHI '08), pp. 453-456 (2008). DOI: 10.1145/1357054.1357127