ABSTRACT
Context: The success of crowdsourced software development (CSD) depends on a large crowd of trustworthy software workers who register for and submit solutions to tasks of interest in exchange for financial rewards. A preliminary analysis of software worker behavior reveals an alarming task-quitting rate of 82.9%.
Goal: The objective of this study is to empirically investigate worker decision factors and provide better decision support in order to improve the success and efficiency of CSD.
Method: We propose a novel problem formulation, DCW-DS, and an analytics-based decision support methodology to guide workers in deciding whether to accept offered development tasks. DCW-DS is evaluated using more than one year of real-world data from TopCoder, the leading CSD platform.
Results: Applying Random Forest-based machine learning with dynamic updates, we can predict whether a worker is a likely quitter with 99% average precision and 99% average recall. Similarly, we achieve 78% average precision and 88% average recall for the winner class. For workers who simply follow the top three task recommendations, we show that the average quitting rate falls below 6%.
Conclusions: Overall, the proposed method can be used to improve the task success rate as well as reduce the quitting rate of performed tasks.
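The prediction approach described in the Results can be illustrated with a minimal sketch. This is not the paper's actual pipeline: the real model is trained on TopCoder worker and task features with dynamic (incremental) retraining, whereas the example below uses synthetic, imbalanced data purely to show how a Random Forest classifier's precision and recall for a "quitter" class would be measured with scikit-learn.

```python
# Hypothetical sketch: synthetic stand-in for worker/task features and
# quitter labels; the paper's real features are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Imbalanced labels mimic the observed skew toward quitting (class 1).
X, y = make_classification(n_samples=1000, n_features=10,
                           weights=[0.2, 0.8], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_tr, y_tr)          # in the paper, this would be retrained
pred = clf.predict(X_te)     # periodically as new task data arrives

precision = precision_score(y_te, pred)  # of predicted quitters, how many quit
recall = recall_score(y_te, pred)        # of actual quitters, how many caught
print(f"precision={precision:.2f} recall={recall:.2f}")
```

The "dynamic updates" mentioned in the Results would correspond to refitting the model on a rolling window of recent registrations rather than a single static training set.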