DOI: 10.1145/3318464.3389719

Recommending Deployment Strategies for Collaborative Tasks

Published: 31 May 2020

Abstract

Our work aids requesters in deploying collaborative tasks in crowdsourcing. We initiate the study of recommending to requesters deployment strategies for collaborative tasks that are consistent with the deployment parameters they desire: a lower bound on the quality of the crowd contribution, an upper bound on the latency of task completion, and an upper bound on the cost incurred by paying workers. A deployment strategy is a choice of value for three dimensions: Structure (whether to solicit the workforce sequentially or simultaneously), Organization (whether to organize it collaboratively or independently), and Style (whether to rely solely on the crowd or to combine it with machine algorithms). We propose StratRec, an optimization-driven middle layer that recommends deployment strategies and alternative deployment parameters to requesters by accounting for worker availability. Our solutions are grounded in discrete optimization and computational geometry techniques that produce results with theoretical guarantees. We present extensive experiments on Amazon Mechanical Turk, and conduct synthetic experiments to validate the qualitative and scalability aspects of StratRec.
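The abstract frames a deployment strategy as a choice along three dimensions (Structure, Organization, Style) that must satisfy a requester's quality, latency, and cost bounds. The Python sketch below only illustrates that framing as data types plus a naive feasibility filter; all class and field names, the numeric estimates, and the quality-first ranking are assumptions made for illustration here, not StratRec's actual optimization or computational-geometry machinery.

from dataclasses import dataclass
from enum import Enum
from typing import List


class Structure(Enum):
    SEQUENTIAL = "sequential"        # solicit the workforce one batch after another
    SIMULTANEOUS = "simultaneous"    # solicit all workers at once


class Organization(Enum):
    COLLABORATIVE = "collaborative"  # workers build on each other's contributions
    INDEPENDENT = "independent"      # workers contribute in isolation


class Style(Enum):
    CROWD_ONLY = "crowd-only"            # rely solely on the crowd
    CROWD_AND_MACHINE = "crowd+machine"  # combine the crowd with machine algorithms


@dataclass
class DeploymentStrategy:
    structure: Structure
    organization: Organization
    style: Style
    est_quality: float   # hypothetical estimate of crowd-contribution quality (0..1)
    est_latency: float   # hypothetical estimate of completion time, e.g. in hours
    est_cost: float      # hypothetical estimate of worker payments, e.g. in USD


@dataclass
class DeploymentParameters:
    min_quality: float   # lower bound on quality
    max_latency: float   # upper bound on latency
    max_cost: float      # upper bound on cost


def satisfies(strategy: DeploymentStrategy, params: DeploymentParameters) -> bool:
    # A strategy is acceptable only if it meets all three requester bounds.
    return (strategy.est_quality >= params.min_quality
            and strategy.est_latency <= params.max_latency
            and strategy.est_cost <= params.max_cost)


def recommend(strategies: List[DeploymentStrategy],
              params: DeploymentParameters) -> List[DeploymentStrategy]:
    # Naive stand-in for a recommender: keep feasible strategies, best estimated quality first.
    feasible = [s for s in strategies if satisfies(s, params)]
    return sorted(feasible, key=lambda s: s.est_quality, reverse=True)

For example, a requester asking for quality of at least 0.8, latency of at most 24 hours, and cost of at most 50 dollars would only be recommended strategies whose estimates meet all three bounds; StratRec additionally accounts for worker availability and can propose alternative parameters when no strategy qualifies, which this sketch does not model.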

Supplementary Material

MP4 File (3318464.3389719.mp4)
Presentation Video

Cited By

  • (2021) Data Management to Social Science and Back in the Future of Work. Proceedings of the 2021 International Conference on Management of Data, 2876-2877. DOI: 10.1145/3448016.3457536. Online publication date: 9-Jun-2021.
  • (2021) The Reaches of Crowdsourcing: A Systematic Literature Review. HCI International 2021 - Late Breaking Papers: Design and User Experience, 229-248. DOI: 10.1007/978-3-030-90238-4_17. Online publication date: 24-Jul-2021.
  • (2021) Cost and Quality in Crowdsourcing Workflows. Application and Theory of Petri Nets and Concurrency, 33-54. DOI: 10.1007/978-3-030-76983-3_3. Online publication date: 23-Jun-2021.

Information

Published In

SIGMOD '20: Proceedings of the 2020 ACM SIGMOD International Conference on Management of Data
June 2020
2925 pages
ISBN: 9781450367356
DOI: 10.1145/3318464
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 31 May 2020

Author Tags

  1. approximation algorithm
  2. computational geometry
  3. crowdsourcing
  4. deployment strategy recommendation

Qualifiers

  • Research-article

Conference

SIGMOD/PODS '20

Acceptance Rates

Overall acceptance rate: 785 of 4,003 submissions (20%)

Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 15
  • Downloads (last 6 weeks): 2
Reflects downloads up to 30 Aug 2024
