DOI: 10.5555/2820116.2820121

CrowdIntent: annotation of intentions hidden in online discussions

Published: 16 May 2015

Abstract

Stakeholders in open-source software development use social media, email, and any other available Internet channels to communicate and to express, in text, what they want or need. Recognizing such needs and desires (which we call intentions) is usually done by a human reader, and it can require considerable effort as the number of messages in an online discussion grows. The problem is that automated recognition of the intentions hidden in text requires training data from the software development domain, and so far no data annotated with intentions is available for mining. To tackle this lack of data, we collected online discussions in the software development domain and asked people to annotate them with intentions, crowdsourcing the task of annotating each sentence with its hidden intention. In this paper we report our experience of carrying out a crowdsourcing project with a heterogeneous crowd, and we discuss how we applied the steps of the crowdsourcing workflow in CrowdIntent. Lessons learned and future work are also presented.
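The abstract describes collecting sentence-level intention labels from a heterogeneous crowd; a standard step when aggregating such annotations is to measure inter-annotator agreement. Below is a minimal, self-contained sketch of Fleiss' kappa, one of the classical agreement measures for multiple raters. The vote counts and the intention labels in the example are purely hypothetical and illustrative, not data from the paper.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for counts[i][j] = number of annotators who assigned
    item i to category j. Every row must sum to the same number of raters."""
    N = len(counts)                       # number of annotated sentences
    n = sum(counts[0])                    # annotators per sentence
    k = len(counts[0])                    # number of intention categories
    # Per-item observed agreement
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N
    # Chance agreement from the marginal category proportions
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical votes: 4 sentences, 3 annotators, 3 intention labels
# (e.g. "request", "opinion", "question" -- illustrative labels only).
votes = [
    [3, 0, 0],   # unanimous
    [2, 1, 0],
    [0, 3, 0],
    [1, 1, 1],   # full disagreement
]
print(round(fleiss_kappa(votes), 3))  # → 0.268
```

A kappa near 0 indicates agreement no better than chance, while values near 1 indicate near-perfect agreement; the Landis and Koch scale is often used to interpret the intermediate range.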


Published In

CSI-SE '15: Proceedings of the Second International Workshop on CrowdSourcing in Software Engineering
May 2015
57 pages

Publisher

IEEE Press


Qualifiers

  • Research-article

Conference

ICSE '15
