Measuring the Effectiveness of Gamesourcing Expert Oil Painting Annotations
Recommendations
Presence of an Ecosystem: a Catalyst in the Knowledge Building Process in Crowdsourced Annotation Environments
ASONAM '15: Proceedings of the 2015 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining
The phenomenal success of certain crowdsourced online platforms, such as Wikipedia, is accredited to their ability to tap the crowd's potential to collaboratively build knowledge. While it is well known that the crowd's collective wisdom surpasses the ...
Almost an Expert: The Effects of Rubrics and Expertise on Perceived Value of Crowdsourced Design Critiques
CSCW '16: Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing
Expert feedback is valuable but hard to obtain for many designers. Online crowds can provide fast and affordable feedback, but workers may lack relevant domain knowledge and experience. Can expert rubrics address this issue and help novices provide ...
Argument Mining in Tweets: Comparing Crowd and Expert Annotations for Automated Claim and Evidence Detection
Natural Language Processing and Information Systems
One of the main challenges in the development of argument mining tools is the availability of annotated data of adequate size and quality. However, generating data sets using experts is expensive from both organizational and financial perspectives, ...
Published In
Publisher
Springer-Verlag
Berlin, Heidelberg
Qualifiers
- Article