CrowdTruth is a framework for machine-human computation that harnesses disagreement in gathering annotated data. The slides come from our talk at DIR2015.
4. • Annotator disagreement is signal, not noise.
• It is indicative of the variation in human semantic interpretation of signs.
• It can indicate ambiguity, vagueness, similarity, over-generality, etc., as well as quality.
http://CrowdTruth.org
15. • methodology
  • disagreement-aware crowdsourcing to collect gold standard data
  • metrics to capture disagreement
• software
  • online platform for crowdsourcing task workflows and data analytics
• ground truth collection
  • medical relation extraction
  • salience in news and tweets
  • sound annotation
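To make the "metrics to capture disagreement" idea concrete, here is a minimal sketch (not the official CrowdTruth implementation) of one common pattern: each worker's annotations on a unit are encoded as a vector over the answer options, and a worker's agreement score is the cosine similarity between their vector and the aggregate of all other workers' vectors. The function name and data layout are illustrative assumptions.

```python
import math

def cosine(u, v):
    # Standard cosine similarity; 0.0 when either vector is all zeros.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def worker_unit_agreement(worker_vecs):
    """For each worker, compute cosine similarity between their
    annotation vector and the sum of all other workers' vectors
    on the same unit. Low scores flag workers who disagree with
    the crowd on this unit."""
    scores = {}
    for w, vec in worker_vecs.items():
        rest = [sum(other[i] for ow, other in worker_vecs.items() if ow != w)
                for i in range(len(vec))]
        scores[w] = cosine(vec, rest)
    return scores
```

For example, with three workers choosing among three relation labels, `worker_unit_agreement({"w1": [1, 0, 0], "w2": [1, 0, 0], "w3": [0, 1, 0]})` scores w1 and w2 higher than w3, who picked a label no one else chose. Note that a low score here is a signal to inspect, not automatically noise: on an ambiguous unit, many workers may legitimately diverge.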