DOI: 10.1145/2598153.2602225

First International Workshop on User Interfaces for Crowdsourcing and Human Computation

Published: 27 May 2014

Abstract

Recent years have witnessed an explosion in the number and variety of data crowdsourcing initiatives. From OpenStreetMap to Amazon Mechanical Turk, developers and practitioners have been striving to create user interfaces that effectively and efficiently support the creation, exploration, and analysis of crowdsourced information.
The extensive use of crowdsourcing techniques brings a major paradigm shift with respect to traditional user interfaces for data collection and exploration, as concerns about effectiveness, speed, and interaction quality play a central role in supporting very demanding incentive schemes, including monetary ones.
The First International Workshop on User Interfaces for Crowdsourcing and Human Computation (CrowdUI 2014), co-located with the AVI 2014 conference, brought together researchers and practitioners from a wide range of areas interested in discussing the user interaction challenges posed by crowdsourcing systems.

Cited By

  • (2018) Information Visualization Evaluation Using Crowdsourcing. Computer Graphics Forum 37(3), 573–595. DOI: 10.1111/cgf.13444. Online publication date: 10 July 2018.



    Information

    Published In

    AVI '14: Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces
    May 2014
    438 pages
    ISBN:9781450327756
    DOI:10.1145/2598153
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Sponsors

    • Centro Cultura Volta
    • Politecnico di Milano


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. crowdsourcing
    2. human computation
    3. user incentives
    4. user interfaces

    Qualifiers

    • Research-article

    Conference

    AVI '14

    Acceptance Rates

    AVI '14 Paper Acceptance Rate: 32 of 112 submissions, 29%
    Overall Acceptance Rate: 128 of 490 submissions, 26%
