DOI: 10.1145/3077136.3084146
Research article

Visual Pool: A Tool to Visualize and Interact with the Pooling Method

Published: 07 August 2017

Abstract

Every year, more than 25 test collections are built across the main Information Retrieval (IR) evaluation campaigns. They are extremely important in IR because they become the evaluation praxis for the years that follow. Test collections are built mostly using the pooling method, whose main advantage is that it drastically reduces the number of documents to be judged. It does so at the cost of introducing biases, which are sometimes aggravated by a non-optimal configuration. In this paper we develop a novel visualization technique for the pooling method and integrate it into a demo application named Visual Pool. This application lets the user interact with the pooling method with ease, and provides visual hints to help analyze existing test collections and build better ones.
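The pooling method the abstract refers to can be sketched in a few lines. The sketch below shows the classic depth-k variant: the set of documents sent to assessors is the union of the top-k results of each participating system, so only a small fraction of the corpus needs relevance judgments. Function and document names here are illustrative, not the tool's actual implementation.

```python
# Minimal sketch of depth-k pooling: the pool to be judged is the
# union of the top-k documents returned by each participating system.

def depth_k_pool(runs, k):
    """runs: list of ranked document-id lists, one per system."""
    pool = set()
    for run in runs:
        pool.update(run[:k])  # only the top-k of each run is judged
    return pool

# Three toy systems with pool depth k=2: only 3 of the 6 distinct
# retrieved documents ever need a relevance judgment.
runs = [
    ["d1", "d2", "d3"],
    ["d2", "d4", "d5"],
    ["d1", "d4", "d6"],
]
print(sorted(depth_k_pool(runs, k=2)))  # → ['d1', 'd2', 'd4']
```

Documents outside the pool are assumed non-relevant, which is the source of the pool bias the paper's visualization is designed to expose.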




Published In

SIGIR '17: Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval
August 2017, 1476 pages
ISBN: 9781450350228
DOI: 10.1145/3077136

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. pooling method
  2. pooling strategies
  3. test collections
  4. visualization

Qualifiers

  • Research-article

Funding Sources

  • Austrian Science Fund (FWF)

Conference

SIGIR '17

Acceptance Rates

SIGIR '17 Paper Acceptance Rate: 78 of 362 submissions, 22%
Overall Acceptance Rate: 792 of 3,983 submissions, 20%


Cited By

  • (2020) Neural-IR-Explorer: A Content-Focused Tool to Explore Neural Re-ranking Results. Advances in Information Retrieval. DOI: 10.1007/978-3-030-45442-5_58, pp. 459-464. Online publication date: 8-Apr-2020.
  • (2020) An Information Visualization Tool for the Interactive Component-Based Evaluation of Search Engines. Digital Libraries: The Era of Big Data and Data Science. DOI: 10.1007/978-3-030-39905-4_3, pp. 15-25. Online publication date: 22-Jan-2020.
  • (2019) TrecTools. Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval. DOI: 10.1145/3331184.3331399, pp. 1325-1328. Online publication date: 18-Jul-2019.
  • (2019) On Biases in Information Retrieval Models and Evaluation. ACM SIGIR Forum 52(2), pp. 172-173. DOI: 10.1145/3308774.3308804. Online publication date: 17-Jan-2019.
  • (2019) Visual Analytics and IR Experimental Evaluation. Information Retrieval Evaluation in a Changing World. DOI: 10.1007/978-3-030-22948-1_24, pp. 565-582. Online publication date: 14-Aug-2019.