DOI: 10.1145/1571941.1572081

Has adhoc retrieval improved since 1994?

Published: 19 July 2009

Abstract

Evaluation forums such as TREC allow systematic measurement and comparison of information retrieval techniques. The goal is consistent improvement, based on reliable comparison of the effectiveness of different approaches and systems. In this paper we report experiments to determine whether this goal has been achieved. We ran five publicly available search systems, in a total of seventeen different configurations, against nine TREC adhoc-style collections, spanning 1994 to 2005. These runsets were then used as a benchmark for reassessing the relative effectiveness of the original TREC runs for those collections. Surprisingly, there appears to have been no overall improvement in effectiveness for either median or top-end TREC submissions, even after allowing for several possible confounds. We therefore question whether the effectiveness of adhoc information retrieval has improved over the past decade and a half.
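The comparison methodology described above — scoring a fixed set of modern runs as a shared benchmark and reassessing historical TREC submissions against it — can be sketched with per-topic score standardization, one plausible way to make effectiveness comparable across systems. The sketch below is illustrative only: the function, toy data, and numbers are invented, not taken from the paper.

```python
# Hypothetical illustration of benchmark-relative comparison:
# convert each run's per-topic scores (e.g. average precision) into
# z-scores against a pool of benchmark systems, then average.
# All data below is invented; names are not from the paper.

from statistics import mean, stdev

def standardize(run_scores, benchmark_scores_by_topic):
    """Average of the run's per-topic z-scores relative to the
    benchmark pool for each topic."""
    zs = []
    for topic, score in run_scores.items():
        pool = benchmark_scores_by_topic[topic]
        mu, sigma = mean(pool), stdev(pool)
        zs.append((score - mu) / sigma if sigma > 0 else 0.0)
    return mean(zs)

# Invented toy data: two topics, a benchmark pool of three systems.
benchmark = {"t1": [0.20, 0.35, 0.50], "t2": [0.10, 0.25, 0.40]}
run_1994 = {"t1": 0.40, "t2": 0.30}
run_2005 = {"t1": 0.38, "t2": 0.28}

z94 = standardize(run_1994, benchmark)
z05 = standardize(run_2005, benchmark)
# If z05 <= z94, the later run shows no improvement relative to
# the common benchmark, which is the kind of comparison the paper
# reports across TREC years.
```

Because the z-scores are computed against the same benchmark pool, runs from different years become directly comparable even if the underlying collections differ in difficulty.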



Published In

SIGIR '09: Proceedings of the 32nd international ACM SIGIR conference on Research and development in information retrieval
July 2009, 896 pages
ISBN: 9781605584836
DOI: 10.1145/1571941

Publisher

Association for Computing Machinery, New York, NY, United States

    Qualifiers

    • Poster

Conference

SIGIR '09

    Acceptance Rates

    Overall Acceptance Rate 792 of 3,983 submissions, 20%

Cited By

• (2019) Overview of CENTRE@CLEF 2019: Sequel in the Systematic Reproducibility Realm. Experimental IR Meets Multilinguality, Multimodality, and Interaction, 287-300. DOI: 10.1007/978-3-030-28577-7_24. Online publication date: 3-Aug-2019.
• (2019) CENTRE@CLEF 2019. Advances in Information Retrieval, 283-290. DOI: 10.1007/978-3-030-15719-7_38. Online publication date: 7-Apr-2019.
• (2018) Introduction to the Special Issue on Reproducibility in Information Retrieval. Journal of Data and Information Quality, 10:4, 1-4. DOI: 10.1145/3268410. Online publication date: 29-Oct-2018.
• (2018) Introduction to the Special Issue on Reproducibility in Information Retrieval. Journal of Data and Information Quality, 10:3, 1-4. DOI: 10.1145/3268408. Online publication date: 11-Oct-2018.
• (2018) Overview of CENTRE@CLEF 2018: A First Tale in the Systematic Reproducibility Realm. Experimental IR Meets Multilinguality, Multimodality, and Interaction, 239-246. DOI: 10.1007/978-3-319-98932-7_23. Online publication date: 15-Aug-2018.
• (2017) Statistical biases in Information Retrieval metrics for recommender systems. Information Retrieval Journal, 20:6, 606-634. DOI: 10.1007/s10791-017-9312-z. Online publication date: 27-Jul-2017.
• (2016) A Reproducibility Study of Information Retrieval Models. Proceedings of the 2016 ACM International Conference on the Theory of Information Retrieval, 77-86. DOI: 10.1145/2970398.2970415. Online publication date: 12-Sep-2016.
• (2016) Conceptual feature generation for textual information using a conceptual network constructed from Wikipedia. Expert Systems: The Journal of Knowledge Engineering, 33:1, 92-106. DOI: 10.1111/exsy.12131. Online publication date: 1-Feb-2016.
• (2015) Classical databases and knowledge organization. Journal of the Association for Information Science and Technology, 66:8, 1559-1575. DOI: 10.1002/asi.23250. Online publication date: 1-Aug-2015.
• (2014) Improvements to BM25 and Language Models Examined. Proceedings of the 19th Australasian Document Computing Symposium, 58-65. DOI: 10.1145/2682862.2682863. Online publication date: 26-Nov-2014.
