The First Workshop on Evaluation Methodologies, Testbeds and Community for Information Access Research (EMTCIR 2024)

Published: 08 December 2024
DOI: 10.1145/3673791.3698434

Abstract

Evaluation campaigns, where researchers share important tasks, collaboratively develop test collections, and hold discussions to advance technologies, remain important events for strategically addressing core challenges in information access research. The goal of this workshop is to discuss information access tasks that are worth addressing as a community, to share new resources and evaluation methodologies, and to encourage researchers to ultimately propose new evaluation campaigns at NTCIR, TREC, CLEF, FIRE, etc. The workshop accepts four types of contributions: emerging task, ongoing task, resource, and evaluation papers. It will begin with presentations of accepted papers and introductions of ongoing tasks; the remainder of the workshop will be run interactively, as a round-table discussion on potential tasks.

    Published In

SIGIR-AP 2024: Proceedings of the 2024 Annual International ACM SIGIR Conference on Research and Development in Information Retrieval in the Asia Pacific Region
December 2024, 328 pages
ISBN: 9798400707247
DOI: 10.1145/3673791

    Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. evaluation campaigns
    2. evaluation methodology
    3. test collections

    Qualifiers

    • Short-paper

    Conference

    SIGIR-AP 2024