DOI: 10.1145/1031171.1031246
Article

Providing consistent and exhaustive relevance assessments for XML retrieval evaluation

Published: 13 November 2004

Abstract

Comparing retrieval approaches requires test collections, which consist of documents, queries and relevance assessments. Obtaining consistent and exhaustive relevance assessments is crucial for the appropriate comparison of retrieval approaches. Whereas the evaluation methodology for flat text retrieval approaches is well established, the evaluation of XML retrieval approaches is a research issue. This is because XML documents are composed of nested components that cannot be considered independent in terms of relevance. This paper describes the methodology adopted in INEX (the INitiative for the Evaluation of XML Retrieval) to ensure consistent and exhaustive relevance assessments.
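The nesting issue the abstract raises can be made concrete with a small sketch. This is purely illustrative and not INEX's actual assessment tooling: the element names, the single graded scale, and the "ancestor of relevant content is at least marginally relevant" rule are simplifying assumptions used here to show why assessments over nested components cannot be made independently.

```python
# Hypothetical sketch: nested XML components share relevance, so a judgment
# on one element constrains judgments on its ancestors. Here we lift an
# ancestor's grade to at least "marginally relevant" (1) whenever any
# descendant is relevant, keeping the assessment set consistent.
# Scale and rule are illustrative assumptions, not INEX's actual scheme.

from dataclasses import dataclass, field


@dataclass
class Element:
    name: str
    relevance: int = 0                      # assumed scale: 0 (not) .. 3 (highly)
    children: list = field(default_factory=list)


def propagate_up(node: Element) -> int:
    """Return the max relevance found in the subtree; raise each ancestor
    of relevant content to at least grade 1."""
    best = node.relevance
    for child in node.children:
        best = max(best, propagate_up(child))
    if best > 0 and node.relevance == 0:
        node.relevance = 1                  # ancestor of relevant content
    return best


article = Element("article", 0, [
    Element("sec", 0, [Element("p", 3)]),   # a highly relevant paragraph
    Element("sec", 0),                      # an irrelevant section
])
propagate_up(article)
print(article.relevance)                    # → 1
print(article.children[0].relevance)        # → 1
print(article.children[1].relevance)        # → 0
```

A consistency check of this kind is one way an assessment interface can flag contradictory judgments (a relevant paragraph inside a section marked irrelevant) before they enter the test collection.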



Published In

CIKM '04: Proceedings of the thirteenth ACM international conference on Information and knowledge management
November 2004, 678 pages
ISBN: 1581138741
DOI: 10.1145/1031171
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. INEX
  2. XML
  3. evaluation
  4. relevance assessment process

Qualifiers

  • Article

Conference

CIKM04: Conference on Information and Knowledge Management
November 8-13, 2004
Washington, D.C., USA

Acceptance Rates

Overall acceptance rate: 1,861 of 8,427 submissions (22%)


Cited By

  • (2012) Model for simulating result document browsing in focused retrieval. Proceedings of the 4th Information Interaction in Context Symposium, pp. 238-241. DOI: 10.1145/2362724.2362764. Online publication date: 21-Aug-2012
  • (2011) Crowdsourcing for book search evaluation. Proceedings of the 34th international ACM SIGIR conference on Research and development in Information Retrieval, pp. 205-214. DOI: 10.1145/2009916.2009947. Online publication date: 24-Jul-2011
  • (2011) The Potential Benefit of Focused Retrieval in Relevant-in-Context Task. Comparative Evaluation of Focused Retrieval, pp. 33-43. DOI: 10.1007/978-3-642-23577-1_2. Online publication date: 2011
  • (2010) The potential benefit of focused retrieval in relevant-in-context task. Proceedings of the 9th international conference on Initiative for the evaluation of XML retrieval: comparative evaluation of focused retrieval, pp. 33-43. DOI: 10.5555/2040369.2040372. Online publication date: 13-Dec-2010
  • (2010) Expected reading effort in focused retrieval evaluation. Information Retrieval, 13(5):460-484. DOI: 10.1007/s10791-010-9133-9. Online publication date: 6-May-2010
  • (2009) XML Retrieval. Synthesis Lectures on Information Concepts, Retrieval, and Services, 1(1):1-111. DOI: 10.2200/S00203ED1V01Y200907ICR007. Online publication date: Jan-2009
  • (2008) Sound and complete relevance assessment for XML retrieval. ACM Transactions on Information Systems, 27(1):1-37. DOI: 10.1145/1416950.1416951. Online publication date: 23-Dec-2008
  • (2007) INEX 2002-2006. Proceedings of the 1st international conference on Digital libraries: research and development, pp. 187-196. DOI: 10.5555/1782334.1782357. Online publication date: 13-Feb-2007
  • (2007) Evaluating XML retrieval effectiveness at INEX. ACM SIGIR Forum, 41(1):40-57. DOI: 10.1145/1273221.1273225. Online publication date: 1-Jun-2007
  • (2007) Examining topic shifts in content-oriented XML retrieval. International Journal on Digital Libraries, 8(1):39-60. DOI: 10.1007/s00799-007-0026-5. Online publication date: 25-Oct-2007
