DOI: 10.1007/11880592_44

Article

Evaluating scalability in information retrieval with multigraded relevance

Published: 16 October 2006

Abstract

From the user's point of view, in large environments it can be desirable to have Information Retrieval Systems (IRS) that retrieve documents according to their relevance levels. Relevance levels have been studied in some previous Information Retrieval (IR) work, while a few other IR studies have tackled the question of IRS effectiveness as a function of collection size. The latter works applied standard IR measures to collections of increasing size in order to analyze the scalability of IRS effectiveness. In this work, we bring these two IR issues together (multigraded relevance and scalability) by designing new metrics for evaluating the ability of an IRS to rank documents according to their relevance levels as collection size increases.
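The abstract does not spell out the paper's new metrics, but the standard multigraded-relevance measure it builds on, normalized discounted cumulative gain (nDCG), can be sketched as follows. This is an illustrative implementation of the generic measure, not the authors' proposed scalability metrics; the grade values and example rankings are assumptions for demonstration.

```python
from math import log2

def dcg(gains):
    """Discounted cumulative gain of a ranked list of graded relevance values.

    Each document at rank i (0-based) contributes its relevance grade
    discounted by log2(i + 2), so highly relevant documents ranked late
    contribute less than the same documents ranked early.
    """
    return sum(g / log2(i + 2) for i, g in enumerate(gains))

def ndcg(gains):
    """Normalized DCG: the run's DCG divided by the DCG of the ideal ranking
    (the same grades sorted in decreasing order). Ranges over [0, 1]."""
    ideal = dcg(sorted(gains, reverse=True))
    return dcg(gains) / ideal if ideal > 0 else 0.0

# Hypothetical relevance grades (3 = highly relevant .. 0 = not relevant)
# assigned to the documents a system returned, in ranked order.
near_ideal = [3, 2, 3, 0, 1, 2]   # relevant documents mostly ranked early
inverted   = [0, 1, 2, 2, 3, 3]   # same grades, relevant documents ranked late

print(ndcg(near_ideal))  # higher score
print(ndcg(inverted))    # lower score: penalized for late relevant documents
```

Comparing such scores across collections of increasing size is the kind of analysis the abstract describes for studying effectiveness scalability under graded relevance.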



Published In

AIRS'06: Proceedings of the Third Asia Conference on Information Retrieval Technology
October 2006, 679 pages
ISBN: 3540457801
Editors: Hwee Tou Ng, Mun-Kew Leong, Min-Yen Kan, Donghong Ji

Publisher

Springer-Verlag

Berlin, Heidelberg

