Computing Reviews
Pooling-based continuous evaluation of information retrieval systems
Tonon A., Demartini G., Cudré-Mauroux P. Information Retrieval 18(5): 445-472, 2015. Type: Article
Date Reviewed: Jan 25 2016

Web search engines, such as Google and Yahoo, use information retrieval (IR) systems to answer users’ requests. IR systems help reduce information overload when searching large databases.

Evaluating an IR system means determining how well it answers users’ requests with respect to their underlying information needs. One of the most widely employed evaluation strategies is the Cranfield paradigm, in which different retrieval strategies are run over a common test collection, scored with appropriate metrics, and compared with each other statistically.
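To make the comparison concrete, here is a minimal Python sketch of a Cranfield-style comparison between two systems; the system names and per-topic average precision scores are invented placeholders, not data from the paper.

    # Sketch of a Cranfield-style comparison (illustrative only; scores are invented).
    from scipy.stats import ttest_rel

    # Hypothetical per-topic average precision scores for two retrieval strategies.
    system_a = [0.42, 0.51, 0.38, 0.60, 0.47]
    system_b = [0.45, 0.49, 0.44, 0.63, 0.50]

    mean_a = sum(system_a) / len(system_a)
    mean_b = sum(system_b) / len(system_b)

    # A paired significance test over the same topics decides whether the
    # difference in mean effectiveness is statistically meaningful.
    t_stat, p_value = ttest_rel(system_a, system_b)
    print(f"mean AP: A={mean_a:.3f}, B={mean_b:.3f}, p={p_value:.3f}")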

Standard IR evaluation methodologies, based on pooling and crowdsourcing, are inherently limited: only part of the document set is judged, namely the top-ranked documents retrieved by the participating IR systems.
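The following sketch illustrates what depth-k pooling looks like in practice; the run names, document identifiers, and pool depth are hypothetical.

    # Sketch of depth-k pooling: only the top-k documents of each submitted run
    # are collected for relevance judgment; everything else goes unjudged.
    def build_pool(runs, k=10):
        """runs: dict mapping system name -> ranked list of document ids."""
        pool = set()
        for ranked_docs in runs.values():
            pool.update(ranked_docs[:k])  # take the k highest-ranked documents per run
        return pool

    # Hypothetical runs from two systems for a single topic.
    runs = {
        "system_a": ["d3", "d7", "d1", "d9", "d2"],
        "system_b": ["d7", "d4", "d3", "d8", "d5"],
    }
    print(build_pool(runs, k=3))  # only these documents would be judged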

In this paper, the authors present a new methodology, called pooling-based continuous evaluation, that overcomes the limitations of standard IR system evaluations (a rough sketch of the idea appears after the list below). The authors describe these limitations as follows:

  • “the difficulty in gathering comprehensive relevance judgments for long runs,” and
  • “the unfair bias towards systems that are evaluated as part of the original evaluation campaign (that is, when the collection is created).”
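As a toy illustration (and not the authors’ actual algorithm), a continuous-pooling step might look for the top-ranked documents of a newly evaluated run that still lack relevance judgments and send them for additional, for example crowdsourced, assessment; all names and identifiers below are hypothetical.

    # Toy illustration (not the authors' algorithm): when a new system is evaluated
    # after the collection was built, its unjudged documents are sent for extra
    # judgment instead of being silently treated as non-relevant.
    def unjudged_documents(new_run, judged_pool, k=10):
        """Return the top-k documents of a new run that lack relevance judgments."""
        return [doc for doc in new_run[:k] if doc not in judged_pool]

    judged_pool = {"d1", "d2", "d3", "d7", "d9"}          # judgments from the original campaign
    new_run = ["d7", "d12", "d3", "d15", "d1"]            # ranking from a later, new system
    print(unjudged_documents(new_run, judged_pool, k=5))  # candidates for new crowd judgments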

With this work, the authors make a substantial contribution to improving the accuracy with which IR systems are evaluated. The paper is well written and well structured. The authors judiciously explain the proposed methodology, provide well-described algorithms, and discuss the limitations of current IR evaluation methodologies. I recommend this paper to students and researchers working particularly in the areas of data mining and document processing.

Reviewer:  Thierry Edoh Review #: CR144125 (1607-0532)
Categories: Information Storage And Retrieval (H.3); Data Mining (H.2.8); Document And Text Processing (I.7)
Other reviews under "Information Storage And Retrieval":

Length normalization in XML retrieval
Kamps J., de Rijke M., Sigurbjörnsson B. Research and development in information retrieval (Proceedings of the 27th International Conference on Research and Development in Information Retrieval, Sheffield, United Kingdom, Jul 25-29, 2004) 80-87, 2004. Type: Proceedings
Date Reviewed: Nov 1 2005

Building an example application with the unstructured information management architecture
Ferrucci D., Lally A. IBM Systems Journal 43(3): 455-475, 2004. Type: Article
Date Reviewed: Feb 2 2005

Rich results from poor resources: NTCIR-4 monolingual and cross-lingual retrieval of Korean texts using Chinese and English
Kwok K., Choi S., Dinstl N. ACM Transactions on Asian Language Information Processing 4(2): 136-162, 2005. Type: Article
Date Reviewed: Mar 2 2006
