In some situations, search engine users would prefer to retrieve entities rather than just documents. Example queries include “Italian Nobel prize winners”, “Formula 1 drivers that won the Monaco Grand Prix”, or “German-speaking Swiss cantons”. The XML Entity Ranking (XER) track at INEX provides a forum aimed at standardizing evaluation procedures for entity retrieval. This paper describes the XER tasks and the evaluation procedure used at the XER track in 2009, where a new version of Wikipedia was used as the underlying collection, and summarizes the approaches adopted by the participants.
Additional Metadata
THEME Information (theme 2)
Publisher Springer
Editor S. Geva, J. Kamps, A. Trotman
Citation
Geva, S., Kamps, J., & Trotman, A. (Eds.). (2009). Overview of the INEX 2009 Entity Ranking Track. In Focused Retrieval and Evaluation: 8th International Workshop of the Initiative for the Evaluation of XML Retrieval, INEX 2009. Springer.