

Released

Conference Paper

Crowdsourcing Assessments for XML Ranked Retrieval

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons44000

Alonso, Omar
Databases and Information Systems, MPI for Informatics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons45380

Schenkel, Ralf
Databases and Information Systems, MPI for Informatics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons45609

Theobald, Martin
Databases and Information Systems, MPI for Informatics, Max Planck Society;

Locator
There are no locators available.
Fulltext (public)
There are no public fulltexts available.
Supplementary Material (public)
There is no public supplementary material available.
Citation

Alonso, O., Schenkel, R., & Theobald, M. (2010). Crowdsourcing Assessments for XML Ranked Retrieval. In C. Gurrin, Y. He, G. Kazai, U. Kruschwitz, S. Little, T. Roelleke, et al. (Eds.), Advances in Information Retrieval (pp. 602-606). Berlin: Springer. doi:10.1007/978-3-642-12275-0_57.
Cite as: http://hdl.handle.net/11858/00-001M-0000-000F-14DA-3
Abstract
Crowdsourcing has gained considerable attention as a viable approach for conducting IR evaluations. Through a series of experiments on INEX data, this paper shows that crowdsourcing can be a good alternative for relevance assessment in the context of XML retrieval.