
Released

Conference Paper

Crowdsourcing Assessments for XML Ranked Retrieval

MPS-Authors
Alonso, Omar
Databases and Information Systems, MPI for Informatics, Max Planck Society

Schenkel, Ralf
Databases and Information Systems, MPI for Informatics, Max Planck Society

Theobald, Martin
Databases and Information Systems, MPI for Informatics, Max Planck Society

Citation

Alonso, O., Schenkel, R., & Theobald, M. (2010). Crowdsourcing Assessments for XML Ranked Retrieval. In C. Gurrin, Y. He, G. Kazai, U. Kruschwitz, S. Little, T. Roelleke, et al. (Eds.), Advances in Information Retrieval (pp. 602-606). Berlin: Springer. doi:10.1007/978-3-642-12275-0_57.


Cite as: https://hdl.handle.net/11858/00-001M-0000-000F-14DA-3
Abstract
Crowdsourcing has gained a lot of attention as a viable approach for conducting IR evaluations. This paper shows through a series of experiments on INEX data that crowdsourcing can be a good alternative for relevance assessment in the context of XML retrieval.