  Evaluation of Relevance Feedback Algorithms for XML Retrieval

Solomon, S. (2007). Evaluation of Relevance Feedback Algorithms for XML Retrieval. Master Thesis, Universität des Saarlandes, Saarbrücken.

Basic

Genre: Thesis
LaTeX: Evaluation of Relevance Feedback Algorithms for {XML} Retrieval

Files

solomon2007_thesis.pdf (Any fulltext), 5KB
File Permalink: -
Name: solomon2007_thesis.pdf
Description: -
OA-Status: -
Visibility: Restricted (Max Planck Institute for Informatics, MSIN)
MIME-Type / Checksum: application/pdf
Technical Metadata: -
Copyright Date: -
Copyright Info: -
License: -

Creators

Creators:
Solomon, Silvana (1, 2), Author
Weikum, Gerhard (1), Advisor
Schenkel, Ralf (1), Referee
Affiliations:
1. Databases and Information Systems, MPI for Informatics, Max Planck Society, ou_24018
2. International Max Planck Research School, MPI for Informatics, Max Planck Society, Campus E1 4, 66123 Saarbrücken, DE, ou_1116551

Content

Free keywords: -
Abstract: Information retrieval and feedback in XML are rather new research fields, and natural questions arise: how good are feedback algorithms in XML IR? Can they be evaluated with standard evaluation tools? Although several evaluation methods have been proposed in the literature, it is not yet clear which of them are applicable in the context of XML IR, and which metrics they can be combined with to assess the quality of XML retrieval algorithms that use feedback. We propose a solution for fairly evaluating the performance of XML search engines that use feedback to improve query results. Compared to previous approaches, we aim to remove the effect of results whose relevance the system already knows, and to measure the improvement on unseen relevant elements. We implemented our proposed evaluation methodologies by extending a standard evaluation tool with a module capable of assessing feedback algorithms for a specific set of metrics. We performed multiple tests on runs from both INEX 2005 and INEX 2006, covering two different XML document collections. The performance of the assessed feedback algorithms did not reach the theoretical optimum for either the proposed evaluation methodologies or the metrics used. The analysis of the results shows that, although the six evaluation techniques provide good improvement figures, none of them can be declared the absolute winner. Despite the lack of a definitive conclusion, our findings provide a better understanding of the quality of feedback algorithms.
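The evaluation idea sketched in the abstract (removing results of known relevance and scoring only unseen relevant elements) can be illustrated with a short residual-style computation. The Python sketch below is purely illustrative: the thesis extends a standard INEX evaluation tool, and the function name, the precision@10 choice, and the data layout here are all assumptions, not the actual implementation.

    # Hypothetical sketch of residual-style feedback evaluation, not the
    # thesis's code: elements whose relevance was revealed to the system
    # during feedback are dropped from both the run and the assessments,
    # so the score reflects improvement on unseen relevant elements only.
    def residual_precision_at_k(ranked_run, relevant, seen, k=10):
        residual_run = [e for e in ranked_run if e not in seen]  # drop seen elements
        residual_rel = relevant - seen                           # judge only unseen ones
        if not residual_run:
            return 0.0
        top = residual_run[:k]
        return sum(1 for e in top if e in residual_rel) / len(top)

    # Example: feedback revealed the relevance of e1 and e4, so only the
    # residual ranking [e7, e2, e9] is scored; e7 and e2 are unseen relevant.
    run = ["e1", "e7", "e4", "e2", "e9"]
    print(residual_precision_at_k(run, relevant={"e1", "e2", "e7"}, seen={"e1", "e4"}))
    # -> 0.666..., i.e. 2 of the 3 residual elements are relevant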

Details

Language(s): eng - English
Dates: 2008-02-28, 2007-07-17, 2007
Publication Status: Issued
Pages: -
Publishing info: Saarbrücken : Universität des Saarlandes
Table of Contents: -
Rev. Type: -
Identifiers: eDoc: 356461
Other: Local-ID: C12573CC004A8E26-7696201B4CA7C699C125730800414211-Solomon2007
Degree: Master
