  The SHARC Framework for Data Quality in Web Archiving

Denev, D., Mazeika, A., Spaniol, M., & Weikum, G. (2011). The SHARC Framework for Data Quality in Web Archiving. The VLDB Journal, 20(2), 183-207. doi:10.1007/s00778-011-0219-9.

Basic

Genre: Journal Article
LaTeX: The {SHARC} Framework for Data Quality in Web Archiving

Files

sharc-vldbj.pdf (Any fulltext), 920KB

File Permalink: -
Name: sharc-vldbj.pdf
Description: -
OA-Status:
Visibility: Private
MIME-Type / Checksum: application/pdf
Technical Metadata:
Copyright Date: -
Copyright Info: -
License: -

Creators

Creators:
Denev, Dimitar 1, Author
Mazeika, Arturas 1, Author
Spaniol, Marc 1, Author
Weikum, Gerhard 1, Author
Affiliations:
1 Databases and Information Systems, MPI for Informatics, Max Planck Society, ou_24018

Content

Free keywords: -
Abstract: Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies towards better quality with given resources. We define data-quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of or predictions for the change rates of individual pages. Our framework includes fairly accurate classifiers for change predictions. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
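
The abstract's two quality notions can be illustrated with a small, self-contained sketch. The following Python code is a hedged approximation only: it assumes an independent Poisson change model per page with rate lambda_i (changes per hour), treats blur as the expected number of changes between each page's download time and a chosen time-travel access time, and counts coherence as the number of pages verified unchanged between visit and revisit. The class and function names (PageCapture, expected_blur, coherence) are illustrative and are not taken from the paper.

# Illustrative sketch only: the Poisson change model and the exact aggregation
# below are simplifying assumptions, not the paper's formal definitions.
from dataclasses import dataclass
from typing import Optional, Sequence


@dataclass
class PageCapture:
    url: str
    change_rate: float                    # assumed Poisson rate lambda_i (changes/hour)
    visit_time: float                     # hours since crawl start, first download
    revisit_time: Optional[float] = None  # second download (visit-revisit strategies)
    changed_between: bool = False         # observed change between visit and revisit


def expected_blur(captures: Sequence[PageCapture], access_time: float) -> float:
    # Approximate blur: expected number of changes a time-travel access at
    # `access_time` would see relative to the captured page versions,
    # assuming independent Poisson changes per page.
    return sum(p.change_rate * abs(access_time - p.visit_time) for p in captures)


def coherence(captures: Sequence[PageCapture]) -> int:
    # Coherence: number of pages verified unchanged between visit and revisit,
    # i.e. pages provably consistent with a single capture instant.
    return sum(1 for p in captures
               if p.revisit_time is not None and not p.changed_between)


if __name__ == "__main__":
    site = [
        PageCapture("/index.html", change_rate=0.5, visit_time=0.0, revisit_time=4.0),
        PageCapture("/news.html", change_rate=2.0, visit_time=1.0, revisit_time=3.0,
                    changed_between=True),
        PageCapture("/about.html", change_rate=0.1, visit_time=2.0, revisit_time=2.5),
    ]
    print("expected blur at t=2h:", round(expected_blur(site, access_time=2.0), 2))
    print("coherent pages:", coherence(site), "of", len(site))

A scheduler in the spirit of the paper would then choose download and revisit orders that reduce the first quantity or increase the second, under the paper's own, more precise formalization of both measures.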

Details

Language(s): eng - English
Dates: 2011
Publication Status: Issued
Pages: -
Publishing info: -
Table of Contents: -
Rev. Type: Peer
Identifiers: eDoc: 618946
DOI: 10.1007/s00778-011-0219-9
URI: http://dx.doi.org/10.1007/s00778-011-0219-9
Other: Local-ID: C1256DBF005F876D-0DE8D19CED5A8AE7C1257849005270A3-SHARC-VLDBJ2011
Degree: -

Source 1

Title: The VLDB Journal
Source Genre: Journal
Creator(s):
Affiliations:
Publ. Info: Berlin : Springer
Pages: -
Volume / Issue: 20 (2)
Sequence Number: -
Start / End Page: 183 - 207
Identifier: ISSN: 1066-8888