  The SHARC Framework for Data Quality in Web Archiving

Denev, D., Mazeika, A., Spaniol, M., & Weikum, G. (2011). The SHARC Framework for Data Quality in Web Archiving. The VLDB Journal, 20(2), 183-207. doi:10.1007/s00778-011-0219-9.


Basic Information

Item Type: Journal Article
LaTeX: The {SHARC} Framework for Data Quality in Web Archiving

Files

sharc-vldbj.pdf (Full text (general)), 920KB
 
File Permalink:
-
File Name:
sharc-vldbj.pdf
Description:
-
OA-Status:
Visibility:
Private
MIME Type / Checksum:
application/pdf
Technical Metadata:
Copyright Date:
-
Copyright Info:
-
CC License:
-

Related URLs

Creators

Creators:
Denev, Dimitar1, Author
Mazeika, Arturas1, Author
Spaniol, Marc1, Author
Weikum, Gerhard1, Author
Affiliations:
1Databases and Information Systems, MPI for Informatics, Max Planck Society, ou_24018

Content

Keywords: -
Abstract: Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies towards better quality with given resources. We define data-quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes. The revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change predictions. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
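
To make the blur and coherence measures concrete, the following Python fragment is a minimal toy sketch, assuming each page changes according to a Poisson process with a known rate; all function and variable names are hypothetical and are not taken from the paper's testbed.

# Toy model of SHARC-style quality measures (illustrative, not the authors' code).

def expected_blur(change_rates, download_times, reference_time):
    # Under a Poisson change model with rate lam (changes per hour), the
    # expected number of changes a time-travel access at reference_time
    # would see is sum_i lam_i * |t_i - reference_time|.
    return sum(lam * abs(t - reference_time)
               for lam, t in zip(change_rates, download_times))

def coherence(change_times_per_page, visit_times, revisit_times):
    # Deterministic measure: a page is coherently captured if it did not
    # change between its initial visit and its revisit.
    coherent = 0
    for changes, t1, t2 in zip(change_times_per_page, visit_times, revisit_times):
        if not any(t1 <= c <= t2 for c in changes):
            coherent += 1
    return coherent

# Four pages downloaded one hour apart; "hot" pages have higher change rates.
rates = [0.05, 0.2, 1.0, 3.0]      # changes per hour
times = [0.0, 1.0, 2.0, 3.0]       # download times in hours
mid = (times[0] + times[-1]) / 2   # time-travel reference point
print(f"expected blur: {expected_blur(rates, times, mid):.2f}")

# Observed change timestamps per page for a visit-revisit crawl.
changes = [[0.4], [1.5, 2.7], [0.1, 0.9, 1.1], [2.2, 2.4, 2.9]]
visits = [0.0, 0.5, 1.0, 1.5]
revisits = [3.5, 3.0, 2.5, 2.0]
print("coherent pages:", coherence(changes, visits, revisits))

Note how the blur sum rewards downloading fast-changing pages as close to the reference time as possible, which matches the intuition behind the quality-conscious scheduling strategies described in the abstract.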

Details

Language: eng - English
Date: 2011
Publication Status: Published
Pages: -
Publishing info: -
Table of Contents: -
Review: Peer-reviewed
Identifiers: eDoc: 618946
DOI: 10.1007/s00778-011-0219-9
URI: http://dx.doi.org/10.1007/s00778-011-0219-9
Other: Local-ID: C1256DBF005F876D-0DE8D19CED5A8AE7C1257849005270A3-SHARC-VLDBJ2011
Degree: -

Related Events

Legal Case

Project information


Source 1

Title: The VLDB Journal
Genre: Journal
Creator(s) / Editor(s):
Affiliations:
Publisher, Place: Berlin : Springer
Pages: -
Volume / Issue: 20 (2)
Sequence Number: -
Start / End Page: 183 - 207
Identifier: ISSN: 1066-8888