NSDI '04 Abstract
Pp. 43–56 of the Proceedings
Design, Implementation, and Evaluation of
Duplicate Transfer Detection in HTTP
Jeffrey C. Mogul, HP Labs; Yee Man Chan, Stanford Human Genome Center; Terence Kelly, HP Labs
Abstract
Organizations use Web caches to avoid transferring the same data twice
over the same path. Numerous studies have shown that forward proxy
caches, in practice, incur miss rates of at least 50%.
Traditional Web caches rely on the reuse of responses for given URLs.
Previous analyses of real-world traces have revealed a complex
relationship between URLs and reply payloads, and have shown that this
complexity frequently causes redundant transfers to caches. For example,
redundant transfers may result if a payload is aliased (accessed
via different URLs), or if a resource rotates (alternates
between different values), or if HTTP's cache revalidation mechanisms
are not fully exploited.
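To make the aliasing case concrete, consider the following sketch (our illustration, not code from the paper; the URLs and payload are hypothetical). A cache indexed only by URL misses on the alias and transfers identical bytes a second time:

    # Hypothetical illustration: aliasing defeats a URL-keyed cache.
    payload = b"identical reply bytes"      # same payload behind both URLs
    url_cache = {}                          # traditional cache, keyed by URL
    transfers = 0

    for url in ("http://a.example/img", "http://b.example/img"):
        if url not in url_cache:            # the second URL is a cache miss,
            transfers += 1                  # so the same bytes are fetched again
            url_cache[url] = payload

    print(transfers)                        # prints 2: one transfer is redundant

A cache that also indexes payloads by digest would recognize the second response as a duplicate, which is the observation the technique described next exploits.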
We implement and evaluate a technique known in the literature as
Duplicate Transfer Detection
(DTD), with which a Web cache can use digests to detect and
potentially eliminate all redundant payload transfers.
We show how HTTP can support DTD with few or no protocol changes,
and how a DTD-enabled proxy cache can interoperate with
unmodified existing origin servers and browsers, thereby permitting
incremental deployment.
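As an informal sketch of how a digest-indexed cache of this kind might operate (our illustration under assumed names; the paper's actual HTTP mechanics, headers, and digest algorithm may differ), suppose the sender announces a payload digest before the body is transferred:

    import hashlib

    class DTDCache:
        # Minimal sketch of Duplicate Transfer Detection: payloads are
        # indexed by digest as well as by URL, so bytes already held
        # under one URL need not be transferred again when the same
        # payload reappears under another URL or at a later time.

        def __init__(self):
            self.payload_by_digest = {}   # digest -> payload bytes
            self.digest_by_url = {}       # URL -> digest of last payload

        def lookup(self, url):
            # Ordinary URL-keyed hit path.
            digest = self.digest_by_url.get(url)
            return self.payload_by_digest.get(digest) if digest else None

        def on_announced_digest(self, url, digest):
            # Called when the sender announces the payload digest ahead
            # of the body. If the payload is already cached, the body
            # transfer can be skipped; we only record the new URL alias.
            payload = self.payload_by_digest.get(digest)
            if payload is not None:
                self.digest_by_url[url] = digest
            return payload   # None means the transfer must proceed

        def store(self, url, payload):
            # Cache a fully transferred payload under its digest.
            digest = hashlib.sha256(payload).hexdigest()
            self.payload_by_digest[digest] = payload
            self.digest_by_url[url] = digest

In this sketch every redundant transfer is detectable, whether it arises from aliasing, rotation, or unexploited revalidation, because duplicate payloads collide on their digest rather than on their URL. (SHA-256 is our choice for illustration only.)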
We present both simulated and experimental results that
quantify the benefits of DTD.