Abstract:
We study the average-case performance of algorithms for the binary knapsack
problem.
We focus on the analysis of so-called {\em core algorithms}, the predominant
algorithmic concept used in practice.
These algorithms start by computing an optimal fractional solution, which has
only one fractional item, and then exchange items until an optimal integral
solution is found.
The idea is that in many cases the optimal integral solution is close to the
fractional one, so that only a few items need to be exchanged.
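The starting point of a core algorithm, the optimal fractional solution, can be sketched as follows. This is an illustrative greedy computation of the break item, not code from the paper; the function name and return convention are assumptions.

```python
def fractional_optimum(profits, weights, capacity):
    """Greedy fractional knapsack: take items in decreasing profit/weight
    order; at most one item (the break item) is taken fractionally.
    Returns (optimal fractional value, break item index or None, fraction)."""
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    value, remaining = 0.0, capacity
    for i in order:
        if weights[i] <= remaining:
            # Item fits entirely: include it integrally.
            value += profits[i]
            remaining -= weights[i]
        else:
            # Break item: only a fraction fits; this yields the
            # optimal fractional solution with one fractional item.
            frac = remaining / weights[i]
            return value + frac * profits[i], i, frac
    return value, None, 0.0  # all items fit; solution is already integral
```

A core algorithm would then exchange items in and out around the break item until the solution becomes integral and provably optimal.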
Despite the well-known hardness of the knapsack problem on worst-case instances,
practical studies show that knapsack core algorithms can solve large-scale
instances very efficiently.
For example, they exhibit almost linear running time on purely random inputs.
In this paper, we present the first theoretical result on the running time of
core algorithms that comes close to the results observed in practical
experiments.
We prove an upper bound of
$O(n \, \polylog(n))$ on the expected running time of a core algorithm on
instances with $n$ items whose profits and weights are drawn independently,
uniformly at random.
A previous analysis of the average-case complexity of the knapsack problem
proves a running time of $O(n^4)$, but for a different kind of algorithm.
The previously best known upper bound on the running time of core algorithms is
also polynomial, but the degree of this polynomial is at least a large
three-digit number. In addition to uniformly random instances, we investigate
harder instances in which profits and weights are pairwise correlated.
For instances of this kind, we prove a tradeoff describing how the degree of
correlation influences the running time.