Keywords:
-
Abstract:
Nonnegative Matrix Approximation is an effective matrix
decomposition technique that has proven to be useful for a
wide variety of applications ranging from document analysis
and image processing to bioinformatics. There exist a few
algorithms for nonnegative matrix approximation (NNMA),
for example, Lee and Seung's multiplicative updates, alternating
least squares, and certain gradient-descent-based procedures.
All of these procedures suffer from either slow convergence,
numerical instabilities, or at worst, theoretical unsoundness.
In this paper we present new and improved algorithms
for the least-squares NNMA problem, which are
not only theoretically well-founded, but also overcome many
of the deficiencies of other methods. In particular, we use
non-diagonal gradient scaling to obtain rapid convergence.
Our methods provide numerical results superior to both Lee
and Seung's method and the alternating least squares
(ALS) heuristic, which is known to work well in some situations
but has no theoretical guarantees (Berry et al. 2006).
Our approach extends naturally to include regularization and
box-constraints, without sacrificing convergence guarantees.
We present experimental results on both synthetic and real-world
datasets to demonstrate the superiority of our methods in terms
of both approximation quality and efficiency.
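For context, the Lee–Seung multiplicative updates mentioned above are the classical baseline for the least-squares NNMA problem, minimizing ||A − WH||_F² subject to W, H ≥ 0. Below is a minimal NumPy sketch of that baseline (not the improved Newton-type methods this paper proposes); the function name, iteration count, and the small `eps` added for numerical safety are illustrative choices, not from the paper.

```python
import numpy as np

def nnma_multiplicative(A, rank, iters=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||A - W H||_F^2, W, H >= 0.

    A sketch of the classical baseline, not the paper's improved method.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Nonnegative random initialization.
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Elementwise multiplicative update for H; since all factors are
        # nonnegative, H stays nonnegative automatically.
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        # Symmetric update for W.
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Each update is a diagonally scaled gradient step, which is what makes convergence slow in practice; the paper's non-diagonal gradient scaling is aimed precisely at this weakness.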