Fast approximate L∞ minimization: Speeding up robust regression
Minimization of the L∞ norm, which can be viewed as approximately solving the non-convex least median estimation problem, is a powerful method for outlier removal and hence for robust regression. However, current techniques for solving the problem at the heart of L∞ norm minimization are slow and therefore cannot be scaled to large problems. A new method for minimizing the L∞ norm is presented here, which provides a speedup of multiple orders of magnitude for data of high dimension. This method, termed Fast L∞ Minimization, allows robust regression to be applied to a class of problems that was previously inaccessible. It is shown how the L∞ norm minimization problem can be broken up into smaller sub-problems, which can then be solved extremely efficiently. Experimental results demonstrate the radical reduction in computation time, along with robustness against large numbers of outliers, on several model-fitting problems.
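For context, the baseline problem the paper accelerates, L∞ (minimax) regression, can be posed as a linear program: minimize t subject to |a_i^T x - b_i| <= t for all observations. The sketch below, using SciPy's linprog on synthetic data, illustrates only that standard LP formulation; it is not the authors' Fast L∞ Minimization algorithm, and the function and data names are placeholders chosen for the example.

```python
# Minimal sketch: plain L-infinity regression via the standard LP reformulation.
# This is a baseline illustration, NOT the paper's Fast L∞ Minimization method.
import numpy as np
from scipy.optimize import linprog

def linf_regression(A, b):
    """Minimize max_i |A_i x - b_i| over x by solving an LP in (x, t)."""
    n, d = A.shape
    # Decision variables z = [x (d entries), t], where t bounds the largest residual.
    c = np.zeros(d + 1)
    c[-1] = 1.0                        # objective: minimize t
    ones = np.ones((n, 1))
    # Encode  A x - b <= t  and  b - A x <= t  as  A_ub z <= b_ub.
    A_ub = np.vstack([np.hstack([A, -ones]),
                      np.hstack([-A, -ones])])
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * d + [(0, None)]   # x free, t nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:d], res.x[-1]        # fitted parameters, worst-case residual

# Usage on synthetic data (hypothetical example).
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.01 * rng.normal(size=100)
x_hat, max_residual = linf_regression(A, b)
```

In the outlier-removal scheme the abstract alludes to, such an L∞ fit is typically solved repeatedly, discarding the measurements that attain the maximum residual at each round, which is why the cost of each solve dominates and motivates the paper's speedup.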
Year of publication: | 2014
Authors: | Shen, Fumin ; Shen, Chunhua ; Hill, Rhys ; van den Hengel, Anton ; Tang, Zhenmin |
Published in: | Computational Statistics & Data Analysis. - Elsevier, ISSN 0167-9473. - Vol. 77.2014, C, p. 25-37
Publisher: | Elsevier
Subject: | Least-squares regression | Outlier removal | Robust regression | Face recognition |
Similar items by subject
- Robust distances for outlier-free goodness-of-fit testing / Cerioli, Andrea (2013)
- A Note on an Estimation Problem in Models with Adaptive Learning / Christopeit, Norbert (2013)
- A new algorithm for fixed design regression and denoising / Comte, F. (2004)
Similar items by person