Please use this identifier to cite or link to this item: http://localhost/handle/Hannan/537102
Title: Convergence of the Huber Regression M-Estimate in the Presence of Dense Outliers
Authors: Tsakonas, Efthymios; Jaldén, Joakim; Sidiropoulos, Nicholas; Ottersten, Bjorn (ACCESS Linnaeus Centre, Royal Institute of Technology, Stockholm, Sweden)
Subject: Gaussian processes; matrix algebra; regression analysis; Huber penalty function; Huber regression M-estimate convergence; additive outliers; constant linear fraction; contaminated measurements; dense outliers; deterministic unknown vector; linear transformation; measurement matrix; noisy measurements; sought vector; unknown distribution; Convergence; Electric breakdown; Linear regression; Pollution measurement; Robustness; Standards; Vectors; Breakdown point (BP); Huber estimator; performance analysis
Year: 2014
Publisher: IEEE
Abstract: We consider the problem of estimating a deterministic unknown vector that depends linearly on n noisy measurements, additionally contaminated with (possibly unbounded) additive outliers. The measurement matrix of the model (i.e., the matrix involved in the linear transformation of the sought vector) is assumed known and composed of i.i.d. standard Gaussian entries. The outlier variables are assumed independent of the measurement matrix, and may be deterministic or random with possibly unknown distribution. Under these assumptions we provide a simple proof that the minimizer of the Huber penalty function of the residuals converges to the true parameter vector at a √n rate, even when outliers are dense, in the sense that a constant linear fraction of the measurements is contaminated, and this fraction can be arbitrarily close to one. The constants influencing the rate of convergence are shown to depend explicitly on the outlier contamination level.
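The following is a minimal numerical sketch (not the authors' code) of the setting described in the abstract: measurements y = A x* + v + o with a known i.i.d. standard Gaussian matrix A, nominal noise v, and a constant fraction of rows hit by large additive outliers o, where the estimate minimizes the Huber penalty of the residuals. The specific parameter values (Huber threshold delta, contamination fraction eps, outlier scale) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def huber(r, delta=1.0):
    """Huber penalty applied elementwise to the residuals r."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def huber_estimate(A, y, delta=1.0):
    """Minimize sum_i huber(y_i - a_i^T x) over x (convex; solved here with BFGS)."""
    def loss(x):
        return huber(y - A @ x, delta).sum()
    def grad(x):
        r = y - A @ x
        psi = np.clip(r, -delta, delta)   # derivative of the Huber penalty
        return -A.T @ psi
    x_init = np.zeros(A.shape[1])
    return minimize(loss, x_init, jac=grad, method="BFGS").x

p, eps = 5, 0.7                 # parameter dimension and outlier fraction (illustrative)
x_true = rng.standard_normal(p)

for n in (200, 800, 3200):
    A = rng.standard_normal((n, p))             # known i.i.d. standard Gaussian measurement matrix
    v = 0.1 * rng.standard_normal(n)            # nominal (inlier) noise
    mask = rng.random(n) < eps                  # contaminated rows: a constant linear fraction
    o = mask * rng.standard_normal(n) * 100.0   # large additive outliers, independent of A
    y = A @ x_true + v + o
    err = np.linalg.norm(huber_estimate(A, y) - x_true)
    print(f"n={n:5d}  ||x_hat - x_true|| = {err:.4f}  sqrt(n)*err = {np.sqrt(n) * err:.2f}")
```

Under the paper's result one would expect the estimation error to shrink roughly like 1/√n, so the √n-scaled error printed above should remain of roughly constant order as n grows, with a constant that deteriorates as the contamination fraction eps approaches one.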
URI: http://localhost/handle/Hannan/257417
http://localhost/handle/Hannan/537102
ISSN: 1070-9908
Volume: 21
Issue: 10
Appears in Collections: 2014

Files in This Item:
6828704.pdf (1.19 MB, Adobe PDF)