Penalized Methods for High-dimensional Least Absolute Deviations Regression

Title: Penalized Methods for High-dimensional Least Absolute Deviations Regression
Author: Xiaoli Gao
Publisher:
Pages: 236
Release: 2008
Genre: Least absolute deviations (Statistics)
ISBN:


Robust Penalized Regression for Complex High-dimensional Data

Title: Robust Penalized Regression for Complex High-dimensional Data
Author: Bin Luo
Publisher:
Pages: 169
Release: 2020
Genre: Dimensional analysis
ISBN:


"Robust high-dimensional data analysis has become an important and challenging task in complex Big Data analysis due to the high-dimensionality and data contamination. One of the most popular procedures is the robust penalized regression. In this dissertation, we address three typical robust ultra-high dimensional regression problems via penalized regression approaches. The first problem is related to the linear model with the existence of outliers, dealing with the outlier detection, variable selection and parameter estimation simultaneously. The second problem is related to robust high-dimensional mean regression with irregular settings such as the data contamination, data asymmetry and heteroscedasticity. The third problem is related to robust bi-level variable selection for the linear regression model with grouping structures in covariates. In Chapter 1, we introduce the background and challenges by overviews of penalized least squares methods and robust regression techniques. In Chapter 2, we propose a novel approach in a penalized weighted least squares framework to perform simultaneous variable selection and outlier detection. We provide a unified link between the proposed framework and a robust M-estimation in general settings. We also establish the non-asymptotic oracle inequalities for the joint estimation of both the regression coefficients and weight vectors. In Chapter 3, we establish a framework of robust estimators in high-dimensional regression models using Penalized Robust Approximated quadratic M estimation (PRAM). This framework allows general settings such as random errors lack of symmetry and homogeneity, or covariates are not sub-Gaussian. Theoretically, we show that, in the ultra-high dimension setting, the PRAM estimator has local estimation consistency at the minimax rate enjoyed by the LS-Lasso and owns the local oracle property, under certain mild conditions. In Chapter 4, we extend the study in Chapter 3 to robust high-dimensional data analysis with structured sparsity. In particular, we propose a framework of high-dimensional M-estimators for bi-level variable selection. This framework encourages bi-level sparsity through a computationally efficient two-stage procedure. It produces strong robust parameter estimators if some nonconvex redescending loss functions are applied. In theory, we provide sufficient conditions under which our proposed two-stage penalized M-estimator possesses simultaneous local estimation consistency and the bi-level variable selection consistency, if a certain nonconvex penalty function is used at the group level. The performances of the proposed estimators are demonstrated in both simulation studies and real examples. In Chapter 5, we provide some discussions and future work."--Abstract from author supplied metadata

Least Absolute Deviation Regression Theory and Methods

Title: Least Absolute Deviation Regression Theory and Methods
Author: S. Eakambaram
Publisher: LAP Lambert Academic Publishing
Pages: 120
Release: 2011-10
Genre:
ISBN: 9783846508565


This monograph covers introductory material, basic concepts, and a brief review of regression theory. It presents work on Least Absolute Deviations (LAD) regression and its estimation theory with and without autocorrelated errors. LAD and least squares estimation of the censored regression model with fixed and marginal effects are also discussed. Further, it covers LAD estimation for linear and nonlinear regression models with truncated and censored data.
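As a quick illustration of the basic LAD estimator the monograph builds on (a sketch with made-up data, not code from the book): LAD regression is median regression, i.e., quantile regression at the 0.5 quantile, and can be compared directly with least squares under heavy-tailed errors.

    # Sketch: LAD (median) regression vs. OLS under heavy-tailed noise.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    x = rng.normal(size=(n, 2))
    # Heavy-tailed (t with 2 df) noise, where LAD is more robust than least squares.
    y = 1.0 + x @ np.array([2.0, -1.0]) + rng.standard_t(df=2, size=n)

    X = sm.add_constant(x)
    lad_fit = sm.QuantReg(y, X).fit(q=0.5)   # LAD = median (0.5-quantile) regression
    ols_fit = sm.OLS(y, X).fit()

    print("LAD coefficients:", lad_fit.params)
    print("OLS coefficients:", ols_fit.params)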

Sparse Polynomial Approximation of High-Dimensional Functions

Title: Sparse Polynomial Approximation of High-Dimensional Functions
Author: Ben Adcock
Publisher: SIAM
Pages: 310
Release: 2022-02-16
Genre: Mathematics
ISBN: 161197688X


Over seventy years ago, Richard Bellman coined the term “the curse of dimensionality” to describe phenomena and computational challenges that arise in high dimensions. These challenges, in tandem with the ubiquity of high-dimensional functions in real-world applications, have led to a lengthy, focused research effort on high-dimensional approximation—that is, the development of methods for approximating functions of many variables accurately and efficiently from data. This book provides an in-depth treatment of one of the latest installments in this long and ongoing story: sparse polynomial approximation methods. These methods have emerged as useful tools for various high-dimensional approximation tasks arising in a range of applications in computational science and engineering. It begins with a comprehensive overview of best s-term polynomial approximation theory for holomorphic, high-dimensional functions, as well as a detailed survey of applications to parametric differential equations. It then describes methods for computing sparse polynomial approximations, focusing on least squares and compressed sensing techniques. Sparse Polynomial Approximation of High-Dimensional Functions presents the first comprehensive and unified treatment of polynomial approximation techniques that can mitigate the curse of dimensionality in high-dimensional approximation, including least squares and compressed sensing. It develops main concepts in a mathematically rigorous manner, with full proofs given wherever possible, and it contains many numerical examples, each accompanied by downloadable code. The authors provide an extensive bibliography of over 350 relevant references, with an additional annotated bibliography available on the book’s companion website (www.sparse-hd-book.com). This text is aimed at graduate students, postdoctoral fellows, and researchers in mathematics, computer science, and engineering who are interested in high-dimensional polynomial approximation techniques.
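As a rough illustration of the compressed-sensing strategy described above (not the book's code; the one-dimensional Chebyshev dictionary, sample sizes, and Lasso tuning are all illustrative assumptions), a sparse polynomial coefficient vector can often be recovered from far fewer random samples than dictionary columns via l1-regularized least squares.

    # Sketch: recover a sparse polynomial from few random samples via the Lasso.
    import numpy as np
    from numpy.polynomial import chebyshev as C
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    N = 60    # size of the Chebyshev dictionary
    m = 25    # number of random samples (m << N)
    s = 4     # sparsity of the true coefficient vector

    coef_true = np.zeros(N)
    support = rng.choice(N, size=s, replace=False)
    coef_true[support] = rng.normal(size=s)

    x = rng.uniform(-1.0, 1.0, size=m)     # random sample points in [-1, 1]
    A = C.chebvander(x, N - 1)             # m x N matrix of Chebyshev polynomial values
    y = A @ coef_true                      # noiseless samples of the sparse polynomial

    # l1-regularized least squares (Lasso) as a stand-in for a compressed sensing solver.
    fit = Lasso(alpha=1e-3, fit_intercept=False, max_iter=100000).fit(A, y)
    print("coefficient recovery error:", np.linalg.norm(fit.coef_ - coef_true))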

Penalized Methods and Algorithms for High-dimensional Regression in the Presence of Heterogeneity

Title: Penalized Methods and Algorithms for High-dimensional Regression in the Presence of Heterogeneity
Author: Congrui Yi
Publisher:
Pages: 98
Release: 2016
Genre: Algorithms
ISBN:


In fields such as statistics, economics, and biology, heterogeneity is an important topic concerning the validity of data inference and the discovery of hidden patterns. This thesis focuses on penalized methods for regression analysis in the presence of heterogeneity in a potentially high-dimensional setting. Two possible strategies for dealing with heterogeneity are: robust regression methods that provide heterogeneity-resistant coefficient estimation, and direct detection of heterogeneity while simultaneously estimating the coefficients accurately. We consider the first strategy for two robust regression methods, Huber loss regression and quantile regression with Lasso or Elastic-Net penalties, which have been studied theoretically but lack efficient algorithms. We propose a new algorithm, Semismooth Newton Coordinate Descent, to solve them. The algorithm is a novel combination of the semismooth Newton algorithm and coordinate descent that applies to penalized optimization problems with both a nonsmooth loss and a nonsmooth penalty. We prove its convergence properties and show its computational efficiency through numerical studies. We also propose a nonconvex penalized regression method, Heterogeneity Discovery Regression (HDR), as a realization of the second idea. We establish theoretical results that guarantee statistical precision for any local optimum of the objective function with high probability. We also compare the numerical performance of HDR with competitors including Huber loss regression, quantile regression, and least squares through simulation studies and a real data example. In these experiments, HDR detects heterogeneity accurately and largely outperforms the competitors in terms of coefficient estimation and variable selection.
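For a concrete sense of the estimators involved, here is a minimal sketch of lasso-penalized median regression using scikit-learn's generic solver. This is not the Semismooth Newton Coordinate Descent algorithm proposed in the thesis, and all data and the penalty level are made up for illustration.

    # Sketch: L1-penalized quantile (median) regression with a generic solver.
    import numpy as np
    from sklearn.linear_model import QuantileRegressor

    rng = np.random.default_rng(2)
    n, p = 150, 50
    X = rng.normal(size=(n, p))
    beta = np.zeros(p)
    beta[:3] = [1.5, -2.0, 1.0]                  # sparse true coefficients
    y = X @ beta + rng.standard_cauchy(size=n)   # heavy-tailed (Cauchy) errors

    # Median regression (pinball loss at 0.5) with an L1 penalty on the coefficients.
    model = QuantileRegressor(quantile=0.5, alpha=0.05).fit(X, y)
    print("selected coefficients:", np.flatnonzero(model.coef_))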

Statistical Foundations of Data Science

Title: Statistical Foundations of Data Science
Author: Jianqing Fan
Publisher: CRC Press
Pages: 752
Release: 2020-09-21
Genre: Mathematics
ISBN: 1466510854


Statistical Foundations of Data Science gives a thorough introduction to commonly used statistical models and to contemporary statistical machine learning techniques and algorithms, along with their mathematical insights and statistical theories. It aims to serve as a graduate-level textbook and a research monograph on high-dimensional statistics, sparsity and covariance learning, machine learning, and statistical inference. It includes ample exercises involving both theoretical studies and empirical applications. The book begins with an introduction to the stylized features of big data and their impact on statistical analysis. It then introduces multiple linear regression and expands the techniques of model building via nonparametric regression and kernel tricks. It provides a comprehensive account of sparsity exploration and model selection for multiple regression, generalized linear models, quantile regression, robust regression, and hazards regression, among others. High-dimensional inference is thoroughly addressed, as is feature screening. The book also gives a comprehensive account of high-dimensional covariance estimation, learning latent factors and hidden structures, and their applications to statistical estimation, inference, prediction, and machine learning problems. It also gives a thorough introduction to statistical machine learning theory and methods for classification, clustering, and prediction, including CART, random forests, boosting, support vector machines, clustering algorithms, sparse PCA, and deep learning.

Handbook of Quantile Regression

Title: Handbook of Quantile Regression
Author: Roger Koenker
Publisher: CRC Press
Pages: 463
Release: 2017-10-12
Genre: Mathematics
ISBN: 1498725295


Quantile regression constitutes an ensemble of statistical techniques intended to estimate and draw inferences about conditional quantile functions. Median regression, as introduced in the 18th century by Boscovich and Laplace, is a special case. In contrast to conventional mean regression, which minimizes sums of squared residuals, median regression minimizes sums of absolute residuals; quantile regression simply replaces the symmetric absolute loss by an asymmetric linear loss. Since its introduction in the 1970s by Koenker and Bassett, quantile regression has been gradually extended to a wide variety of data-analytic settings, including time series, survival analysis, and longitudinal data. By focusing attention on local slices of the conditional distribution of response variables, it is capable of providing a more complete, more nuanced view of heterogeneous covariate effects. Applications of quantile regression can now be found throughout the sciences, including astrophysics, chemistry, ecology, economics, finance, genomics, medicine, and meteorology. Software for quantile regression is now widely available in all the major statistical computing environments. The objective of this volume is to provide a comprehensive review of recent developments in quantile regression methodology, illustrating its applicability in a wide range of scientific settings. The intended audience of the volume is researchers and graduate students across a diverse set of disciplines.
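In symbols (standard notation, not text reproduced from the handbook), the asymmetric linear "check" loss and the resulting linear quantile regression estimator at level tau in (0, 1) are

    \rho_\tau(u) \;=\; u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
    \qquad
    \widehat{\beta}(\tau) \;=\; \arg\min_{\beta \in \mathbb{R}^p} \sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - x_i^{\top}\beta\bigr).

Setting tau = 1/2 gives rho_tau(u) = |u|/2, so the estimator reduces to median (LAD) regression, while other values of tau trace out the rest of the conditional distribution of the response.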