Least Angle Regression. Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. The Annals of Statistics, 32(2), April 2004. arXiv:math/0406456. TLDR: A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we obtain a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
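For reference, the Lasso criterion that property (1) refers to can be written, in standard notation (not quoted from the paper), as ordinary least squares under an L1 constraint:

    \hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta} \, \lVert y - X\beta \rVert_2^2
    \quad \text{subject to} \quad \sum_{j=1}^{p} \lvert \beta_j \rvert \le t

The modified LARS algorithm traces this solution for every value of the bound t in a single pass, which is where the order-of-magnitude saving over earlier Lasso algorithms comes from.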
@article{efron_least_2004,
	title = {Least {Angle} {Regression}},
	volume = {32},
	issn = {0090-5364},
	doi = {10.1214/009053604000000067},
	abstract = {The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we obtain a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.},
	language = {en},
	number = {2},
	urldate = {2023-06-28},
	journal = {The Annals of Statistics},
	author = {Efron, Bradley and Hastie, Trevor and Johnstone, Iain and Tibshirani, Robert},
	month = apr,
	year = {2004},
	note = {arXiv:math/0406456
TLDR: A publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates is described.},
	keywords = {\#Analysis, \#Statistics, /unread, 62J07. (Primary), Mathematics - Statistics Theory, ⭐⭐⭐},
}
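As a concrete illustration of property (1), the sketch below uses scikit-learn's lars_path (an independent reimplementation, not the authors' original code) to compute the full Lasso path on the diabetes data analyzed in the paper; the use of scikit-learn here is an assumption of this note, not something described in the entry itself.

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import lars_path

    # Diabetes data: 442 patients, 10 baseline covariates (the example data in the paper).
    X, y = load_diabetes(return_X_y=True)

    # method='lasso' applies the Lasso modification of LARS from property (1):
    # the entire piecewise-linear coefficient path is computed in one pass,
    # at roughly the cost of a single full least-squares fit.
    alphas, active, coefs = lars_path(X, y, method='lasso')

    print(coefs.shape)  # (10, n_breakpoints): one column of coefficients per path breakpoint
    print(active)       # order in which covariates entered the active set

Each column of coefs is one Lasso solution; sweeping across the columns reproduces the "all possible Lasso estimates for a given problem" referred to in the abstract.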
