.\"Text automatically generated by txt2man .TH lars "1" "" "" .SH NAME \fBlars \fP- lars .SH SYNOPSIS .nf .fam C \fBlars\fP [\fB-h\fP] [\fB-v\fP] \fB-i\fP \fIstring\fP \fB-r\fP \fIstring\fP [\fB-l\fP \fIdouble\fP] [\fB-L\fP \fIdouble\fP] [\fB-o\fP \fIstring\fP] [\fB-c\fP] \fB-V\fP .fam T .fi .fam T .fi .SH DESCRIPTION An implementation of LARS: Least Angle Regression (Stagewise/laSso). This is a stage-wise homotopy-based algorithm for L1-regularized linear regression (LASSO) and L1+L2-regularized linear regression (Elastic Net). .PP Let X be a matrix where each row is a point and each column is a dimension, and let y be a vector of targets. .PP The Elastic Net problem is to solve .PP .nf .fam C min_beta 0.5 || X * beta - y ||_2^2 + lambda_1 ||beta||_1 + 0.5 lambda_2 ||beta||_2^2 .fam T .fi If lambda_1 > 0 and lambda_2 = 0, the problem is the LASSO. If lambda_1 > 0 and lambda_2 > 0, the problem is the Elastic Net. If lambda_1 = 0 and lambda_2 > 0, the problem is ridge regression. If lambda_1 = 0 and lambda_2 = 0, the problem is unregularized linear regression. .PP For efficiency reasons, it is not recommended to use this algorithm with lambda_1 = 0. In that case, use the 'linear_regression' program, which implements both unregularized linear regression and ridge regression. .RE .PP .SH REQUIRED OPTIONS .TP .B \fB--input_file\fP (\fB-i\fP) [\fIstring\fP] File containing covariates (X). .TP .B \fB--responses_file\fP (\fB-r\fP) [\fIstring\fP] File containing y (responses/observations). .SH OPTIONS .TP .B \fB--help\fP (\fB-h\fP) Default help info. .TP .B \fB--info\fP [\fIstring\fP] Get help on a specific module or option. Default value ''. .TP .B \fB--lambda1\fP (\fB-l\fP) [\fIdouble\fP] Regularization parameter for l1-norm penalty. Default value 0. .TP .B \fB--lambda2\fP (\fB-L\fP) [\fIdouble\fP] Regularization parameter for l2-norm penalty. Default value 0. .TP .B \fB--output_file\fP (\fB-o\fP) [\fIstring\fP] File to save beta (linear estimator) to. Default value 'output.csv'. .TP .B \fB--use_cholesky\fP (\fB-c\fP) Use Cholesky decomposition during computation rather than explicitly computing the full Gram matrix. .TP .B \fB--verbose\fP (\fB-v\fP) Display informational messages and the full list of parameters and timers at the end of execution. .TP .B \fB--version\fP (\fB-V\fP) Display the version of mlpack. .SH ADDITIONAL INFORMATION For further information, including relevant papers, citations, and theory, consult the documentation found at http://www.mlpack.org or included with your distribution of MLPACK.