.\" Text automatically generated by txt2man
.TH mlpack_lars 1 "12 December 2020" "mlpack-3.4.2" "User Commands"
.SH NAME
\fBmlpack_lars\fP \- lars
.SH SYNOPSIS
.nf
.fam C
\fBmlpack_lars\fP [\fB-i\fP \fIstring\fP] [\fB-m\fP \fIunknown\fP] [\fB-l\fP \fIdouble\fP] [\fB-L\fP \fIdouble\fP] [\fB-r\fP \fIstring\fP] [\fB-t\fP \fIstring\fP] [\fB-c\fP \fIbool\fP] [\fB-V\fP \fIbool\fP] [\fB-M\fP \fIunknown\fP] [\fB-o\fP \fIstring\fP] [\fB-h\fP \fB-v\fP]
.fam T
.fi
.SH DESCRIPTION
An implementation of LARS: Least Angle Regression (Stagewise/laSso). This is a stage-wise homotopy-based algorithm for L1-regularized linear regression (LASSO) and L1+L2-regularized linear regression (Elastic Net).
.PP
This program is able to train a LARS/LASSO/Elastic Net model or load a model from file, output regression predictions for a test set, and save the trained model to a file. The LARS algorithm is described in more detail below:
.PP
Let X be a matrix where each row is a point and each column is a dimension, and let y be a vector of targets.
.PP
The Elastic Net problem is to solve
.PP
.nf
.fam C
min_beta 0.5 || X * beta - y ||_2^2 + lambda_1 ||beta||_1 +
    0.5 lambda_2 ||beta||_2^2
.fam T
.fi
.PP
If lambda_1 > 0 and lambda_2 = 0, the problem is the LASSO. If lambda_1 > 0 and lambda_2 > 0, the problem is the Elastic Net. If lambda_1 = 0 and lambda_2 > 0, the problem is ridge regression. If lambda_1 = 0 and lambda_2 = 0, the problem is unregularized linear regression.
.PP
For efficiency reasons, it is not recommended to use this algorithm with '\fB--lambda1\fP (\fB-l\fP)' = 0. In that case, use the 'linear_regression' program, which implements both unregularized linear regression and ridge regression.
.PP
To train a LARS/LASSO/Elastic Net model, the '\fB--input_file\fP (\fB-i\fP)' and '\fB--responses_file\fP (\fB-r\fP)' parameters must be given. The '\fB--lambda1\fP (\fB-l\fP)', '\fB--lambda2\fP (\fB-L\fP)', and '\fB--use_cholesky\fP (\fB-c\fP)' parameters control the training options.
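The objective above can be checked numerically. The following standalone Python sketch (an illustration only, not part of mlpack) evaluates the Elastic Net objective for a tiny hand-made example:

```python
# Minimal numeric illustration of the Elastic Net objective that
# mlpack_lars minimizes. Pure Python; X is given row-per-point,
# matching the convention stated in this manual page.

def elastic_net_objective(X, y, beta, lambda1, lambda2):
    # residual r = X * beta - y
    r = [sum(xij * bj for xij, bj in zip(row, beta)) - yi
         for row, yi in zip(X, y)]
    sq_loss = 0.5 * sum(ri * ri for ri in r)          # 0.5 ||X beta - y||_2^2
    l1 = lambda1 * sum(abs(b) for b in beta)          # lambda_1 ||beta||_1
    l2 = 0.5 * lambda2 * sum(b * b for b in beta)     # 0.5 lambda_2 ||beta||_2^2
    return sq_loss + l1 + l2

X = [[1.0, 0.0], [0.0, 1.0]]
y = [1.0, 2.0]
beta = [1.0, 2.0]

# beta reproduces y exactly, so with lambda2 = 0 (the LASSO case)
# only the L1 penalty contributes: 0.4 * (|1| + |2|).
print(elastic_net_objective(X, y, beta, 0.4, 0.0))
```

Setting lambda1 = 0 here recovers the ridge-regression objective, matching the case analysis above.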
A trained model can be saved with the '\fB--output_model_file\fP (\fB-M\fP)' output parameter. If no training is desired at all, a model can be passed via the '\fB--input_model_file\fP (\fB-m\fP)' parameter.
.PP
The program can also provide predictions for test data using either the trained model or the given input model. Test points can be specified with the '\fB--test_file\fP (\fB-t\fP)' parameter. Predicted responses to the test points can be saved with the '\fB--output_predictions_file\fP (\fB-o\fP)' output parameter.
.PP
For example, the following command trains a model on the data 'data.csv' and responses 'responses.csv' with lambda1 set to 0.4 and lambda2 set to 0 (so, LASSO is being solved), and then saves the model to 'lasso_model.bin':
.PP
.nf
.fam C
$ \fBmlpack_lars\fP \fB--input_file\fP data.csv \fB--responses_file\fP responses.csv \fB--lambda1\fP 0.4 \fB--lambda2\fP 0 \fB--output_model_file\fP lasso_model.bin
.fam T
.fi
.PP
The following command uses the 'lasso_model.bin' model to provide predicted responses for the data 'test.csv' and save those responses to 'test_predictions.csv':
.PP
.nf
.fam C
$ \fBmlpack_lars\fP \fB--input_model_file\fP lasso_model.bin \fB--test_file\fP test.csv \fB--output_predictions_file\fP test_predictions.csv
.fam T
.fi
.SH OPTIONAL INPUT OPTIONS
.TP
.B \fB--help\fP (\fB-h\fP) [\fIbool\fP]
Default help info.
.TP
.B \fB--info\fP [\fIstring\fP]
Print help on a specific option. Default value ''.
.TP
.B \fB--input_file\fP (\fB-i\fP) [\fIstring\fP]
Matrix of covariates (X).
.TP
.B \fB--input_model_file\fP (\fB-m\fP) [\fIunknown\fP]
Trained LARS model to use.
.TP
.B \fB--lambda1\fP (\fB-l\fP) [\fIdouble\fP]
Regularization parameter for l1-norm penalty. Default value 0.
.TP
.B \fB--lambda2\fP (\fB-L\fP) [\fIdouble\fP]
Regularization parameter for l2-norm penalty. Default value 0.
.TP
.B \fB--responses_file\fP (\fB-r\fP) [\fIstring\fP]
Matrix of responses/observations (y).
.TP
.B \fB--test_file\fP (\fB-t\fP) [\fIstring\fP]
Matrix containing points to regress on (test points).
.TP
.B \fB--use_cholesky\fP (\fB-c\fP) [\fIbool\fP]
Use Cholesky decomposition during computation rather than explicitly computing the full Gram matrix.
.TP
.B \fB--verbose\fP (\fB-v\fP) [\fIbool\fP]
Display informational messages and the full list of parameters and timers at the end of execution.
.TP
.B \fB--version\fP (\fB-V\fP) [\fIbool\fP]
Display the version of mlpack.
.SH OPTIONAL OUTPUT OPTIONS
.TP
.B \fB--output_model_file\fP (\fB-M\fP) [\fIunknown\fP]
Output LARS model.
.TP
.B \fB--output_predictions_file\fP (\fB-o\fP) [\fIstring\fP]
If \fB--test_file\fP is specified, this file is where the predicted responses will be saved.
.SH ADDITIONAL INFORMATION
For further information, including relevant papers, citations, and theory, consult the documentation found at http://www.mlpack.org or included with your distribution of mlpack.