.TH "sc::QNewtonOpt" 3 "Sun Oct 4 2020" "Version 2.3.1" "MPQC" \" -*- nroff -*- .ad l .nh .SH NAME sc::QNewtonOpt \- The \fBQNewtonOpt\fP implements a quasi-Newton optimization scheme\&. .SH SYNOPSIS .br .PP .PP \fC#include \fP .PP Inherits \fBsc::Optimize\fP\&. .SS "Public Member Functions" .in +1c .ti -1c .RI "\fBQNewtonOpt\fP (const \fBRef\fP< \fBKeyVal\fP > &)" .br .RI "The \fBKeyVal\fP constructor\&. " .ti -1c .RI "\fBQNewtonOpt\fP (\fBStateIn\fP &)" .br .ti -1c .RI "void \fBsave_data_state\fP (\fBStateOut\fP &)" .br .RI "Save the base classes (with save_data_state) and the members in the same order that the \fBStateIn\fP CTOR initializes them\&. " .ti -1c .RI "void \fBapply_transform\fP (const \fBRef\fP< \fBNonlinearTransform\fP > &)" .br .ti -1c .RI "void \fBinit\fP ()" .br .RI "Initialize the optimizer\&. " .ti -1c .RI "int \fBupdate\fP ()" .br .RI "Take a step\&. " .in -1c .SS "Protected Attributes" .in +1c .ti -1c .RI "double \fBmaxabs_gradient\fP" .br .ti -1c .RI "double \fBaccuracy_\fP" .br .ti -1c .RI "\fBRefSymmSCMatrix\fP \fBihessian_\fP" .br .ti -1c .RI "\fBRef\fP< \fBHessianUpdate\fP > \fBupdate_\fP" .br .ti -1c .RI "\fBRef\fP< \fBLineOpt\fP > \fBlineopt_\fP" .br .ti -1c .RI "int \fBtake_newton_step_\fP" .br .ti -1c .RI "int \fBprint_hessian_\fP" .br .ti -1c .RI "int \fBprint_x_\fP" .br .ti -1c .RI "int \fBprint_gradient_\fP" .br .ti -1c .RI "int \fBlinear_\fP" .br .ti -1c .RI "int \fBrestrict_\fP" .br .ti -1c .RI "int \fBdynamic_grad_acc_\fP" .br .ti -1c .RI "int \fBforce_search_\fP" .br .ti -1c .RI "int \fBrestart_\fP" .br .in -1c .SS "Additional Inherited Members" .SH "Detailed Description" .PP The \fBQNewtonOpt\fP implements a quasi-Newton optimization scheme\&. .SH "Constructor & Destructor Documentation" .PP .SS "sc::QNewtonOpt::QNewtonOpt (const \fBRef\fP< \fBKeyVal\fP > &)" .PP The \fBKeyVal\fP constructor\&. The \fBKeyVal\fP constructor reads the following keywords: .IP "\fB\fCupdate\fP\fP" 1c This gives a \fBHessianUpdate\fP object\&. The default is to not update the hessian\&. .PP .IP "\fB\fChessian\fP\fP" 1c By default, the guess hessian is obtained from the \fBFunction\fP object\&. This keyword specifies an lower triangle array (the second index must be less than or equal to than the first) that replaces the guess hessian\&. If some of the elements are not given, elements from the guess hessian will be used\&. .PP .IP "\fB\fClineopt\fP\fP" 1c This gives a \fBLineOpt\fP object for doing line optimizations in the Newton direction\&. The default is to skip the line optimizations\&. .PP .IP "\fB\fCaccuracy\fP\fP" 1c The accuracy with which the first gradient will be computed\&. If this is too large, it may be necessary to evaluate the first gradient point twice\&. If it is too small, it may take longer to evaluate the first point\&. The default is 0\&.0001\&. .PP .IP "\fB\fCprint_x\fP\fP" 1c If true, print the coordinates each iteration\&. The default is false\&. .PP .IP "\fB\fCprint_gradient\fP\fP" 1c If true, print the gradient each iteration\&. The default is false\&. .PP .IP "\fB\fCprint_hessian\fP\fP" 1c If true, print the approximate hessian each iteration\&. The default is false\&. .PP .IP "\fB\fCrestrict\fP\fP" 1c Use step size restriction when not using a line search\&. The default is true\&. .PP .PP .SH "Member Function Documentation" .PP .SS "void sc::QNewtonOpt::save_data_state (\fBStateOut\fP &)\fC [virtual]\fP" .PP Save the base classes (with save_data_state) and the members in the same order that the \fBStateIn\fP CTOR initializes them\&. 
.SH "Member Function Documentation"
.PP
.SS "void sc::QNewtonOpt::save_data_state (\fBStateOut\fP &)\fC [virtual]\fP"
.PP
Save the base classes (with save_data_state) and the members in the same order that the \fBStateIn\fP CTOR initializes them\&. This must be implemented by the derived class if the class has data\&.
.PP
Reimplemented from \fBsc::Optimize\fP\&.
.SS "int sc::QNewtonOpt::update ()\fC [virtual]\fP"
.PP
Take a step\&. Returns 1 if the optimization has converged, otherwise 0\&.
.PP
Implements \fBsc::Optimize\fP\&.
.SH "Author"
.PP
Generated automatically by Doxygen for MPQC from the source code\&.