Nonlinear Least Squares Model Estimation - Quick Tab
Select the Quick tab of the Nonlinear Least Squares Model Estimation dialog box to access the options described here.
Element Name | Description |
---|---|
Estimation method | The Estimation method box contains two options: Levenberg-Marquardt and Gauss-Newton. Both methods are illustrated in the code sketches following this table. |
Levenberg-Marquardt | For most applications, the default Levenberg-Marquardt (nonlinear least squares) method yields the best performance; that is, it is usually the most efficient method and the fastest to converge. The algorithm is described in detail in Moré (1977). In summary, it is a modification (improvement) of the Gauss-Newton algorithm. Like the Gauss-Newton algorithm, when the least-squares loss function is used, it does not need to compute (or approximate) second-order partial derivatives to find the least-squares parameter estimates; instead, at each iteration it solves a set of linear equations to compute the parameter update (step). |
Gauss-Newton | This method is an implementation of the classic Gauss-Newton algorithm for solving nonlinear least squares regression problems; e.g., see Dennis and Schnabel (1983). |
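For readers who want to experiment outside the dialog, the following is a minimal sketch of a Levenberg-Marquardt fit using SciPy's `scipy.optimize.least_squares`, whose `method="lm"` option wraps the MINPACK implementation described in Moré (1977). The two-parameter exponential model and the synthetic data are illustrative assumptions, not part of this dialog or of the product's internal routine.

```python
# Minimal Levenberg-Marquardt sketch (illustrative only; not the product's
# internal routine). Assumes NumPy and SciPy are installed.
import numpy as np
from scipy.optimize import least_squares

# Synthetic data for the assumed model y = b0 * exp(b1 * x).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y = 2.5 * np.exp(1.3 * x) + rng.normal(scale=0.2, size=x.size)

def residuals(b, x, y):
    """Residual vector r_i = y_i - f(x_i; b); the sum of r_i**2 is minimized."""
    return y - b[0] * np.exp(b[1] * x)

# method="lm" selects the MINPACK Levenberg-Marquardt solver.
fit = least_squares(residuals, x0=[1.0, 1.0], args=(x, y), method="lm")
print(fit.x)     # estimated parameters (b0, b1)
print(fit.cost)  # 0.5 * sum of squared residuals at the solution
```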
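For comparison, here is a bare-bones Gauss-Newton iteration for the same assumed exponential model. It is a pedagogical sketch rather than the implementation discussed by Dennis and Schnabel (1983): it omits damping, step-size control, and convergence checks, which are exactly the safeguards the Levenberg-Marquardt modification adds.

```python
# Bare-bones Gauss-Newton loop for the assumed model y = b0 * exp(b1 * x).
# Illustrative only: no damping, line search, or convergence test.
import numpy as np

def gauss_newton(x, y, b_init, n_iter=20):
    b = np.asarray(b_init, dtype=float)
    for _ in range(n_iter):
        r = y - b[0] * np.exp(b[1] * x)   # residual vector
        # Jacobian of the residuals with respect to (b[0], b[1]).
        J = np.column_stack((-np.exp(b[1] * x),
                             -b[0] * x * np.exp(b[1] * x)))
        # Each iteration solves a linear system for the parameter update:
        # (J^T J) delta = -J^T r, then b <- b + delta.
        delta = np.linalg.solve(J.T @ J, -J.T @ r)
        b = b + delta
    return b

# Usage with the x, y arrays from the previous sketch:
# print(gauss_newton(x, y, [1.0, 1.0]))
```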