Nonlinear Estimation Procedures - Least Squares Estimation
Some of the more common nonlinear regression models are reviewed in Common Nonlinear Regression Models. The question then arises as to how these models are estimated. If you are familiar with linear regression techniques (as described in Multiple Regression) or analysis of variance (ANOVA) techniques (as described in ANOVA/MANOVA), then you may be aware that all of those methods use so-called least squares estimation procedures. In the most general terms, least squares estimation seeks the parameter values that minimize the sum of squared deviations of the observed values of the dependent variable from those predicted by the model. (The term least squares was first used by Legendre, 1805.)
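Stated formally, if $y_i$ denotes the $i$-th observed value of the dependent variable and $\hat{y}_i$ the corresponding value predicted by the model, the least-squares criterion chooses the parameter estimates that minimize the loss

$$L = \sum_{i=1}^{n} \left(y_i - \hat{y}_i\right)^2$$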
- Nonlinear least-squares: The Levenberg-Marquardt algorithm
- When the least-squares criterion is used, the very efficient Levenberg-Marquardt algorithm (Levenberg, 1944; Marquardt, 1963; see also Moré, 1977, for a detailed description of this algorithm) can be used to estimate the parameters for arbitrary linear and nonlinear regression problems, and it is the recommended method for fitting nonlinear models to large data sets. See also Levenberg-Marquardt Algorithm (Nonlinear Least Squares) for details concerning this algorithm. Note that small differences can occur between the estimates produced by the Levenberg-Marquardt algorithm and those obtained by minimizing a custom least squares loss function [(obs-pred)**2] with a general-purpose minimizer; however, the two approaches will generally yield results that agree to the first 5-8 significant digits. A minimal sketch of this style of fitting appears below.
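As an illustration only, the following minimal sketch fits a nonlinear model by Levenberg-Marquardt least squares using SciPy's least_squares routine. It is not the implementation described above; the exponential growth model, starting values, and simulated data are hypothetical.

```python
# A minimal sketch of nonlinear least-squares fitting with the
# Levenberg-Marquardt algorithm, using SciPy rather than the
# implementation described in this documentation.
import numpy as np
from scipy.optimize import least_squares

def model(params, x):
    """Hypothetical exponential growth model: y = b0 * exp(b1 * x)."""
    b0, b1 = params
    return b0 * np.exp(b1 * x)

def residuals(params, x, y_obs):
    """Deviations of observed values from model predictions (obs - pred)."""
    return y_obs - model(params, x)

# Simulated data from known parameters plus noise (for illustration only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 50)
y_obs = 2.0 * np.exp(1.3 * x) + rng.normal(scale=0.1, size=x.size)

# method='lm' selects the Levenberg-Marquardt algorithm, which minimizes
# the sum of squared residuals, sum((obs - pred)**2).
fit = least_squares(residuals, x0=[1.0, 1.0], args=(x, y_obs), method='lm')
print("Estimated parameters:", fit.x)
```

The solver is given the residual function rather than the summed loss; squaring and summing the deviations is handled internally, which is what makes the Levenberg-Marquardt method specific to the least-squares criterion.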