Overview
Logistic regression is used for modeling binary outcome variables such as credit default or warranty claims.
The binary response, Y, is assumed to take the values 0 and 1, with 0 representing failure and 1 representing success.
Overview of Logistic Regression Model
The logistic regression function models the probability that the binary response equals 1 (success) as a function of a set of predictor variables x_i and regression coefficients \beta, as given by:

\pi_i = P(Y_i = 1 \mid x_i) = \frac{\exp(x_i^T \beta)}{1 + \exp(x_i^T \beta)}

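As a quick illustration of the formula above, here is a minimal NumPy sketch (the function name is illustrative, not from the source):

```python
import numpy as np

def logistic_probability(x, beta):
    """P(Y = 1 | x) under the logistic model: exp(x'beta) / (1 + exp(x'beta))."""
    eta = np.dot(x, beta)              # linear predictor x' beta
    return 1.0 / (1.0 + np.exp(-eta))  # algebraically identical form of the formula
```

With beta = 0 the linear predictor is zero, so every observation receives probability 0.5; the probability rises toward 1 as x'beta grows.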
In practice, the regression coefficients are unknown and are estimated by maximizing the likelihood function, or equivalently its logarithm. Note that the w_i below are case weights and are assumed to be positive; all observations with a case weight less than or equal to zero are excluded from the analysis and from all subsequent results. The log-likelihood is:

\ell(\beta) = \sum_{i=1}^{n} w_i \left[ y_i \log \pi_i + (1 - y_i) \log(1 - \pi_i) \right]

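A minimal NumPy sketch of the weighted log-likelihood, including the exclusion rule for non-positive case weights (the function name is illustrative):

```python
import numpy as np

def weighted_log_likelihood(beta, X, y, w):
    """Weighted log-likelihood of a logistic regression model.

    X : (n, p) design matrix, y : (n,) 0/1 responses,
    w : (n,) case weights; non-positive weights are excluded,
    mirroring the rule described above.
    """
    keep = w > 0                          # drop cases with w_i <= 0
    X, y, w = X[keep], y[keep], w[keep]
    eta = X @ beta                        # linear predictor x_i' beta
    # y*log(pi) + (1-y)*log(1-pi) simplifies to y*eta - log(1 + e^eta)
    log1p_exp = np.logaddexp(0.0, eta)    # numerically stable log(1 + e^eta)
    return np.sum(w * (y * eta - log1p_exp))
```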
The maximization of the likelihood is achieved by an iterative method called Fisher scoring. Fisher scoring is similar to the Newton-Raphson procedure except that the Hessian matrix (the matrix of second-order partial derivatives) is replaced with its expected value.
The Fisher scoring update formula for the regression coefficients is given by:

\beta^{(t+1)} = \beta^{(t)} + I\left(\beta^{(t)}\right)^{-1} U\left(\beta^{(t)}\right)

where U is the score vector and I is the information matrix defined below.

The algorithm completes when the convergence criterion is satisfied or when the maximum number of iterations has been reached. Convergence is obtained when the change in the log-likelihood from one iteration to the next is small. With the default convergence criterion of 1e-7, convergence is obtained when:

\left| \ell\left(\beta^{(t+1)}\right) - \ell\left(\beta^{(t)}\right) \right| < 10^{-7}

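The full iteration described above can be sketched in NumPy as follows (a minimal illustration, not the source's implementation; function and variable names are assumptions):

```python
import numpy as np

def fisher_scoring(X, y, w, tol=1e-7, max_iter=25):
    """Estimate logistic regression coefficients by Fisher scoring.

    X : (n, p) design matrix, y : (n,) 0/1 responses,
    w : (n,) case weights (non-positive weights are excluded).
    Stops when the log-likelihood changes by less than tol,
    or after max_iter iterations.
    """
    keep = w > 0
    X, y, w = X[keep], y[keep], w[keep]
    beta = np.zeros(X.shape[1])
    ll_old = -np.inf
    for _ in range(max_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))                      # fitted probabilities
        ll = np.sum(w * (y * eta - np.logaddexp(0.0, eta)))  # log-likelihood
        if abs(ll - ll_old) < tol:                           # convergence criterion
            break
        ll_old = ll
        score = X.T @ (w * (y - pi))                         # score vector U(beta)
        info = X.T @ (X * (w * pi * (1 - pi))[:, None])      # information matrix I(beta)
        beta = beta + np.linalg.solve(info, score)           # Fisher scoring update
    return beta
```

At the returned estimates the score equations are (numerically) zero, which is a convenient check that the iteration converged.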
In the equations below, x_i denotes the vector of predictor values for the i-th observation (including a 1 for the intercept term, if present), y_i its observed response, w_i its case weight, and \pi_i its fitted probability evaluated at the current coefficient estimates.

The score vector is given by:

U(\beta) = \sum_{i=1}^{n} w_i \left( y_i - \pi_i \right) x_i

The information matrix is given by:

I(\beta) = \sum_{i=1}^{n} w_i \, \pi_i (1 - \pi_i) \, x_i x_i^T

The asymptotic estimated covariance matrix of the estimated coefficients is given by:

\widehat{\operatorname{Cov}}\left(\hat{\beta}\right) = I\left(\hat{\beta}\right)^{-1}

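A self-contained NumPy sketch of this last step, inverting the information matrix to obtain the coefficient covariance and standard errors (names illustrative):

```python
import numpy as np

def coef_covariance(X, w, pi):
    """Estimated asymptotic covariance of the fitted coefficients:
    the inverse of the information matrix at the estimates.

    X : (n, p) design matrix, w : (n,) positive case weights,
    pi : (n,) fitted probabilities at the estimated coefficients.
    """
    # I(beta-hat) = sum_i w_i * pi_i * (1 - pi_i) * x_i x_i'
    info = X.T @ (X * (w * pi * (1 - pi))[:, None])
    cov = np.linalg.inv(info)
    se = np.sqrt(np.diag(cov))  # standard errors = sqrt of diagonal entries
    return cov, se
```

The standard errors from the diagonal are what feed the usual Wald z-tests and confidence intervals for the coefficients.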