SANN Overviews - Neural Network Tasks

Like most statistical models, neural networks are capable of performing several major tasks, including regression and classification. Regression tasks are concerned with relating a number of input variables x to a set of continuous outcomes t (target variables). By contrast, classification tasks assign class memberships to a categorical target variable given a set of input values. In the next section we will consider regression in more detail.

Regression and the Family of Nonparametric (Black-Box) Tools

The most straightforward approach to statistical inference is to assume that the data can be modeled using a closed functional form containing a number of adjustable parameters (weights), which are estimated so that the model best explains the data at hand. As an example, consider a regression problem in which we aim to model or approximate a single target variable t as a linear function of an input variable x. The mathematical function used to model such relationships is simply a linear transformation f with two parameters, namely the intercept a and slope b,

t = f(x) = a + bx

Our task is to find suitable values for a and b that relate an input x to the variable t. This problem is known as linear regression.
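As a minimal illustration (independent of SANN itself), the sketch below estimates the intercept a and slope b by ordinary least squares using numpy; the data are synthetic and the true values a = 2 and b = 0.5 are assumptions made purely for the example.

```python
import numpy as np

# Synthetic data: a noisy linear relationship between the input x and target t.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
t = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)

# Ordinary least-squares estimate of the parameters of the model t = a + b*x.
# np.polyfit returns coefficients from the highest degree down, hence (b, a).
b, a = np.polyfit(x, t, deg=1)
print(f"intercept a = {a:.3f}, slope b = {b:.3f}")
```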

Another example of parametric regression is the quadratic problem, where the input-output relationship is described by the quadratic form,

t = f(x) = a + bx + cx²
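The same least-squares machinery handles the quadratic model; the sketch below (again using numpy rather than SANN, with made-up data) simply adds one more adjustable parameter c for the x² term.

```python
import numpy as np

# Synthetic data following an assumed quadratic relationship t = a + b*x + c*x**2.
rng = np.random.default_rng(1)
x = np.linspace(-3.0, 3.0, 60)
t = 1.0 - 0.8 * x + 0.4 * x**2 + rng.normal(scale=0.2, size=x.size)

# np.polyfit returns coefficients from the x**2 term down to the constant.
c, b, a = np.polyfit(x, t, deg=2)
print(f"a = {a:.3f}, b = {b:.3f}, c = {c:.3f}")
```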

Figure: Schematic illustrating the difference between parametric and nonparametric models. In parametric models, the input-target relationship is described by a mathematical function of closed form. By contrast, in nonparametric models, the input-target relationship is governed by an approximator (such as a neural network) that cannot be represented by a standard mathematical function.

The examples above belong to the category of so-called parametric methods. They strictly rely on the assumption that t is related to x in a way that is known a priori, or that can be sufficiently approximated by a closed mathematical form, e.g., a line or a quadratic function. Once the mathematical function is chosen, all we have to do is adjust the parameters of the assumed model so that it best approximates (predicts) t given an instance of x.

By contrast, nonparametric models generally make no assumptions about the relationship between x and t. In other words, they assume that the true underlying function governing the relationship between x and t is not known a priori, hence the term black box. Instead, they attempt to discover a mathematical function (which often does not have a closed form) that can approximate the relationship between x and t sufficiently well. The most popular examples of nonparametric models are polynomial functions with adaptable parameters and neural networks.

Since no closed form for the relationship between x and t is assumed, a nonparametric method must be sufficiently flexible to model a wide spectrum of functional relationships. The higher the order of a polynomial, for example, the more flexible the model. Similarly, the more neurons a neural network has, the more flexible it becomes.
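To make the flexibility argument concrete, the sketch below fits two small feedforward networks of different sizes to the same nonlinear data. It uses scikit-learn's MLPRegressor purely as a generic stand-in for a neural network tool (SANN is not involved), and the sine-shaped data and layer sizes are assumptions chosen for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic nonlinear data: t follows a noisy sine curve in x.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 2 * np.pi, 200).reshape(-1, 1)
t = np.sin(x).ravel() + rng.normal(scale=0.1, size=x.shape[0])

# Two networks of different size: more hidden neurons give the model
# more flexibility to capture the curvature of the x-t relationship.
for n_hidden in (2, 20):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(x, t)
    print(f"{n_hidden:>2} hidden neurons: R^2 on training data = {net.score(x, t):.3f}")
```

The larger network typically achieves a visibly better fit on this curved relationship, which is exactly the flexibility (and, if overdone, the risk of overfitting) that comes with adding neurons.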

Parametric models have the advantage of being easy to use and producing easy-to-interpret outputs. On the other hand, they suffer from the disadvantage of limited flexibility. Consequently, their usefulness strictly depends on how well the assumed input-target relationship survives the test of reality. Unfortunately, many real-world problems do not lend themselves to a closed form, and the parametric representation may often prove too restrictive. No wonder, then, that statisticians and engineers often consider using nonparametric models, especially neural networks, as alternatives to parametric methods.

Neural Networks and Classification Tasks

Neural networks, like most statistical tools, can also be used to tackle classification problems. In contrast to regression problems, a neural network classifier assigns class membership to an input x. For example, if the target variable has three categories {A, B, C}, the network assigns each input to one of the three classes. The class membership information is carried in the target variable t. For that reason, in a classification analysis the target variable must always be categorical. A variable is categorical if (a) it can only assume discrete values and (b) those values cannot be numerically arranged. For example, a target variable with values {MALE, FEMALE} is a two-state categorical variable. However, a target variable with date values is not truly categorical, since dates can be arranged in numerical order.
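A brief sketch of this idea, using scikit-learn's MLPClassifier as a generic stand-in (not the SANN tool); the two continuous inputs and the rule generating the three-state target {A, B, C} are made up for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy data set: two continuous inputs and a three-state categorical target {A, B, C}.
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 2))
t = np.where(X[:, 0] + X[:, 1] > 1.0, "A",
             np.where(X[:, 0] - X[:, 1] > 0.5, "B", "C"))

# A small feedforward classifier; it assigns every input to exactly one class.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X, t)

new_points = np.array([[1.5, 0.2], [-1.0, -1.0]])
print(clf.predict(new_points))   # e.g. ['A' 'C']
print(clf.classes_)              # the categorical levels: ['A' 'B' 'C']
```

Note that the class labels are passed as unordered strings; the classifier treats them as pure categories, which matches the definition of a categorical variable above.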