SANN Overviews - Neural Network Complexity

The complexity of a neural network (this applies equally to two-layer perceptron (MLP) and radial basis function (RBF) networks) is measured by the number of neurons in its hidden layer. The more hidden neurons a network has, the greater its flexibility, and hence the more complex the relationships it can represent. A sufficiently flexible network can approximate virtually any functional relationship between the input and target variables. Thus, in order to model a data set, it is important to use a network that is flexible enough, that is, one with enough neurons in the hidden layer. The optimal number of hidden neurons depends on the problem domain but, as a general guideline, it can be related to the number of inputs.
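
To make this concrete, the short Python sketch below (using scikit-learn rather than SANN itself, purely for illustration) fits two-layer perceptrons with hidden layers of different sizes to a synthetic nonlinear data set. The data set, the "2 x inputs + 1" sizing heuristic, and all parameter choices are assumptions made for this example, not SANN settings.

# Illustrative sketch (not part of SANN): how the number of hidden neurons
# controls the flexibility of a two-layer perceptron, using scikit-learn.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic data: a nonlinear input-target relationship (assumed for illustration).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

n_inputs = X.shape[1]

# Try hidden layers of increasing size; the "2 * inputs + 1" rule of thumb is
# an assumption used here for illustration, not a SANN default.
for n_hidden in (1, 2 * n_inputs + 1, 20):
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,),
                       activation="tanh",
                       solver="lbfgs",
                       max_iter=5000,
                       random_state=0)
    net.fit(X, y)
    print(f"hidden neurons = {n_hidden:2d}, R^2 on training data = {net.score(X, y):.3f}")

Running the sketch typically shows that a single hidden neuron underfits the sine-shaped relationship, while the larger hidden layers fit it closely, illustrating how hidden-layer size governs the flexibility of the network.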