Backpropagation Algorithm
One of the most important developments in neural networks is the backpropagation algorithm. The error backpropagation algorithm is used for training multilayer perceptrons (multilayer feedforward neural networks) so that the network learns to capture the mapping implicit in a given set of input-output pattern pairs. Backpropagation is a generalization of the least mean squares (LMS) algorithm: it modifies the network weights to minimize the mean squared error between the desired and actual outputs of the network.
The approach is essentially gradient descent along the error surface to arrive at an optimum set of weights. The error is defined as the squared difference between the desired output and the actual output produced at the output layer when an input pattern from a given input-output pair is applied. Backpropagation uses supervised learning, in which the network is trained on data for which both the inputs and the desired outputs are known. Once trained, the network weights are frozen and can be used to compute output values for new input samples.
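In symbols, for a single training pattern the squared error and the resulting gradient-descent weight adjustment take the following standard form, where d_k is the desired output at output node k, y_k is the actual output, w_ij is a network weight, and η is the learning rate (these symbols are introduced here only for illustration):

```latex
E = \frac{1}{2} \sum_{k} \left( d_k - y_k \right)^2,
\qquad
\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}}
```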
The feedforward process involves presenting an input pattern to the input layer neurons, which pass the input values on to the first hidden layer. Each hidden layer node computes a weighted sum of its inputs, passes the sum through its activation function, and presents the result to the next layer; the output layer nodes repeat this computation to produce the network's outputs.
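As a rough sketch of this feedforward computation, the following Python/NumPy fragment implements one hidden layer with a sigmoid activation; the layer sizes, the sigmoid choice, and the names W1, W2, b1, b2 are assumptions made only for illustration:

```python
import numpy as np

def sigmoid(z):
    # Logistic activation applied at each hidden and output node
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 4, 2                 # illustrative layer sizes

# Small random initial weights and zero biases
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))
b2 = np.zeros(n_out)

def feedforward(x):
    # Hidden layer: weighted sum of the inputs, then the activation function
    h = sigmoid(W1 @ x + b1)
    # Output layer: weighted sum of the hidden activations, then the activation function
    y = sigmoid(W2 @ h + b2)
    return h, y

x = np.array([0.5, -0.2, 0.1])                  # one input pattern
h, y = feedforward(x)
print(y)                                        # network output for this pattern
```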
The backpropagation algorithm assumes a feedforward neural network architecture. In this architecture, nodes are partitioned into layers numbered 0 to L, where the layer number indicates the distance of a node from the input nodes. The lowermost layer is the input layer numbered as layer 0, and the topmost layer is the output layer numbered as layer L. Backpropagation addresses networks for which L ≥ 2, containing “hidden layers” numbered 1 to L-1.
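To make the training procedure concrete, here is a minimal sketch of one backpropagation update for a network with L = 2 (a single hidden layer), assuming sigmoid activations, the squared-error measure described above, and no bias terms; the layer sizes and learning rate are illustrative, not prescribed by the text:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, d, W1, W2, eta):
    """One backpropagation update for a single input-output pair.
    W1 and W2 (NumPy arrays) are updated in place."""
    # Forward pass: layer 0 (input) -> layer 1 (hidden) -> layer 2 (output)
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)

    # Backward pass: error terms for the output layer, then the hidden layer
    delta_out = (y - d) * y * (1.0 - y)
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)

    # Gradient-descent weight updates
    W2 -= eta * np.outer(delta_out, h)
    W1 -= eta * np.outer(delta_hid, x)

    return 0.5 * np.sum((d - y) ** 2)           # squared error before the update

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 3, 4, 2                 # illustrative layer sizes
W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))

x = np.array([0.5, -0.2, 0.1])                  # input pattern
d = np.array([1.0, 0.0])                        # desired output

for _ in range(1000):
    err = train_step(x, d, W1, W2, eta=0.5)
print(err)                                      # error decreases as training proceeds
```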
Application of Backpropagation
Backpropagation has been used in a wide variety of applications, some of which are listed below:
- Data compression
- Image compression
- Face recognition
- Optical character recognition
- Control problems
- Nonlinear simulation
- Fault detection problems
- Load forecasting problems
Advantages of Backpropagation
The following are the advantages of a backpropagation network:
(i) Computing time is reduced if the weights chosen at the start of training are small.
(ii) The backpropagation update rule is general: the same mathematical formulation can be applied to layered feedforward networks of any size and depth.
Disadvantages of Backpropagation
The following are the disadvantages of a backpropagation network:
(i) Training requires a large number of learning steps, and each step involves computationally intensive calculations.
(ii) Training may sometimes cause temporal instability in the system.
(iii) The network may get trapped in a local minimum of the error surface.