# Theory of the backpropagation neural network

```bibtex
@article{HechtNielsen1989TheoryOT,
  title   = {Theory of the backpropagation neural network},
  author  = {Robert Hecht-Nielsen},
  journal = {International 1989 Joint Conference on Neural Networks},
  year    = {1989},
  pages   = {593-605 vol.1}
}
```

The author presents a survey of the basic theory of the backpropagation neural network architecture covering architectural design, performance measurement, function approximation capability, and learning. The survey includes previously known material, as well as some new results, namely, a formulation of the backpropagation neural network architecture to make it a valid neural network (past formulations violated the locality of processing restriction) and a proof that the backpropagation mean…


#### 1,594 Citations

A more biologically plausible learning rule than backpropagation applied to a network model of cortical area 7a.

- Psychology, Medicine
- Cerebral cortex
- 1991

Two neural networks with an architecture similar to Zipser and Andersen's model are trained to perform the same task using a learning procedure more biologically plausible than backpropagation; the result corroborates the network's computational algorithm as a plausible model of how area 7a may perform coordinate transformations.

Feed Forward Neural Network Entities

- Computer Science
- IWANN
- 1997

Although the entities' concept is still developing, some preliminary results indicate superiority over the single FFNN model when applied to problems involving high-dimensional data (e.g. financial/meteorological data analysis).

Robust design of multilayer feedforward neural networks: an experimental approach

- Computer Science
- Eng. Appl. Artif. Intell.
- 2004

This article develops a systematic experimental strategy that emphasizes simultaneous optimization of BPN parameters under various noise conditions, and shows that fine-tuning the BPN output is effective in improving the signal-to-noise ratio.
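The signal-to-noise metric mentioned here comes from the Taguchi robust-design tradition on which such experimental strategies are typically built. A minimal sketch of the "smaller-the-better" SNR, -10·log10(mean(e²)) in dB, applied to two hypothetical output-error sets (the function name and the sample values are illustrative assumptions, not data from the article):

```python
import math

def snr_smaller_is_better(errors):
    # Taguchi "smaller-the-better" signal-to-noise ratio in dB:
    # eta = -10 * log10(mean(e^2)); a larger eta means a more robust output.
    return -10.0 * math.log10(sum(e * e for e in errors) / len(errors))

# Hypothetical output errors before and after fine-tuning the BPN output.
coarse = [0.30, 0.25, 0.35]
tuned = [0.05, 0.04, 0.06]
print(snr_smaller_is_better(coarse))  # lower SNR: larger errors
print(snr_smaller_is_better(tuned))   # higher SNR: smaller errors
```

Under this metric, improving the SNR is equivalent to reducing the mean squared deviation of the network output across noise conditions.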

Rates of Approximation in a Feedforward Network Depend on the Type of Computational Unit

- Mathematics
- 1998

The approximation capabilities of feedforward neural networks with a single hidden layer and various activation functions have been widely studied ([19], [8], [1], [2], [13]). Mhaskar and…

A neural network learning algorithm tailored for VLSI implementation

- Computer Science, Medicine
- IEEE Trans. Neural Networks
- 1994

This paper describes concepts that optimize an on-chip learning algorithm for implementation of VLSI neural networks with conventional technologies. The network considered comprises an analog…

Approximation theory of the MLP model in neural networks

- Computer Science
- 1999

In this survey we discuss various approximation-theoretic problems that arise in the multilayer feedforward perceptron (MLP) model in neural networks. The MLP model is one of the more popular and…

A Novel Design Method for Multilayer Feedforward Neural Networks

- Computer Science
- Neural Computation
- 1994

Several examples show that the proposed model and design method learn the training patterns rapidly compared to conventional multilayer feedforward neural networks with random weight initialization.

Neural subnet design by direct polynomial mapping

- Computer Science, Medicine
- IEEE Trans. Neural Networks
- 1992

A method for the analysis and synthesis of single-input, single-output neural subnetworks is described; it is shown that the mapped subnets avoid the local minima in which backpropagation-trained subnets become trapped, and that the mapping approach is much faster.

FEEDFORWARD NEURAL NETWORKS FOR THE IDENTIFICATION OF DYNAMIC PROCESSES

- Mathematics
- 1991

This paper presents an introduction to the use of neural network computational algorithms for the identification of dynamic systems. Simulated linear and non-linear systems and real plant…

Dynamic backpropagation algorithm for neural network controlled resonator-bank architecture

- Computer Science
- 1992

Simulation results show that the neural network controlled resonator-bank architecture is computationally feasible and can be used as a general building block in a wide range of identification and control problems.

#### References

Showing 1-10 of 56 references

Backpropagation: past and future

- Computer Science
- IEEE 1988 International Conference on Neural Networks
- 1988

The author proposes development of a general theory of intelligence in which backpropagation and comparisons to the brain play a central role, and points to a series of intermediate steps and applications leading up to the construction of such generalized systems.

Learning representations by back-propagating errors

- Computer Science
- Nature
- 1986

Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result of the weight adjustments, internal hidden units come to represent important features of the task domain.
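The weight-adjustment rule this abstract describes can be sketched in a few lines of pure Python: gradient descent on the squared difference between actual and desired outputs, with the error propagated back through a sigmoid hidden layer via the chain rule. The network size, learning rate, epoch count, and the XOR task are illustrative assumptions, not details from the paper.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: the classic task a single-layer net cannot solve (assumption for illustration).
DATA = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

H, LR = 3, 0.5  # hidden units and learning rate (illustrative choices)

# w_in[i][j]: input i -> hidden j; w_out[j]: hidden j -> output
w_in = [[random.uniform(-1, 1) for _ in range(H)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(H)]
w_out = [random.uniform(-1, 1) for _ in range(H)]
b_o = random.uniform(-1, 1)

def forward(x):
    h = [sigmoid(sum(x[i] * w_in[i][j] for i in range(2)) + b_h[j]) for j in range(H)]
    y = sigmoid(sum(h[j] * w_out[j] for j in range(H)) + b_o)
    return h, y

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in DATA) / len(DATA)

loss_before = mse()
for _ in range(5000):
    for x, t in DATA:
        h, y = forward(x)
        # Backward pass: chain rule through the sigmoid (s' = s * (1 - s)).
        d_y = (y - t) * y * (1 - y)
        d_h = [d_y * w_out[j] * h[j] * (1 - h[j]) for j in range(H)]
        for j in range(H):
            w_out[j] -= LR * d_y * h[j]
            b_h[j] -= LR * d_h[j]
            for i in range(2):
                w_in[i][j] -= LR * d_h[j] * x[i]
        b_o -= LR * d_y
loss_after = mse()
print(loss_before, loss_after)
```

After training, the mean squared error has dropped from its random-initialization value, and the hidden units encode the input combinations the task requires.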

Neocognitron: A hierarchical neural network capable of visual pattern recognition

- Computer Science
- Neural Networks
- 1988

The operation of tolerating positional error a little at a time at each stage, rather than all in one step, plays an important role in endowing the network with an ability to recognize even distorted patterns.

Dynamic Node Creation in Backpropagation Networks

- Computer Science
- 1989

A new method called Dynamic Node Creation (DNC) automatically grows BP networks until the target problem is solved; it yielded a solution for every problem tried.

Neurons with graded response have collective computational properties like those of two-state neurons.

- Computer Science, Mathematics
- Proceedings of the National Academy of Sciences of the United States of America
- 1984

A model for a large network of "neurons" with a graded response (or sigmoid input-output relation) is studied; its collective properties are in very close correspondence with those of the earlier stochastic model based on McCulloch-Pitts neurons.
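The graded-response dynamics studied in this paper can be simulated directly: with symmetric weights, the state settles into a stable fixed point rather than oscillating. A minimal sketch that Euler-integrates du_i/dt = -u_i + Σ_j W_ij·g(u_j) with a sigmoid g and a random symmetric weight matrix (the network size, step size, and random instance are illustrative assumptions):

```python
import math
import random

random.seed(1)
N = 8
# Symmetric weights with zero self-coupling, as the convergence argument
# requires; the random instance itself is an illustrative assumption.
W = [[0.0] * N for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        W[i][j] = W[j][i] = random.uniform(-1, 1)

def g(u):
    return 1.0 / (1.0 + math.exp(-u))  # graded (sigmoid) response

# Euler integration of du_i/dt = -u_i + sum_j W[i][j] * g(u_j)
u = [random.uniform(-0.1, 0.1) for _ in range(N)]
dt = 0.1
last_change = None
for _ in range(2000):
    v = [g(ui) for ui in u]
    du = [-u[i] + sum(W[i][j] * v[j] for j in range(N)) for i in range(N)]
    u = [u[i] + dt * du[i] for i in range(N)]
    last_change = max(abs(dt * d) for d in du)
print(last_change)  # step-to-step change after the state has settled
```

The final per-step change is tiny, illustrating the collective fixed-point behavior the abstract refers to.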

Neural Networks and Natural Intelligence

- Psychology
- 1988

From the Publisher:
Stephen Grossberg and his colleagues at Boston University's Center for Adaptive Systems are producing some of the most exciting research in the neural network approach to making…

There exists a neural network that does not make avoidable mistakes

- Mathematics, Computer Science
- IEEE 1988 International Conference on Neural Networks
- 1988

The authors show that a multiple-input, single-output, single-hidden-layer feedforward network with (known) hardwired connections from input to hidden layer, monotone squashing at the hidden layer…

A massively parallel architecture for a self-organizing neural pattern recognition machine

- Computer Science
- Comput. Vis. Graph. Image Process.
- 1987

A neural network architecture for the learning of recognition categories is derived which circumvents the noise, saturation, capacity, orthogonality, and linear predictability constraints that limit the codes which can be stably learned by alternative recognition models.

Neocognitron: A new algorithm for pattern recognition tolerant of deformations and shifts in position

- Computer Science
- Pattern Recognit.
- 1982

The neocognitron recognizes stimulus patterns correctly without being affected by shifts in position or even by considerable distortions in shape of the stimulus patterns.

Learning of word stress in a sub-optimal second order back-propagation neural network

- Computer Science
- IEEE 1988 International Conference on Neural Networks
- 1988

The authors show that a neural network provides an efficient and easy solution to a problem that cannot be easily solved with rules: the localization of primary word stress in text-to-speech synthesis of Italian.