- Brief History of Neural Networks
- Definition of a Neural Network
- Learning in a Neural Network
- The Pattern Associator
- The Hebb Rule
- The Delta Rule
- The Generalized Delta Rule

This module on Neural Networks was written by Ingrid Russell of the University of Hartford. It is printed with permission from the Collegiate Microcomputer Journal.

If you have any comments or suggestions, please send email to irussell@mail.hartford.edu.

The Hebb rule determines the change in the weight of the connection from u_i to u_j by Δw_ij = r * a_i * a_j, where r is the learning rate and a_i, a_j represent the activations of u_i and u_j respectively. Thus, if both u_i and u_j are activated, the weight of the connection from u_i to u_j should be increased.
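The update rule above can be sketched as a short function. This is a minimal illustration, not code from the module; the names `hebb_update`, `weights`, `a_in`, and `a_out` are my own, assuming a fully connected two-layer network.

```python
def hebb_update(weights, a_in, a_out, r=0.1):
    """One Hebbian learning step: w[i][j] += r * a_in[i] * a_out[j].

    weights : list of rows, one row per input unit u_i
    a_in    : activations of the input units
    a_out   : activations of the output units
    r       : learning rate
    Returns a new weight matrix; the input is left unmodified.
    """
    return [[w + r * ai * aj for w, aj in zip(row, a_out)]
            for row, ai in zip(weights, a_in)]
```

Note that a weight w[i][j] grows only when a_in[i] and a_out[j] are both nonzero, which is exactly the "both units activated" condition stated above.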

Examples can be given of input/output associations which can be learned by a two-layer Hebb rule pattern associator. In fact, it can be proved that if the set of input patterns used in training is mutually orthogonal, the associations can be learned by a two-layer pattern associator using Hebbian learning. However, if the input patterns are not mutually orthogonal, interference may occur and the network may be unable to learn the associations. This limitation of Hebbian learning can be overcome by using the delta rule.
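The orthogonality property can be demonstrated with a small sketch (the function names `train_hebb` and `recall`, and the example patterns, are my own, not from the module): a two-layer associator trained with the Hebb rule on mutually orthonormal inputs recalls each target exactly.

```python
def train_hebb(pairs, r=1.0):
    """Accumulate Hebbian updates w[i][j] += r * x[i] * t[j]
    over a list of (input, target) pattern pairs."""
    n_in, n_out = len(pairs[0][0]), len(pairs[0][1])
    w = [[0.0] * n_out for _ in range(n_in)]
    for x, t in pairs:
        for i in range(n_in):
            for j in range(n_out):
                w[i][j] += r * x[i] * t[j]
    return w

def recall(w, x):
    """Output of unit j is the weighted sum of the input activations."""
    return [sum(x[i] * w[i][j] for i in range(len(x)))
            for j in range(len(w[0]))]

# Two orthonormal input patterns paired with targets (made-up data).
pairs = [([1, 0], [1, -1]),
         ([0, 1], [-1, 1])]
w = train_hebb(pairs)
# Because the inputs are orthonormal, each association is recalled exactly.
```

With non-orthogonal inputs (e.g. [1, 0] and [1, 1]) the cross-terms no longer cancel and the recalled outputs are corrupted, which is the interference described above.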


Copyright 1996 by Ingrid Russell.