
Generalization and network design strategies [58]

Original Abstract

An interesting property of connectionist systems is their ability to learn from examples. Although most recent work in the field concentrates on reducing learning times, the most important feature of a learning machine is its generalization performance. It is usually accepted that good generalization performance on real-world problems cannot be achieved unless some a priori knowledge about the task is built into the system. Back-propagation networks provide a way of specifying such knowledge by imposing constraints both on the architecture of the network and on its weights. In general, such constraints can be considered as particular transformations of the parameter space.

Building a constrained network for image recognition appears to be a feasible task. We describe a small handwritten digit recognition problem and show that, even though the problem is linearly separable, single layer networks exhibit poor generalization performance. Multilayer constrained networks perform very well on this task when organized in a hierarchical structure with shift invariant feature detectors.

These results confirm the idea that minimizing the number of free parameters in the network enhances generalization.
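
The following is a minimal sketch, not code from the reference, of the parameter-counting argument behind the abstract: a shift-invariant feature detector (a shared-weight, convolution-style constraint) uses far fewer free parameters than an unconstrained single layer network on the same input. The sizes (16-pixel 1-D input, 5-weight kernel) are hypothetical choices for illustration only.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, kernel = 16, 12, 5           # input size, output size, shared kernel width

# Unconstrained single layer: every output unit has its own weight for every input.
W_full = rng.standard_normal((n_out, n_in))
params_full = W_full.size                  # 12 * 16 = 192 free parameters

# Constrained layer: one kernel of 5 weights is replicated at every position,
# so the number of free parameters is just the kernel size.
w_shared = rng.standard_normal(kernel)
params_shared = w_shared.size              # 5 free parameters

def conv1d_valid(x, w):
    """Shift-invariant feature detector: the same weights w are applied at every shift."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

x = rng.standard_normal(n_in)
features = conv1d_valid(x, w_shared)       # 12 outputs produced by only 5 weights

print(params_full, params_shared)          # 192 vs 5
print(features.shape)                      # (12,)

Stacking such constrained layers hierarchically, as the abstract describes, keeps the total number of free parameters small while still covering the whole input, which is the mechanism the paper credits for the improved generalization.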

