
Some SOM applications and properties

Topology preservation causes slightly altered input vectors to be mapped to the original BMU or to its immediate neighbors. This property is useful in various kinds of error-tolerant mappings and smooth function approximations, required, for example, in the transmission of signals [Luttrell, 1990, Bradburn, 1989, Kangas, 1995].
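As a small illustrative sketch (not part of the original text): on an ordered one-dimensional map, a slightly perturbed input falls on the original BMU or on an adjacent unit. The codebook values below are hypothetical.

```python
def bmu(codebook, x):
    """Return the index of the best-matching unit (smallest squared error)."""
    return min(range(len(codebook)), key=lambda i: (codebook[i] - x) ** 2)

# A topologically ordered 1-D codebook: neighboring units have similar weights.
codebook = [0.0, 0.25, 0.5, 0.75, 1.0]

original = bmu(codebook, 0.50)
perturbed = bmu(codebook, 0.58)   # a slightly altered input
assert abs(perturbed - original) <= 1  # BMU stays in the immediate neighborhood
```

On a well-ordered map a small input perturbation can change the BMU only to a nearby unit, which is what makes the mapping error tolerant.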

The topology-preserving mapping from the high-dimensional input space to the low-dimensional display provides an opportunity to illustrate the relationships between different input regions. It is also possible to use the SOM to identify the data components that most affect the ordering by viewing the component planes of the SOM weight vectors. The ability to visualize multidimensional data is exploited in many application areas, for example, data mining [Kaski, 1997] and process analysis [Kasslin et al., 1992].
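A component plane is simply the value of one weight-vector component displayed at each map position; smooth or abrupt variation over the grid hints at that component's influence on the ordering. A minimal sketch of extracting such a plane, assuming row-major unit order and hypothetical weights:

```python
def component_plane(weights, rows, cols, k):
    """Extract component k of every weight vector, arranged on the map grid."""
    return [[weights[r * cols + c][k] for c in range(cols)] for r in range(rows)]

# A 2x3 map with 2-dimensional weight vectors (row-major unit order).
weights = [[0.1, 9.0], [0.2, 8.0], [0.3, 7.0],
           [0.4, 6.0], [0.5, 5.0], [0.6, 4.0]]

plane0 = component_plane(weights, 2, 3, 0)
# plane0 == [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]
```

Each plane can then be rendered as a gray-level or color image, one per input dimension.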

The probability density of the input space is projected onto the SOM by the point density of the units, so that the SOM weight vectors are densest in the areas where the most accurate vector quantization is needed. In addition to aiding in some VQ applications, this offers a way to approximate the probability density function (PDF) of the input. The PDF approximation can be used directly in maximum likelihood classification of static vectors [Hämäläinen, 1995] or, combined with state transition probabilities, to classify pattern sequences in the HMM framework as explained in this work (Section 3.2.4). The SOM is, however, not intended for optimal classification, but the classification accuracy of the SOM codebook can be improved by LVQ methods [Kohonen, 1995], which are discussed for this work in Sections 2.2 and 3.3.
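One common way to turn a codebook into a PDF approximation is an equally weighted kernel mixture centered on the weight vectors. The sketch below (one-dimensional data, hypothetical codebooks and bandwidth, and Gaussian kernels chosen here for illustration) shows maximum likelihood classification of a static value between two class-specific codebooks:

```python
import math

def som_pdf(codebook, x, sigma=0.2):
    """Approximate p(x) as an equally weighted Gaussian mixture
    centered on the SOM weight vectors (1-D case for brevity)."""
    norm = 1.0 / (math.sqrt(2 * math.pi) * sigma)
    return sum(norm * math.exp(-0.5 * ((x - m) / sigma) ** 2)
               for m in codebook) / len(codebook)

# Two class-specific codebooks; classify x by the larger likelihood.
codebook_a = [0.0, 0.1, 0.2]
codebook_b = [0.8, 0.9, 1.0]
x = 0.15
label = 'A' if som_pdf(codebook_a, x) > som_pdf(codebook_b, x) else 'B'
```

In the HMM framework such per-state likelihoods would additionally be weighted by the state transition probabilities.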

The smoothness of the obtained mapping allows efficient exploitation of the available modeling capacity when the number of codebook vectors is much larger than the number of distinct input clusters. Insufficient or defective training data often causes serious training problems, for example, missing components in data vectors, over-learning, and the need for parameter smoothing. In SOM training, however, their effect is less severe, since the training neighborhood also smooths units that are seldom chosen as the BMU [Kohonen, 1995].
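The smoothing effect of the neighborhood can be seen in a single sequential training step: every unit moves toward the input, weighted by its map distance to the BMU, so even units that are never the BMU themselves are updated. A minimal sketch (1-D map, scalar weights, hypothetical learning rate and neighborhood width):

```python
import math

def som_update(codebook, x, bmu_idx, alpha=0.5, sigma=1.0):
    """One sequential SOM step: every unit moves toward input x, weighted
    by a Gaussian neighborhood function centered on the BMU index."""
    for i in range(len(codebook)):
        h = math.exp(-((i - bmu_idx) ** 2) / (2 * sigma ** 2))
        codebook[i] += alpha * h * (x - codebook[i])
    return codebook

codebook = [0.0, 0.0, 0.0, 0.0]
som_update(codebook, 1.0, bmu_idx=1)
# Not only unit 1 moved: its neighbors were also pulled toward the input,
# which is the smoothing that helps rarely-selected units.
assert all(w > 0.0 for w in codebook)
```

The update magnitude decays with map distance from the BMU, so the BMU itself still moves the most.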

If the application requires a set of best-matching SOM units, say the top five ranks, the ordering of the units allows very fast approximative search methods for large codebooks. Such an approximation can be sufficient if the exact identity of the BMUs is not crucial and the quantization errors only need to be of the correct order of magnitude to show how well the input matches the SOM. This is typical when the SOM is used for density function approximation purposes, as in this work. For example, the search can then begin from the neighborhood of good BMU candidates and gradually proceed in the direction of improved match values [Kohonen, 1996] (see Section 3.2.5 for a more exact description).
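The idea of proceeding from a candidate toward improved match values can be sketched as a local hill-climbing search on the map (a simplified 1-D illustration with hypothetical data, not the exact procedure of Section 3.2.5):

```python
def local_bmu_search(codebook, x, start):
    """Approximate BMU search: starting from a candidate unit, repeatedly
    move to the neighboring unit with a better match until no neighbor
    improves (1-D map for simplicity). Avoids a full codebook scan."""
    def err(i):
        return (codebook[i] - x) ** 2
    i = start
    while True:
        neighbors = [j for j in (i - 1, i + 1) if 0 <= j < len(codebook)]
        best = min(neighbors + [i], key=err)
        if best == i:
            return i
        i = best

codebook = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
# Start near a previous BMU candidate and slide toward improving matches.
found = local_bmu_search(codebook, 0.65, start=1)
```

On an ordered map the quantization error varies smoothly along the grid, which is what makes such a local search a reasonable approximation; in consecutive inputs the previous BMU is a natural starting candidate.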

Comprehensive views on different SOM applications are given in [Kohonen, 1995,Kohonen et al., 1996d].


Mikko Kurimo
11/7/1997