Generative models handle missing values in an easy and natural way: once a model has been learned, reconstructions of the missing values are obtained as a by-product. Generative models are not the only way to handle missing data [8,10,11], but this work covers only them. Unsupervised learning can be used for supervised learning by treating the outputs of the test data as missing values (Figure 1). This combines feature extraction and supervised learning.
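This idea can be sketched with a toy example. The code below is a minimal illustration, not the method of this paper: it uses simple PCA-style hard imputation (iterated low-rank SVD fits) as a stand-in for a linear generative model, and the function name `impute_low_rank` is ours. Inputs and outputs are stacked into one data matrix, the test outputs are marked missing, and the model's reconstruction of those entries serves as the prediction.

```python
import numpy as np

def impute_low_rank(X, mask, rank=2, n_iter=100):
    """Fill entries of X where mask is True by repeatedly fitting a
    rank-`rank` approximation (a simple stand-in for linear FA)."""
    X = X.copy()
    # Initialise missing entries with per-column means of the observed data.
    col_means = np.array([X[~mask[:, j], j].mean() for j in range(X.shape[1])])
    X[mask] = np.take(col_means, np.nonzero(mask)[1])
    for _ in range(n_iter):
        mu = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
        Xhat = mu + (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Keep observed values fixed; only the missing entries are updated.
        X[mask] = Xhat[mask]
    return X

# Supervised learning cast as missing-value reconstruction:
# outputs y of the last 20 "test" samples are treated as missing.
rng = np.random.default_rng(0)
x = rng.standard_normal((100, 2))
y = x[:, :1] - x[:, 1:]                 # a linear target, for illustration
D = np.hstack([x, y])
mask = np.zeros_like(D, dtype=bool)
mask[80:, 2] = True                     # hide the test outputs
D_filled = impute_low_rank(D, mask, rank=2, n_iter=200)
```

After the loop, `D_filled[80:, 2]` contains the model's predictions for the hidden outputs, obtained purely as reconstructions of missing values.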

The ability to reconstruct missing values measures both the quality of a model and its ability to generalise. Reconstructions are used here to demonstrate the properties of nonlinear factor analysis (NFA) [6] by comparing it with linear factor analysis (FA) and with the self-organising map (SOM) [5].

FA resembles principal component analysis (PCA) with explicitly modelled noise. It is a basic tool that works well when nonlinear effects are unimportant, and high data dimensionality poses no problem for it. The SOM captures nonlinearities and clusters, but has difficulties with data of high intrinsic dimensionality and with generalisation. NFA combines the strengths of both: it can handle both high dimensionality and nonlinear effects.
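The phrase "PCA with modelled noise" can be made concrete with a small fit. The sketch below is a generic textbook EM algorithm for linear FA (model x = Wz + mu + noise, with diagonal noise covariance), not the implementation used in this paper; the function name `fa_em` is ours. The key difference from PCA is the vector `psi`: each data dimension gets its own noise variance instead of an implicit isotropic one.

```python
import numpy as np

def fa_em(X, k, n_iter=200, seed=0):
    """Fit linear factor analysis by EM: x = W z + mu + e,
    e ~ N(0, diag(psi)). Returns loadings W (d x k), noise
    variances psi (d,), and mean mu (d,)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # sample covariance
    W = 0.1 * rng.standard_normal((d, k))
    psi = np.diag(S).copy()                       # init: all variance is noise
    for _ in range(n_iter):
        # E-step: posterior of factors, beta = W^T (W W^T + diag(psi))^-1
        C = W @ W.T + np.diag(psi)
        beta = W.T @ np.linalg.inv(C)
        Ezz = np.eye(k) - beta @ W + beta @ S @ beta.T
        # M-step: update loadings and per-dimension noise variances
        W = S @ beta.T @ np.linalg.inv(Ezz)
        psi = np.diag(S - W @ beta @ S).copy()
    return W, psi, mu
```

At convergence the model covariance W W^T + diag(psi) approximates the sample covariance, with `psi` absorbing dimension-specific noise that PCA would instead mix into its components.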

The three methods are briefly described in the next two sections. The fourth section presents the experiments and their results, which are discussed in section five.