Above, the noise was assumed to have a small variance in order to justify
certain approximations. The result was therefore not strictly an algorithm
for noisy ICA, since the approximations degrade as the noise variance
grows. Below, we consider a speedup modification for Bayesian noisy ICA.
The Bayesian approach adopted here offers several important advantages:
- noise can have a finite variance
- source densities need not be fixed a priori; they can be estimated
- the number of sources can be estimated
- model comparison is possible
Specifically, the source distributions are modeled as mixtures of
Gaussians, and the posterior is approximated using ensemble learning.
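As a brief illustration (the notation here is ours, not taken from the
original derivation), each source density can be written as a mixture of
Gaussians,

\begin{displaymath}
p(s) = \sum_i \pi_i \, \mathcal{N}(s;\, \mu_i, \sigma_i^2),
\qquad \sum_i \pi_i = 1,
\end{displaymath}

and ensemble learning fits an approximating distribution $q(\theta)$ over
all unknowns $\theta$ by minimizing the cost

\begin{displaymath}
C(q) = \int q(\theta) \ln \frac{q(\theta)}{p(X, \theta)} \, d\theta
     = D\bigl(q(\theta) \,\|\, p(\theta \mid X)\bigr) - \ln p(X),
\end{displaymath}

which reaches its minimum, $-\ln p(X)$, exactly when $q(\theta)$ equals
the true posterior $p(\theta \mid X)$.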
The treatment of the source distribution is similar to [1],
which uses a factorial approximation of the posterior source
distributions in connection with the EM algorithm. The modification
would be directly applicable to the algorithm in [1], but
we will consider an algorithm where the full posterior distribution is
estimated for all parameters, i.e., point estimates are not used at
all.
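To make this concrete, the following is a minimal numerical sketch, not
the paper's algorithm: the variable names are hypothetical, a unit
Gaussian source prior stands in for the mixture of Gaussians, the noise
variance is treated as known, and only the Monte Carlo evaluation of the
ensemble learning cost is shown, without the updates that would minimize
it. Its point is that every unknown, both the mixing matrix and the
sources, carries a full (here factorial Gaussian) posterior rather than
a point estimate.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 sources, 3 observations, 100 samples, known noise std.
n_src, n_obs, T, noise_std = 2, 3, 100, 0.1
A_true = rng.normal(size=(n_obs, n_src))
S_true = rng.normal(size=(n_src, T))
X = A_true @ S_true + noise_std * rng.normal(size=(n_obs, T))

# Factorial Gaussian posterior q: one mean and one variance per
# unknown, i.e. no point estimates anywhere.
mA, vA = rng.normal(size=A_true.shape), np.full(A_true.shape, 0.1)
mS, vS = rng.normal(size=S_true.shape), np.full(S_true.shape, 0.1)

def gaussian_logpdf(x, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def ensemble_cost(n_samples=200):
    """Monte Carlo estimate of C(q) = E_q[ln q - ln p(X, A, S)]."""
    total = 0.0
    for _ in range(n_samples):
        # Sample all unknowns from the factorial posterior q.
        A = mA + np.sqrt(vA) * rng.normal(size=mA.shape)
        S = mS + np.sqrt(vS) * rng.normal(size=mS.shape)
        log_q = (gaussian_logpdf(A, mA, vA).sum()
                 + gaussian_logpdf(S, mS, vS).sum())
        # Unit Gaussian priors stand in for the paper's
        # mixture-of-Gaussians source model.
        log_prior = (gaussian_logpdf(A, 0.0, 1.0).sum()
                     + gaussian_logpdf(S, 0.0, 1.0).sum())
        log_lik = gaussian_logpdf(X, A @ S, noise_std ** 2).sum()
        total += log_q - log_prior - log_lik
    return total / n_samples

print("C(q) ~= %.1f" % ensemble_cost())

In the actual algorithm, the means and variances defining q would be
updated to decrease this cost; the sketch only shows how the cost is
formed once point estimates are abandoned.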