Application to Bayesian Noisy ICA

Above, the noise was assumed to have a small variance in order to justify certain approximations. The result was therefore not strictly an algorithm for noisy ICA, since the approximations degrade as the noise variance grows. Below, we consider a speedup modification for Bayesian noisy ICA. The Bayesian approach adopted here offers several important advantages.

Specifically, the source distributions are modeled as mixtures of Gaussians and the posterior is approximated using ensemble learning. The treatment of the source distribution is similar to [1], which uses a factorial approximation of the posterior source distributions in connection with the EM algorithm. The modification would be directly applicable to the algorithm in [1], but we consider an algorithm in which the full posterior distribution is estimated for all parameters, i.e., point estimates are not used at all.
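As a minimal illustration of the mixture-of-Gaussians source model mentioned above, the following sketch evaluates the log-density of a scalar source value under such a mixture, using the log-sum-exp trick for numerical stability. The function name and parameterization are illustrative assumptions, not the paper's notation:

```python
import math

def mog_logpdf(s, weights, means, variances):
    """Log-density of a scalar source value s under a mixture of Gaussians.

    weights, means, variances are per-component parameters; the names are
    illustrative, since the text does not fix a notation.
    """
    # Per-component log w_k + log N(s; m_k, v_k)
    log_terms = [
        math.log(w) - 0.5 * (math.log(2 * math.pi * v) + (s - m) ** 2 / v)
        for w, m, v in zip(weights, means, variances)
    ]
    # log-sum-exp over components for numerical stability
    top = max(log_terms)
    return top + math.log(sum(math.exp(t - top) for t in log_terms))

# Example: a symmetric two-component mixture, a common super-Gaussian-like
# source prior in ICA models.
logp = mog_logpdf(0.0, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```

In a full ensemble-learning treatment, the component weights, means, and variances would themselves carry posterior distributions rather than the fixed values used here.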


Harri Lappalainen