
Update Rule

The posterior distribution q(s) of a latent Gaussian node can be updated as follows.

1.
First, the gradients of $C_p$ with respect to $\left< s \right>$, $\mathrm{Var}\left\{s\right\}$ and $\left< \exp s \right>$ are computed.
2.
Second, the terms in $C_p$ that depend on $\overline{s}$ and $\widetilde{s}$ are assumed to be of the form $a \overline{s} + b [(\overline{s}-
\overline{s}_{\text{current}})^2 + \widetilde{s}] + c\left< \exp s \right> + d$, where $a=\partial C_p / \partial \overline{s}$, $b=\partial C_p / \partial
\widetilde{s}$ and $c=\partial C_p / \partial \left< \exp s \right>$. This is shown to be true in Section [*].
3.
Third, the minimum of $C_s = C_{s,p} + C_{s,q}$ is solved.
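The local form assumed in step 2 can be sketched numerically. In this sketch the current posterior statistics and the gradient values `a`, `b`, `c`, `d` are hypothetical placeholders, not values from the text:

```python
import math

# Hypothetical current posterior statistics and gradient values of C_p
# (illustrative placeholders only).
s_mean_cur, s_var_cur = 0.3, 0.5
a = 1.2   # assumed value of dC_p / d<s>
b = 0.8   # assumed value of dC_p / dVar{s}
c = 0.1   # assumed value of dC_p / d<exp s>
d = 0.0   # constant term

def exp_s(s_mean, s_var):
    # For a Gaussian q(s): <exp s> = exp(mean + variance / 2).
    return math.exp(s_mean + s_var / 2.0)

def cp_local(s_mean, s_var):
    # Assumed form of the C_p terms that depend on q(s):
    #   a*mean + b*[(mean - mean_current)^2 + var] + c*<exp s> + d
    return (a * s_mean
            + b * ((s_mean - s_mean_cur) ** 2 + s_var)
            + c * exp_s(s_mean, s_var)
            + d)
```

At the current point the quadratic penalty vanishes, so the surrogate reduces to $a\overline{s} + b\widetilde{s} + c\left<\exp s\right> + d$ there; the quadratic term only penalises moving the mean away from its current value.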

The cost function is written explicitly as a function of $\overline{s}$ and $\widetilde{s}$:

\begin{displaymath}C_s(\overline{s},\widetilde{s})=a_1\overline{s}^2+a_2\overline{s}+a_3\widetilde{s}+a_4\ln\widetilde{s}+c\exp{\left(\overline{s}+\widetilde{s}/2\right)}+a_5 ,
\end{displaymath} (4.7)

where a1 > 0 and a4 < 0. First, $\widetilde{s}$ is kept constant and the optimal $\overline{s}$ is solved using Newton's iteration. Then the optimal $\widetilde{s}$ is solved using a stabilised fixed-point iteration while keeping $\overline{s}$ constant.

In the special case c=0, the minimum of $C_s(\overline{s},\widetilde{s})$

$\displaystyle \overline{s}_\text{opt}$ = $\displaystyle -\frac{a_2}{2a_1}$ (4.8)
$\displaystyle \widetilde{s}_\text{opt}$ = $\displaystyle -\frac{a_4}{a_3}$ (4.9)

can be found analytically; note that $\widetilde{s}_\text{opt}>0$ since $a_4<0$. In this case, q(s) is optimal over all possible distributions, i.e. it coincides with the free-form approximation.
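A minimal numerical sketch of this two-stage minimisation, assuming illustrative placeholder values for the coefficients (satisfying the stated conditions a1 > 0 and a4 < 0, but not taken from the text):

```python
import math

# Illustrative coefficients satisfying a1 > 0 and a4 < 0 (placeholders).
a1, a2, a3, a4, a5 = 1.0, -0.6, 0.9, -0.5, 0.0
c = 0.2

def C_s(m, v):
    # C_s(mean, var) = a1 m^2 + a2 m + a3 v + a4 ln v + c exp(m + v/2) + a5
    return (a1 * m * m + a2 * m + a3 * v + a4 * math.log(v)
            + c * math.exp(m + v / 2.0) + a5)

def newton_mean(m, v, iters=50):
    # Newton's iteration for the mean, with the variance kept constant.
    for _ in range(iters):
        e = c * math.exp(m + v / 2.0)
        grad = 2.0 * a1 * m + a2 + e
        hess = 2.0 * a1 + e            # positive, since a1 > 0 and e >= 0
        m -= grad / hess
    return m

def fixed_point_var(m, v, iters=200, damping=0.5):
    # Setting dC_s/dv = a3 + a4/v + (c/2) exp(m + v/2) = 0 and solving for
    # the v in the a4/v term gives a fixed-point update; the damped
    # (stabilised) step averages it with the previous value.
    for _ in range(iters):
        v_star = -a4 / (a3 + 0.5 * c * math.exp(m + v / 2.0))
        v = (1.0 - damping) * v + damping * v_star
    return v

# Alternate the two partial updates.
m, v = 0.0, 1.0
for _ in range(30):
    m = newton_mean(m, v)
    v = fixed_point_var(m, v)

# Special case c = 0: the same iterations reproduce the analytic minimum
# mean = -a2/(2 a1) and variance = -a4/a3.
c = 0.0
m0, v0 = 0.0, 1.0
for _ in range(30):
    m0 = newton_mean(m0, v0)
    v0 = fixed_point_var(m0, v0)
```

With c = 0 the Newton step is exact after one iteration (the cost is quadratic in the mean) and the damped fixed-point iteration converges geometrically to the closed-form variance.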


Tapani Raiko
2001-12-10