The Gaussian node is a variable node and the basic element in building hierarchical models. Figure 2 (leftmost subfigure) shows the schematic diagram of the Gaussian node. Its output is the value of a Gaussian random variable $s$, which is conditioned on the inputs $m$ and $v$. Denote generally by $N(x; m_x, v_x)$ the probability density function of a Gaussian random variable $x$ having the mean $m_x$ and variance $v_x$. Then the conditional probability function (cpf) of the variable $s$ is $p(s \mid m, v) = N(s; m, \exp(-v))$. As a generative model, the Gaussian node takes its mean input $m$ and adds to it Gaussian noise (or innovation) with variance $\exp(-v)$.
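To illustrate the generative role of the node, the following minimal NumPy sketch draws a sample from the cpf above. The function name and signature are illustrative only and are not part of the framework interface; the only assumption carried over from the text is the parameterization of the variance as $\exp(-v)$.

```python
import numpy as np

def sample_gaussian_node(m, v, rng=None):
    """Generative operation of a Gaussian node: draw s ~ N(m, exp(-v)),
    where m is the mean input and v is the variance input."""
    rng = np.random.default_rng() if rng is None else rng
    variance = np.exp(-v)          # variance input enters through exp(-v)
    return m + np.sqrt(variance) * rng.standard_normal(np.shape(m))

# Example: mean input 0.0 and variance input 2.0 give noise variance exp(-2) (about 0.14)
s = sample_gaussian_node(0.0, 2.0)
```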
Variables can be latent or observed. Observing a variable means fixing its output $s$ to the value found in the data. Section 4 is devoted to inferring the distribution over the latent variables given the observed variables. Inferring the distribution over variables that are independent of the time index $t$ is also called learning.