
Reachable states

An important point for computational efficiency in the parameter learning algorithm is the pruning of unreachable states. Before any probabilistic parameters are even considered, the algorithm finds all the reachable hidden states at each time step, given the whole observation sequence. The algorithm works as follows.

The only reachable state at time 0 is the start state. Then, for each time $t$ from 0 to $T-1$, all the transitions from $S_t$ that agree with the current observation $\mathtt{x}_t$ are used to produce the set of reachable states $S_{t+1}$. At this stage, the states are reachable given the observation sequence so far. Finally, for each time $t$ back from $T-1$ to 0, those states in $S_t$ that have no transition leading into $S_{t+1}$ are removed. This takes the whole observation sequence into account. A sketch of the two passes is given below.
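The following is a minimal Python sketch of the two passes, not the thesis implementation. It assumes a hypothetical function transitions(state) that yields (observation, next state) pairs for a given hidden state; in an actual LOHMM this matching would involve unification of logical atoms, which is abstracted away here.

def reachable_states(start_state, observations, transitions):
    """Compute the sets S_0, ..., S_T of hidden states reachable at each
    time step, given the whole observation sequence.

    `transitions(state)` is an assumed interface yielding
    (observation, next_state) pairs for `state`.
    """
    T = len(observations)

    # Forward pass: S_0 contains only the start state; S_{t+1} collects
    # the successors of states in S_t whose emission agrees with x_t.
    S = [set() for _ in range(T + 1)]
    S[0].add(start_state)
    for t in range(T):
        for state in S[t]:
            for obs, next_state in transitions(state):
                if obs == observations[t]:
                    S[t + 1].add(next_state)

    # Backward pass: remove from S_t every state that has no transition,
    # consistent with x_t, leading into S_{t+1}.
    for t in range(T - 1, -1, -1):
        S[t] = {state for state in S[t]
                if any(obs == observations[t] and next_state in S[t + 1]
                       for obs, next_state in transitions(state))}
    return S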

Note that this procedure resembles the forward-backward algorithm for HMMs. As a by-product, it can be used to check whether an observation sequence could have been generated by the LOHMM at all: if it could not, the sets of reachable states end up empty.
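As a usage illustration of the sketch above, with a hypothetical two-state toy model, the emptiness check amounts to testing whether all the returned sets are non-empty:

# Hypothetical toy model: states 's0' and 's1' emitting letters 'a'/'b'.
def transitions(state):
    table = {'s0': [('a', 's1')],
             's1': [('b', 's0'), ('a', 's1')]}
    return table.get(state, [])

S = reachable_states('s0', ['a', 'b', 'a'], transitions)
print([sorted(s) for s in S])  # reachable states at each time step
print(all(S))                  # True iff the sequence could have been generated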

