
Rebooting

Some local minima can be avoided by occasionally resetting the neuron activities; this is called ``rebooting''. The time-independent parameters and weights are left as they are, but the sources s and u are set to their temporal means:

$\displaystyle \forall i,k,t: \overline{s}_{i,k,\text{new}}(t) = \frac{1}{T}\sum_{r=1}^{T}\overline{s}_{i,k}(r)$     (6.6)
$\displaystyle \forall i,k,t: \widetilde{s}_{i,k,\text{new}}(t) = \frac{1}{T}\sum_{r=1}^{T}\widetilde{s}_{i,k}(r)$     (6.7)

and similarly for u.
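The reset of Equations (6.6) and (6.7) can be sketched as follows. This is a minimal illustration, assuming the posterior means and variances of the sources are stored as NumPy arrays of shape (T, n_sources); the function name `reboot` and the array layout are not from the original text.

```python
import numpy as np

def reboot(s_mean, s_var):
    """Reset source activities to their temporal means ("rebooting").

    s_mean, s_var: arrays of shape (T, n_sources) holding the mean and
    variance of each source at each time step. The time-independent
    parameters and weights are left untouched elsewhere; here each
    source's trajectory is replaced by its average over time.
    """
    T = s_mean.shape[0]
    s_mean_new = np.tile(s_mean.mean(axis=0), (T, 1))  # Eq. (6.6)
    s_var_new = np.tile(s_var.mean(axis=0), (T, 1))    # Eq. (6.7)
    return s_mean_new, s_var_new
```

The same operation would be applied to the sources u.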

If plenty of data from the same data set is available, rebooting is also a good opportunity to change the data. It can be better to iterate more with less data than the other way around, and switching the data samples from time to time still preserves some of the benefits of the larger data set. A network that performs well on new data has, by definition, good generalisation capability.



Tapani Raiko
2001-12-10