Learning Inference friendly Bayesian Networks - using incremental compilation

Student thesis: Master's thesis and HD graduation project

  • Martin Karlsen
  • Søren Pedersen
4th semester, Computer Science, Master's programme
This report describes a project aimed at exploring structural learning of Bayesian networks, specifically how the complexity of the generated network depends on the chosen learning method.
We examine the junction tree method for doing propagation in Bayesian networks, describing the steps in compiling the junction tree from the network structure. From this analysis we learn that one cause of complexity in the junction tree is the size of the cliques. A score function is proposed which scores a network directly on the combined size of the cliques together with the log-likelihood. It uses a parameter to weight complexity against likelihood, and it uses incremental compilation to avoid re-triangulating the entire junction tree for each candidate network.
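The abstract does not give the exact formula for this score. As a minimal sketch, assuming the score is simply the log-likelihood minus a weighted sum of the clique table sizes in the compiled junction tree (the function and parameter names here are hypothetical):

```python
def weighted_score(log_likelihood, clique_sizes, w):
    """Hypothetical score trading model fit against junction-tree size.

    log_likelihood: log-likelihood of the data under the candidate network.
    clique_sizes:   state-space (table) size of each clique in the junction
                    tree compiled from the candidate network.
    w:              weighting parameter; w = 0 reduces to pure likelihood
                    scoring, larger w penalises inference complexity more.
    """
    return log_likelihood - w * sum(clique_sizes)

# A denser candidate with slightly better fit can lose to a sparser one
# once clique size is penalised (numbers are illustrative only).
sparse = weighted_score(-1200.0, [8, 16, 8], w=1.0)    # total clique size 32
dense = weighted_score(-1190.0, [8, 64, 128], w=1.0)   # total clique size 200
```

Under this sketch, `sparse` scores higher than `dense` despite its slightly worse fit, which is the behaviour the report attributes to clique-size-aware scoring.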
This score function and regular BIC scoring (also augmented with a weighting parameter) were tested for precision and inference time. The analysis shows that there is a gain in using the size of the junction tree as a component in scoring learned networks: networks scored with this function were most often faster when used for inference, and in some cases the function even produced usable networks where those learned with BIC scoring proved too complex.
Language: English
Publication date: June 2008
ID: 61072885