Recently, two search algorithms, A* and breadth-first branch and
bound (BFBnB), were developed based on a simple admissible heuristic
for learning Bayesian network structures that optimize a scoring
function. The heuristic represents a relaxation of the learning
problem such that each variable chooses optimal parents
independently. As a result, the relaxed solution may contain many
directed cycles, so the resulting bound can be loose.
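
To make the relaxation concrete, here is a minimal sketch, assuming local scores are costs to be minimized (e.g., MDL) stored in a dictionary keyed by (variable, parent set); the function name, data layout, and toy numbers are illustrative assumptions, not taken from the paper.

    # Minimal sketch of the simple relaxation. Assumes `local_scores` maps
    # (variable, frozenset_of_parents) -> cost, with lower cost being better.
    def simple_heuristic(variables, local_scores):
        """Lower bound: each variable picks its cheapest parent set
        independently, ignoring acyclicity of the combined graph."""
        bound = 0.0
        for v in variables:
            # Cheapest scored parent set for v, with no acyclicity check.
            bound += min(score for (var, parents), score in local_scores.items()
                         if var == v)
        return bound

    # Toy scores (hypothetical numbers, chosen only for illustration).
    scores = {
        ("A", frozenset()): 10.0, ("A", frozenset({"B"})): 2.0,
        ("B", frozenset()): 9.0,  ("B", frozenset({"A"})): 2.0,
        ("C", frozenset()): 8.0,  ("C", frozenset({"A", "B"})): 4.0,
    }
    print(simple_heuristic({"A", "B", "C"}, scores))  # 2 + 2 + 4 = 8.0

In this toy example A and B each choose the other as a parent, so the combined choices contain both edges A -> B and B -> A, a directed cycle: the bound of 8.0 is valid but unachievable by any acyclic network, which is exactly how the bound becomes loose.
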
This paper introduces an improved admissible heuristic that tries to
avoid directed cycles within small groups of variables.
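
One way to realize this idea is sketched below: partition the variables into small groups and, within each group, require the parent choices to be jointly acyclic, here enforced by minimizing over all orderings of the group while edges from outside the group stay unrestricted. The partition and all names are hypothetical, and the paper's actual construction may differ; the sketch reuses `scores` and the cost-minimization convention from above.

    from itertools import permutations

    def group_heuristic(groups, local_scores):
        """Tighter lower bound: parent choices must be acyclic *within*
        each group; edges from outside a group are unconstrained."""
        bound = 0.0
        for group in groups:
            best = float("inf")
            for order in permutations(group):  # all topological orders of the group
                pos = {v: i for i, v in enumerate(order)}
                total = 0.0
                for v in group:
                    # v may only use group members that precede it in `order`;
                    # variables outside the group are always allowed as parents.
                    total += min(score
                                 for (var, parents), score in local_scores.items()
                                 if var == v
                                 and all(p not in pos or pos[p] < pos[v]
                                         for p in parents))
                best = min(best, total)
            bound += best
        return bound

    # Grouping A with B forbids the 2-cycle between them:
    print(group_heuristic([("A", "B"), ("C",)], scores))  # min(10+2, 2+9) + 4 = 15.0

Tightening the toy bound from 8.0 to 15.0 in this way costs k! orderings per group of size k, which is one reason to keep the groups small.
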
A sparse representation is also introduced to store only the unique
optimal parent choices.
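
The abstract does not spell out the representation, but a standard pruning rule consistent with "unique optimal parent choices" is to discard any parent set that has a proper subset scoring at least as well, since such a set can never be the optimal choice for any candidate parent pool; whether the paper uses exactly this rule is an assumption. A sketch, reusing the toy scores:

    def prune_parent_sets(local_scores):
        """Keep only parent sets that can be optimal for some candidate
        pool: drop any set dominated by a subset with equal or lower cost."""
        kept = {}
        for (var, parents), score in local_scores.items():
            dominated = any(
                v2 == var and p2 < parents and s2 <= score  # proper subset, no worse
                for (v2, p2), s2 in local_scores.items()
            )
            if not dominated:
                kept[(var, parents)] = score
        return kept

    # ("C", {"A"}) at cost 9.0 is dominated by ("C", {}) at cost 8.0, so it is dropped.
    pruned = prune_parent_sets({**scores, ("C", frozenset({"A"})): 9.0})
    print(("C", frozenset({"A"})) in pruned)  # False

Storing only the surviving entries (e.g., in a sorted list per variable) is what keeps the representation sparse.
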
Empirical results show that the new techniques significantly improve
the efficiency and scalability of A* and BFBnB on most of the
datasets tested in this paper.
[ Paper ]