Clusters and Coarse Partitions in LP Relaxations. Sontag, D.; Globerson, A.; and Jaakkola, T. In Koller, D.; Schuurmans, D.; Bengio, Y.; and Bottou, L., editors, Advances in Neural Information Processing Systems 21, pages 1537–1544. MIT Press, 2009.
@inproceedings{SonGloJaa_nips08,
 title  = {Clusters and Coarse Partitions in {LP} Relaxations},
 author = {David Sontag and Amir Globerson and Tommi Jaakkola},
 booktitle = {Advances in Neural Information Processing Systems 21},
 editor = {D. Koller and D. Schuurmans and Y. Bengio and L. Bottou},
 pages = {1537--1544},
 publisher = {MIT Press},
 year = {2009},
 keywords = {Machine learning, Approximate inference in graphical models},
 url_Paper = {http://people.csail.mit.edu/dsontag/papers/sontag_nips08.pdf},
 abstract = {We propose a new class of consistency constraints for Linear Programming (LP) relaxations for finding the most probable (MAP) configuration in graphical models. Usual cluster-based LP relaxations enforce joint consistency on the beliefs of a cluster of variables, with computational cost increasing exponentially with the size of the clusters. By partitioning the state space of a cluster and enforcing consistency only across partitions, we obtain a class of constraints which, although less tight, are computationally feasible for large clusters. We show how to solve the cluster selection and partitioning problem monotonically in the dual LP, using the current beliefs to guide these choices. We obtain a dual message passing algorithm and apply it to protein design problems where the variables have large state spaces and the usual cluster-based relaxations are very costly. The resulting method solves many of these problems exactly, and significantly faster than a method that does not use partitioning.}
}