Anyone who has never made a mistake has never tried anything new.

(Albert Einstein)
Inference in Graphical Models

The project titled "Inference in Graphical Models" subsumes approaches for predicting the most likely configuration (the MAP estimate) or the marginals of probability distributions. We detail the following methods:
- Distributed Message Passing for Large Scale Graphical Models
- Globally Convergent Dual MAP LP Relaxation Solvers using Fenchel-Young Margins

Distributed Message Passing for Large Scale Graphical Models

by: A.G. Schwing, T. Hazan, M. Pollefeys and R. Urtasun
In this paper we propose a distributed message-passing algorithm for inference in large-scale graphical models. Our method handles large problems efficiently by distributing and parallelizing both the computation and the memory requirements. The convergence and optimality guarantees of recently developed message-passing algorithms are preserved by introducing new types of consistency messages, sent between the distributed computers. We demonstrate the effectiveness of our approach on the task of stereo reconstruction from high-resolution imagery, and show that inference is possible with more than 200 labels on images larger than 10 MPixel.
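The idea of machines exchanging consistency information until their local solutions agree can be illustrated with a toy dual-decomposition sketch. This is a generic technique shown for a single shared variable, not the authors' distributed convex belief propagation algorithm; function names and the step size are illustrative:

```python
# Toy dual decomposition: two "machines" each hold part of the energy over one
# shared discrete variable and exchange dual (consistency) updates until their
# local minimizers agree. Illustrative sketch only, not the paper's algorithm.

def dual_decomposition(theta1, theta2, step=0.5, iters=100):
    K = len(theta1)
    lam = [0.0] * K  # dual variables coupling the two subproblems
    for _ in range(iters):
        # each machine solves its local subproblem independently
        x1 = min(range(K), key=lambda k: theta1[k] + lam[k])
        x2 = min(range(K), key=lambda k: theta2[k] - lam[k])
        if x1 == x2:          # local solutions consistent -> done
            return x1
        lam[x1] += step       # make x1 less attractive in subproblem 1
        lam[x2] -= step       # make x2 less attractive in subproblem 2
    return None               # no agreement within the iteration budget

# argmin of theta1 + theta2 = [1.0, 2.0, 4.0] is label 0
print(dual_decomposition([0.0, 2.0, 1.0], [1.0, 0.0, 3.0]))  # -> 0
```

Note that the subgradient updates play the role of the consistency messages described in the abstract: disagreement between machines raises the local cost of the disagreeing labels until the subproblems align.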
Preprint:
Supplementary:
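For reference, the MAP problem that such solvers address can be solved exactly on a chain-structured model by min-sum dynamic programming. The sketch below is the standard textbook method, not the distributed algorithm of the paper; the cost values are illustrative:

```python
# Min-sum (max-product in log-space) dynamic programming on a chain MRF.
# Standard textbook sketch to illustrate the MAP inference problem.
import numpy as np

def chain_map(unaries, pairwise):
    """MAP labeling of a chain MRF.

    unaries:  list of length-K arrays, unaries[i][k] = cost of label k at node i
    pairwise: (K, K) array, pairwise[k, l] = cost of labels (k, l) on an edge
    """
    n = len(unaries)
    msgs = [np.zeros_like(unaries[0]) for _ in range(n)]
    # forward pass: msgs[i][l] = best cost of the prefix, ending with label l at node i
    for i in range(1, n):
        prev = unaries[i - 1] + msgs[i - 1]
        msgs[i] = (prev[:, None] + pairwise).min(axis=0)
    # backtrack the optimal labeling
    labels = [int(np.argmin(unaries[-1] + msgs[-1]))]
    for i in range(n - 1, 0, -1):
        prev = unaries[i - 1] + msgs[i - 1]
        labels.append(int(np.argmin(prev + pairwise[:, labels[-1]])))
    return labels[::-1]

# Tiny example: 3 nodes, 2 labels, Potts-style smoothness cost
unaries = [np.array([0.0, 2.0]), np.array([1.0, 1.0]), np.array([2.0, 0.0])]
pairwise = np.array([[0.0, 1.5], [1.5, 0.0]])
print(chain_map(unaries, pairwise))  # -> [0, 0, 1]
```

On loopy graphs, as in stereo reconstruction, exact dynamic programming no longer applies, which is where (convex) message-passing approximations come in.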
Notes for the fully general C++ implementation (higher-order potentials, arbitrary graphs):
- Dependency: an MPI implementation, e.g., Open MPI, MPICH2, ...
- A tutorial for installation on Amazon EC2 is available.
- A Matlab example is included.
- Support for the UAI file format.
- Tested on Windows (Win32, x64), Linux (x64) and Mac OS X (x64), as well as on Amazon EC2 and the ETH Brutus cluster.
- See the included README for further details regarding compilation.
- The downloadable distributed convex belief propagation algorithm has a distinct server task which loads the entire graphical model into its memory. This becomes infeasible for very large models, where "large" is determined by the available server memory. We have other implementations available for such challenges; please don't hesitate to contact us.
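Regarding the UAI file format mentioned above, a minimal example may be helpful. The sketch below encodes a Markov network with two binary variables, two unary factors and one pairwise factor; the preamble lists the network type, variable count, cardinalities, factor count and factor scopes, followed by each factor's table size and entries. The numerical values are purely illustrative:

```
MARKOV
2
2 2
3
1 0
1 1
2 0 1

2
 0.436 0.564

2
 0.128 0.872

4
 0.210 0.333 0.457 0.811
```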
License:
Distributed convex belief propagation is licensed under the GPL v3 (or higher). Upon request, other licensing options are available, e.g., if you want to use distributed convex belief propagation in a closed-source product.
Citations:
If you write a paper describing research that made use of this library, please cite the following paper describing distributed convex belief propagation:
Alexander G. Schwing, Tamir Hazan, Marc Pollefeys and Raquel Urtasun; Distributed Message Passing for Large Scale Graphical Models; In Proc. CVPR 2011.
@inproceedings{SchwingCVPR2011a,
author = {A.~G. Schwing and T. Hazan and M. Pollefeys and R. Urtasun},
title = {{Distributed Message Passing for Large Scale Graphical Models}},
booktitle = {Proc. CVPR},
year = {2011},
}
Downloads:

Usage (by others than myself):
Globally Convergent Dual MAP LP Relaxation Solvers using Fenchel-Young Margins

by: A.G. Schwing, T. Hazan, M. Pollefeys and R. Urtasun
While finding the exact solution of the MAP inference problem is intractable for many real-world tasks, MAP LP relaxations have been shown to be very effective in practice. However, the most efficient methods, which perform block coordinate descent, can get stuck in sub-optimal points as they are not globally convergent. In this work we propose to augment these algorithms with an epsilon-descent approach and present a method to efficiently optimize for a descent direction in the sub-differential using a margin-based formulation of the Fenchel-Young duality theorem. Furthermore, the presented approach provides a methodology to construct a primal optimal solution from its dual optimal counterpart. We demonstrate the efficiency of the presented approach on spin glass models and protein interaction problems and show that our approach outperforms state-of-the-art solvers.
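For background, the following is the standard convex-analysis statement behind the "Fenchel-Young margin" terminology (a textbook fact, not taken verbatim from the paper). For a convex function $f$ with conjugate $f^*(y) = \sup_x \, \langle x, y \rangle - f(x)$:

```latex
% Fenchel-Young inequality, with equality characterizing subgradients:
f(x) + f^*(y) \;\ge\; \langle x, y \rangle,
\qquad \text{with equality} \iff y \in \partial f(x).

% Allowing a margin of \varepsilon characterizes approximate subgradients:
y \in \partial_\varepsilon f(x)
\;\iff\;
f(x) + f^*(y) \;\le\; \langle x, y \rangle + \varepsilon.
```

Presumably it is this $\varepsilon$-subdifferential characterization that makes the epsilon-descent directions mentioned in the abstract efficiently computable.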
Preprint: