Variational Bayesian Methods For Multimedia Problems. Chen, Z., Babacan, S. D., Molina, R., & Katsaggelos, A. K. IEEE Transactions on Multimedia, 16(4):1000–1017, June 2014.
In this paper we present an introduction to Variational Bayesian (VB) methods in the context of probabilistic graphical models, and discuss their application to multimedia-related problems. VB is a family of deterministic probability distribution approximation procedures that offer distinct advantages over alternative approaches based on stochastic sampling and those providing only point estimates. VB inference is flexible enough to be applied to a wide range of practical problems, yet is broad enough to subsume as special cases several alternative inference approaches, including Maximum A Posteriori (MAP) estimation and the Expectation-Maximization (EM) algorithm. In this paper we also show the connections between VB and other posterior approximation methods such as the marginalization-based Loopy Belief Propagation (LBP) and Expectation Propagation (EP) algorithms. Specifically, both VB and EP are variational methods that minimize functionals based on the Kullback-Leibler (KL) divergence. LBP, traditionally developed using graphical models, can also be viewed as a VB inference procedure. We present several multimedia-related applications illustrating the use and effectiveness of the VB algorithms discussed herein. We hope that by reading this tutorial readers will obtain a general understanding of Bayesian methods and establish connections among popular algorithms used in practice.
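
For orientation, a minimal sketch of the variational objective the abstract refers to, written in standard mean-field VB notation (the paper's own notation may differ). VB seeks the member of a tractable family Q closest in KL divergence to the intractable posterior:

q^{\star}(\theta) \;=\; \arg\min_{q \in \mathcal{Q}} \, \mathrm{KL}\!\left(q(\theta) \,\|\, p(\theta \mid x)\right)
\;=\; \arg\max_{q \in \mathcal{Q}} \Big[\, \mathbb{E}_{q}[\log p(x,\theta)] - \mathbb{E}_{q}[\log q(\theta)] \,\Big],

since \log p(x) = \mathrm{KL}\!\left(q \,\|\, p(\theta \mid x)\right) + \mathbb{E}_{q}[\log p(x,\theta)] - \mathbb{E}_{q}[\log q(\theta)], so minimizing the KL term is equivalent to maximizing the evidence lower bound. EP, by contrast, is commonly described as locally minimizing the reverse divergence \mathrm{KL}\!\left(p \,\|\, q\right); this is one common way to state the VB/EP distinction the abstract alludes to.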
@article{chen2014variational,
abstract = {In this paper we present an introduction to Variational Bayesian (VB) methods in the context of probabilistic graphical models, and discuss their application in multimedia related problems. VB is a family of deterministic probability distribution approximation procedures that offer distinct advantages over alternative approaches based on stochastic sampling and those providing only point estimates. VB inference is flexible to be applied in different practical problems, yet is broad enough to subsume as its special cases several alternative inference approaches including Maximum A Posteriori (MAP) and the Expectation-Maximization (EM) algorithm. In this paper we also show the connections between VB and other posterior approximation methods such as the marginalization-based Loopy Belief Propagation (LBP) and the Expectation Propagation (EP) algorithms. Specifically, both VB and EP are variational methods that minimize functionals based on the Kullback-Leibler (KL) divergence. LBP, traditionally developed using graphical models, can also be viewed as a VB inference procedure. We present several multimedia related applications illustrating the use and effectiveness of the VB algorithms discussed herein. We hope that by reading this tutorial the readers will obtain a general understanding of Bayesian methods and establish connections among popular algorithms used in practice. {\textcopyright} 1999-2012 IEEE.},
author = {Chen, Zhaofu and Babacan, S. Derin and Molina, Rafael and Katsaggelos, Aggelos K.},
doi = {10.1109/TMM.2014.2307692},
issn = {1520-9210},
journal = {IEEE Transactions on Multimedia},
keywords = {Bayes methods,graphical models,inverse problems,multimedia signal processing,variational Bayes},
month = {jun},
number = {4},
pages = {1000--1017},
publisher = {IEEE},
title = {{Variational Bayesian Methods For Multimedia Problems}},
url = {http://ieeexplore.ieee.org/document/6747301/},
volume = {16},
year = {2014}
}
