Estimation of a Low-Rank Probability-Tensor from Sample Sub-Tensors via Joint Factorization Minimizing the Kullback-Leibler Divergence. Yeredor, A. & Haardt, M. In 2019 27th European Signal Processing Conference (EUSIPCO), pages 1-5, Sep., 2019.
Abstract: Recently there has been a growing interest in the estimation of the Probability Mass Function (PMF) of discrete random vectors (RVs) from partial observations thereof (namely when observed realizations of the RV are limited to random subsets of its elements). It was shown that under a low-rank assumption on the PMF tensor (and some additional mild conditions), the full tensor can be recovered, e.g., by applying an approximate coupled factorization to empirical estimates of all joint PMFs of subgroups of fixed cardinality larger than two (e.g., triplets). The coupled factorization is based on a Least Squares (LS) fit to the empirically estimated lower-order sub-tensors. In this work we take a different approach by trying to fit the coupled factorization to estimated sub-tensors in the sense of minimizing the Kullback-Leibler divergence (KLD) between the estimated and inferred tensors. We explain why the KLD-based fitting is better-suited than LS-based fitting for the problem of PMF estimation, propose an associated minimization approach and demonstrate some advantages over LS-based fitting in this context using simulation results.
@InProceedings{8903161,
author = {A. Yeredor and M. Haardt},
booktitle = {2019 27th European Signal Processing Conference (EUSIPCO)},
title = {Estimation of a Low-Rank Probability-Tensor from Sample Sub-Tensors via Joint Factorization Minimizing the Kullback-Leibler Divergence},
year = {2019},
pages = {1-5},
abstract = {Recently there has been a growing interest in the estimation of the Probability Mass Function (PMF) of discrete random vectors (RVs) from partial observations thereof (namely when observed realizations of the RV are limited to random subsets of its elements). It was shown that under a low-rank assumption on the PMF tensor (and some additional mild conditions), the full tensor can be recovered, e.g., by applying an approximate coupled factorization to empirical estimates of all joint PMFs of subgroups of fixed cardinality larger than two (e.g., triplets). The coupled factorization is based on a Least Squares (LS) fit to the empirically estimated lower-order sub-tensors. In this work we take a different approach by trying to fit the coupled factorization to estimated sub-tensors in the sense of minimizing the Kullback-Leibler divergence (KLD) between the estimated and inferred tensors. We explain why the KLD-based fitting is better-suited than LS-based fitting for the problem of PMF estimation, propose an associated minimization approach and demonstrate some advantages over LS-based fitting in this context using simulation results.},
keywords = {least squares approximations;minimisation;probability;tensors;vectors;PMF tensor;lower-order sub-tensors;Kullback-Leibler divergence;PMF estimation;minimization approach;probability mass function;discrete random vectors;low-rank assumption;KLD-based fitting;LS-based fitting;low-rank probability-tensor;least squares fit;Tensors;Minimization;Signal processing;Europe;Maximum likelihood estimation;Loading;Low-Rank Tensor Factorization;Probability Mass Function (PMF);Approximate Coupled Factorization;Kullback-Leibler Divergence (KLD);Nonnegative Tensor Factorization;Canonical Polyadic Decomposition (CPD)},
doi = {10.23919/EUSIPCO.2019.8903161},
issn = {2076-1465},
month = {Sep.},
url = {https://www.eurasip.org/proceedings/eusipco/eusipco2019/proceedings/papers/1570533520.pdf},
}
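To make the core idea concrete: a minimal NumPy sketch of KLD-based fitting of a rank-R nonnegative CP (canonical polyadic) model to a single 3-way empirical PMF tensor, using standard multiplicative updates that monotonically decrease the generalized KL divergence. This is an illustration only, not the paper's algorithm: the paper jointly (coupled) factorizes *all* triplet marginals of a higher-order tensor, whereas this toy fits one tensor; all function and variable names here are ours.

```python
import numpy as np

def kl_div(X, T, eps=1e-12):
    """Generalized KL divergence D(X || T) between nonnegative tensors."""
    return float(np.sum(X * np.log((X + eps) / (T + eps)) - X + T))

def cp_model(A, B, C):
    """Rank-R CP model: T[i,j,k] = sum_r A[i,r] B[j,r] C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def kl_cp_fit(X, R, n_iter=500, seed=0, eps=1e-12):
    """Fit a nonnegative rank-R CP model to a 3-way tensor X by
    alternating multiplicative updates (KL analogue of Lee-Seung NMF
    updates, applied to each mode unfolding in turn)."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.random((I, R))
    B = rng.random((J, R))
    C = rng.random((K, R))
    for _ in range(n_iter):
        T = cp_model(A, B, C) + eps
        A *= np.einsum('ijk,jr,kr->ir', X / T, B, C) / (B.sum(0) * C.sum(0))
        T = cp_model(A, B, C) + eps
        B *= np.einsum('ijk,ir,kr->jr', X / T, A, C) / (A.sum(0) * C.sum(0))
        T = cp_model(A, B, C) + eps
        C *= np.einsum('ijk,ir,jr->kr', X / T, A, B) / (A.sum(0) * B.sum(0))
    return A, B, C
```

Each mode update is exactly the KL multiplicative update for the corresponding matrix unfolding (e.g., mode-1: X_(1) ≈ A(C ⊙ B)^T), so the generalized KLD is nonincreasing across iterations; the nonnegativity of the factors is preserved automatically, which is one reason KLD fitting is natural for probability tensors.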