Extending the Beta divergence to complex values. Vaz, C. & Narayanan, S. Pattern Recognition Letters, April 2020.
Various information-theoretic divergences have been proposed for the cost function in tasks such as matrix factorization and clustering. One class of divergence is called the Beta divergence. By varying a real-valued parameter beta, the Beta divergence connects several well-known divergences, such as the Euclidean distance, Kullback-Leibler divergence, and Itakura-Saito divergence. Unfortunately, the Beta divergence is properly defined only for positive real values, hindering its use for measuring distances between complex-valued data points. We define a new divergence, the Complex Beta divergence, that operates on complex values, and show that it coincides with the standard Beta divergence when the data is restricted to be in phase. Moreover, we show that different values of beta place different penalties on errors in magnitude and phase.
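As a quick illustration of the family the abstract describes, below is a minimal sketch of the standard (real-valued) Beta divergence for positive scalars, showing how beta = 2, 1, and 0 recover the half squared Euclidean distance, the generalized Kullback-Leibler divergence, and the Itakura-Saito divergence. The function name is a hypothetical helper for illustration; the complex-valued extension is the paper's contribution and is not reproduced here.

```python
import math

def beta_divergence(x, y, beta):
    """Scalar Beta divergence d_beta(x, y) for positive reals x, y.

    Illustrative sketch (not the paper's complex extension):
      beta = 2 -> half the squared Euclidean distance,
      beta = 1 -> generalized Kullback-Leibler divergence,
      beta = 0 -> Itakura-Saito divergence.
    """
    if beta == 1:  # limit as beta -> 1: generalized KL divergence
        return x * math.log(x / y) - x + y
    if beta == 0:  # limit as beta -> 0: Itakura-Saito divergence
        return x / y - math.log(x / y) - 1
    # General case, beta not in {0, 1}
    return (x ** beta + (beta - 1) * y ** beta
            - beta * x * y ** (beta - 1)) / (beta * (beta - 1))

# beta = 2 reduces to half the squared Euclidean distance:
print(beta_divergence(3.0, 1.0, 2))  # (3 - 1)^2 / 2 = 2.0
```

Note that all three special cases vanish when x = y, as a divergence must.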
@article{VAZ2020,
 abstract = {Various information-theoretic divergences have been proposed for the cost function in tasks such as matrix factorization and clustering. One class of divergence is called the Beta divergence. By varying a real-valued parameter beta, the Beta divergence connects several well-known divergences, such as the Euclidean distance, Kullback-Leibler divergence, and Itakura-Saito divergence. Unfortunately, the Beta divergence is properly defined only for positive real values, hindering its use for measuring distances between complex-valued data points. We define a new divergence, the Complex Beta divergence, that operates on complex values, and show that it coincides with the standard Beta divergence when the data is restricted to be in phase. Moreover, we show that different values of beta place different penalties on errors in magnitude and phase.},
 author = {Colin Vaz and Shrikanth Narayanan},
 doi = {10.1016/j.patrec.2020.11.005},
 issn = {0167-8655},
 journal = {Pattern Recognition Letters},
 keywords = {Information theory, KL divergence, Objective function, Young's inequality},
 link = {http://sail.usc.edu/publications/files/VazBetaDivergence-PRL.pdf},
 title = {Extending the Beta divergence to complex values},
 url = {http://www.sciencedirect.com/science/article/pii/S0167865520304104},
 year = {2020},
 month = apr
}
