DeepMP for Non–Negative Sparse Decomposition. Voulgaris, K. A., Davies, M. E., & Yaghoobi, M. In 2020 28th European Signal Processing Conference (EUSIPCO), pages 2035-2039, Aug, 2020.
Non-negative signals form an important class of sparse signals. Many algorithms have been proposed to recover such non-negative representations, with greedy and convex relaxation algorithms among the most popular methods. Greedy techniques are low-computational-cost algorithms that have also been modified to incorporate the non-negativity of the representations. One such modification has been proposed for Matching Pursuit (MP) based algorithms, which first chooses positive coefficients and uses a non-negative optimisation technique that guarantees the non-negativity of the coefficients. The performance of greedy algorithms, like that of all non-exhaustive search methods, suffers when the linear generative model, called the dictionary, is highly coherent. We here first reformulate the non-negative matching pursuit algorithm in the form of a deep neural network. We then show that, after training, the proposed model yields a significant improvement in exact recovery performance compared to other non-trained greedy algorithms, while keeping the complexity low.
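The non-negative matching pursuit scheme described in the abstract — greedily selecting the atom with the largest positive correlation and keeping all coefficients non-negative — can be illustrated with a minimal sketch. This is not the paper's DeepMP network, only the classical untrained baseline it builds on; the function name `nnmp` and the simple additive coefficient update are assumptions for illustration.

```python
import numpy as np

def nnmp(D, y, k):
    """Non-negative matching pursuit sketch.

    D : (m, n) dictionary with unit-norm columns
    y : (m,) signal to decompose
    k : maximum number of greedy iterations (sparsity budget)
    """
    r = y.astype(float).copy()       # residual
    x = np.zeros(D.shape[1])         # non-negative coefficient vector
    for _ in range(k):
        c = D.T @ r                  # correlations with the residual
        j = int(np.argmax(c))        # atom with largest *positive* correlation
        if c[j] <= 0:
            break                    # no positive coefficient remains; stop
        x[j] += c[j]                 # update keeps x element-wise non-negative
        r -= c[j] * D[:, j]          # subtract the selected atom's contribution
    return x
```

For an orthonormal dictionary (e.g. the identity), the sketch recovers the positive entries of the signal exactly; performance degrades as the coherence between dictionary atoms grows, which is the regime the paper's trained model targets.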
@InProceedings{9287711,
  author = {K. A. Voulgaris and M. E. Davies and M. Yaghoobi},
  booktitle = {2020 28th European Signal Processing Conference (EUSIPCO)},
  title = {DeepMP for Non–Negative Sparse Decomposition},
  year = {2020},
  pages = {2035--2039},
  abstract = {Non-negative signals form an important class of sparse signals. Many algorithms have been proposed to recover such non-negative representations, with greedy and convex relaxation algorithms among the most popular methods. Greedy techniques are low-computational-cost algorithms that have also been modified to incorporate the non-negativity of the representations. One such modification has been proposed for Matching Pursuit (MP) based algorithms, which first chooses positive coefficients and uses a non-negative optimisation technique that guarantees the non-negativity of the coefficients. The performance of greedy algorithms, like that of all non-exhaustive search methods, suffers when the linear generative model, called the dictionary, is highly coherent. We here first reformulate the non-negative matching pursuit algorithm in the form of a deep neural network. We then show that, after training, the proposed model yields a significant improvement in exact recovery performance compared to other non-trained greedy algorithms, while keeping the complexity low.},
  keywords = {Greedy algorithms;Training;Search methods;Neural networks;Matching pursuit algorithms;Signal processing algorithms;Signal processing;Matching Pursuit;Non-negative Sparse Approximations;Multilabel Classification;Deep Neural Networks},
  doi = {10.23919/Eusipco47968.2020.9287711},
  issn = {2076-1465},
  month = {Aug},
}