Learning fast sparsifying overcomplete dictionaries. Rusu, C. & Thompson, J. In *2017 25th European Signal Processing Conference (EUSIPCO)*, pages 723-727, August 2017.


In this paper we propose a dictionary learning method that builds an overcomplete dictionary that is computationally efficient to manipulate, i.e., sparse approximation algorithms have sub-quadratic computational complexity. To achieve this, we design the dictionary from two factors, both learned from data: an orthonormal component made up of a fixed number of fast fundamental orthonormal transforms, and a sparse component that builds linear combinations of elements from the first, orthonormal component. We show how effective the proposed technique is at encoding image data and compare it against a previously proposed method from the literature. We expect this work to contribute to the spread of sparsity and dictionary learning techniques to hardware scenarios with hard limits on the computational capabilities and energy consumption of the computer systems.
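The structure described in the abstract can be illustrated with a minimal sketch (not the authors' exact algorithm; the choice of transforms, dimensions, and sparsity level below are illustrative assumptions): a dictionary of the form D = [T1 T2] S, where T1 and T2 are fast orthonormal transforms and S is a sparse matrix, so that applying D to a vector costs O(n log n + nnz(S)) rather than the O(nm) of a dense n × m dictionary.

```python
import numpy as np
from scipy.fft import dct, idct
from scipy.sparse import random as sparse_random

# Hedged sketch: D = [T1 T2] S with T1, T2 fast orthonormal transforms
# (orthonormal DCT and inverse DCT here) and S a sparse matrix.
# Applying D never requires forming it densely.

rng = np.random.default_rng(0)
n = 64                      # signal dimension (illustrative)
k = 2                       # number of fast orthonormal blocks
m = k * n                   # overcomplete: m = 2n atoms
S = sparse_random(m, m, density=0.05, random_state=0,
                  data_rvs=rng.standard_normal).tocsr()
transforms = [idct, dct]    # one fast orthonormal transform per block

def apply_dictionary(x):
    """Compute D x = [T1 T2] (S x) without forming D densely."""
    y = S @ x                        # sparse multiply: O(nnz(S))
    blocks = y.reshape(k, n)         # one coefficient block per transform
    return sum(transforms[i](blocks[i], norm="ortho") for i in range(k))

x = rng.standard_normal(m)
out = apply_dictionary(x)            # shape (n,)
```

In the paper, both S and the orthonormal blocks are learned from data; the fixed DCT pair here only stands in to show why matrix-vector products with such a dictionary stay cheap.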

@InProceedings{8081302,
  author    = {C. Rusu and J. Thompson},
  booktitle = {2017 25th European Signal Processing Conference (EUSIPCO)},
  title     = {Learning fast sparsifying overcomplete dictionaries},
  year      = {2017},
  pages     = {723-727},
  abstract  = {In this paper we propose a dictionary learning method that builds an overcomplete dictionary that is computationally efficient to manipulate, i.e., sparse approximation algorithms have sub-quadratic computational complexity. To achieve this, we design the dictionary from two factors, both learned from data: an orthonormal component made up of a fixed number of fast fundamental orthonormal transforms, and a sparse component that builds linear combinations of elements from the first, orthonormal component. We show how effective the proposed technique is at encoding image data and compare it against a previously proposed method from the literature. We expect this work to contribute to the spread of sparsity and dictionary learning techniques to hardware scenarios with hard limits on the computational capabilities and energy consumption of the computer systems.},
  keywords  = {approximation theory;computational complexity;iterative methods;learning (artificial intelligence);sparse matrices;image data;sparse component;fast fundamental orthonormal transforms;orthonormal component;sub-quadratic computational complexity;sparse approximation algorithms;dictionary learning method;overcomplete dictionaries;Dictionaries;Sparse matrices;Signal processing algorithms;Machine learning;Transforms;Linear programming;Approximation algorithms},
  doi       = {10.23919/EUSIPCO.2017.8081302},
  issn      = {2076-1465},
  month     = {Aug},
  url       = {https://www.eurasip.org/proceedings/eusipco/eusipco2017/papers/1570346733.pdf},
}

