ICA using kernel entropy estimation with nlogn complexity. Shwartz, S., Zibulevsky, M., & Schechner, Y. Y. LNCS, 3195:422--429, 2004.

Abstract: Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K²N).
@Article{ 2104,
title = "ICA using kernel entropy estimation with nlogn complexity",
author = "Sarit Shwartz and Michael Zibulevsky and Yoav Y Schechner",
	journal = "LNCS",
volume = "3195",
year = "2004",
pages = "422--429",
	abstract = "Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K^2 N).",
localfile = "/home/stephan/Daten/Arbeit/Paper\_Tutorials/Paper/040101\_2104\_ICAUsingKernelEntropyEstimationWithNLogNComplexity.pdf"
}
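The abstract refers to kernel (Parzen-window) entropy estimation, which the paper accelerates to O(N log N). As an illustrative sketch only (not the paper's accelerated algorithm), the quantity being estimated can be written as the resubstitution estimator H ≈ -(1/N) Σ log p̂(x_i), where p̂ is a Gaussian kernel density estimate. The direct form below costs O(N²) per evaluation; the bandwidth rule is an assumption (Silverman's rule of thumb), not necessarily the one used in the paper.

```python
import numpy as np

def kernel_entropy(x, bandwidth=None):
    """Naive kernel (Parzen) estimate of differential entropy.

    Approximates H(X) by the sample mean of -log p_hat(x_i), where p_hat
    is a Gaussian kernel density estimate. This direct evaluation is
    O(N^2); the cited paper's contribution is computing the same kind of
    estimate in O(N log N) via fast convolution.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman's rule of thumb for a Gaussian kernel (an assumption;
        # the paper's bandwidth selection may differ).
        bandwidth = 1.06 * x.std() * n ** (-1 / 5)
    # Pairwise scaled differences -> KDE evaluated at each sample point.
    diffs = (x[:, None] - x[None, :]) / bandwidth
    p_hat = np.exp(-0.5 * diffs ** 2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
    return -np.mean(np.log(p_hat))

# Sanity check against a known value: for X ~ N(0, 1) the differential
# entropy is 0.5 * log(2 * pi * e) ≈ 1.419 nats.
rng = np.random.default_rng(0)
samples = rng.standard_normal(2000)
h_est = kernel_entropy(samples)
```

In an ICA setting, an estimator like this would be evaluated on each of the K candidate source signals inside the optimization loop, which is why reducing the per-signal cost from O(N²) to O(N log N) matters.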