ICA using kernel entropy estimation with nlogn complexity. Shwartz, S., Zibulevsky, M., & Schechner, Y. Y. LNCS, 3195:422–429, 2004.
Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K²N).
@Article{ 2104,
	title = "ICA using kernel entropy estimation with nlogn complexity",
	author = "Sarit Shwartz and Michael Zibulevsky and Yoav Y Schechner",
	journal = "LNCS",
	volume = "3195",
	year = "2004",
	pages = "422--429",
	abstract = "Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K^2 N).",
	localfile = "/home/stephan/Daten/Arbeit/Paper\_Tutorials/Paper/040101\_2104\_ICAUsingKernelEntropyEstimationWithNLogNComplexity.pdf"
}
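
The O(KN log N) term comes from accelerating the per-source entropy estimate: a naive Parzen-window estimate costs O(N^2), because the kernel density must be evaluated at every sample, whereas binning the samples onto a uniform grid and convolving the histogram with the kernel via the FFT brings it down to roughly O(N log N). The following is a minimal sketch of that idea, not the authors' implementation; the Gaussian kernel, Silverman bandwidth rule, grid size, and the function name kernel_entropy are all illustrative assumptions.

import numpy as np
from scipy.signal import fftconvolve

def kernel_entropy(y, n_bins=1024, bandwidth=None):
    # Approximate the differential entropy H(y) = -E[log p(y)] of 1-D
    # samples from a Parzen (kernel) density estimate. Instead of the
    # naive O(N^2) pairwise evaluation, the samples are binned onto a
    # uniform grid and the density is obtained by FFT convolution of
    # the histogram with the kernel: O(N + M log M) for M grid bins.
    y = np.asarray(y, dtype=float)
    n = y.size
    if bandwidth is None:
        # Silverman's rule of thumb (an assumption; the paper may use
        # a different bandwidth selector).
        bandwidth = 1.06 * y.std() * n ** (-0.2)
    lo = y.min() - 4.0 * bandwidth
    hi = y.max() + 4.0 * bandwidth
    edges = np.linspace(lo, hi, n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    hist, _ = np.histogram(y, bins=edges)
    # Gaussian kernel K_h sampled on the grid, centered mid-grid so that
    # fftconvolve(..., mode='same') keeps the density aligned with it.
    u = (centers - centers[n_bins // 2]) / bandwidth
    kernel = np.exp(-0.5 * u ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    density = fftconvolve(hist, kernel, mode='same') / n
    # Resubstitution estimate: read p_hat off each sample's nearest bin.
    dx = edges[1] - edges[0]
    idx = np.clip(((y - lo) / dx).astype(int), 0, n_bins - 1)
    return -np.mean(np.log(np.maximum(density[idx], 1e-300)))

As a sanity check, for zero-mean unit-variance Gaussian samples the estimate should come out near 0.5·log(2πe) ≈ 1.419. In a full ICA loop along the lines of the paper, such per-source entropy estimates would be summed over the K unmixed outputs and minimized with respect to the unmixing matrix, giving the stated O(KN log N + K²N) cost per iteration.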
