Feature Discovery by Competitive Learning*. Rumelhart, D. E. & Zipser, D. Cognitive Science, 9(1):75--112, 1985.
This paper reports the results of our studies with an unsupervised learning paradigm which we have called “Competitive Learning.” We have examined competitive learning using both computer simulation and formal analysis and have found that when it is applied to parallel networks of neuron-like elements, many potentially useful learning tasks can be accomplished. We were attracted to competitive learning because it seems to provide a way to discover the salient, general features which can be used to classify a set of patterns. We show how a very simple competitive mechanism can discover a set of feature detectors which capture important aspects of the set of stimulus input patterns. We also show how these feature detectors can form the basis of a multilayer system that can serve to learn categorizations of stimulus sets which are not linearly separable. We show how the use of correlated stimuli can serve as a kind of “teaching” input to the system to allow the development of feature detectors which would not develop otherwise. Although we find the competitive learning mechanism a very interesting and powerful learning principle, we do not, of course, imagine that it is the only learning principle. Competitive learning is an essentially nonassociative statistical learning scheme. We certainly imagine that other kinds of learning mechanisms will be involved in the building of associations among patterns of activation in a more complete neural network. We offer this analysis of these competitive learning mechanisms to further our understanding of how simple adaptive networks can discover features important in the description of the stimulus environment in which the system finds itself.
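The winner-take-all weight update described in the abstract can be sketched as follows. This is a minimal illustration of the general competitive learning scheme, not the paper's exact simulation: the function name, the learning-rate symbol `g`, and the toy binary patterns are all assumptions made for the example. Each unit's weights sum to one; on each stimulus, the winning unit gives up a fraction `g` of its weight and redistributes it evenly across the active input lines.

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_learning(patterns, n_units=2, g=0.1, epochs=50):
    """Winner-take-all competitive learning sketch.

    Each row of `w` (one unit's weights) is kept normalized to sum
    to 1; only the unit with the largest activation updates.
    """
    n_inputs = patterns.shape[1]
    # random initial weights, normalized so each unit's weights sum to 1
    w = rng.random((n_units, n_inputs))
    w /= w.sum(axis=1, keepdims=True)
    for _ in range(epochs):
        for x in patterns:
            winner = np.argmax(w @ x)   # unit with largest activation wins
            n_active = x.sum()          # number of active input lines
            # shed fraction g of all weight, regain it on the active lines;
            # this leaves the winner's weight sum unchanged at 1
            w[winner] += g * (x / n_active) - g * w[winner]
    return w

# hypothetical binary stimulus patterns forming two rough clusters
patterns = np.array([[1, 1, 0, 0],
                     [1, 1, 1, 0],
                     [0, 0, 1, 1],
                     [0, 1, 1, 1]], dtype=float)
w = competitive_learning(patterns)
```

After training, each unit's weight vector concentrates on one cluster of correlated input lines, so the units act as the feature detectors the abstract describes.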
@article{rumelhart_feature_1985,
	title = {Feature {Discovery} by {Competitive} {Learning}*},
	volume = {9},
	copyright = {© 1985 Cognitive Science Society, Inc.},
	issn = {1551-6709},
	url = {http://onlinelibrary.wiley.com/doi/10.1207/s15516709cog0901_5/abstract},
	doi = {10.1207/s15516709cog0901_5},
	abstract = {This paper reports the results of our studies with an unsupervised learning paradigm which we have called “Competitive Learning.” We have examined competitive learning using both computer simulation and formal analysis and have found that when it is applied to parallel networks of neuron-like elements, many potentially useful learning tasks can be accomplished. We were attracted to competitive learning because it seems to provide a way to discover the salient, general features which can be used to classify a set of patterns. We show how a very simple competitive mechanism can discover a set of feature detectors which capture important aspects of the set of stimulus input patterns. We also show how these feature detectors can form the basis of a multilayer system that can serve to learn categorizations of stimulus sets which are not linearly separable. We show how the use of correlated stimuli can serve as a kind of “teaching” input to the system to allow the development of feature detectors which would not develop otherwise. Although we find the competitive learning mechanism a very interesting and powerful learning principle, we do not, of course, imagine that it is the only learning principle. Competitive learning is an essentially nonassociative statistical learning scheme. We certainly imagine that other kinds of learning mechanisms will be involved in the building of associations among patterns of activation in a more complete neural network. We offer this analysis of these competitive learning mechanisms to further our understanding of how simple adaptive networks can discover features important in the description of the stimulus environment in which the system finds itself.},
	language = {en},
	number = {1},
	urldate = {2015-02-04},
	journal = {Cognitive Science},
	author = {Rumelhart, David E. and Zipser, David},
	year = {1985},
	pages = {75--112}
}