Capabilities and training of feedforward nets. Sontag, E. In Neural networks (New Brunswick, NJ, 1990), pages 303–321. Academic Press, Boston, MA, 1991.

Abstract: This paper surveys recent work by the author on learning and representational capabilities of feedforward nets. The learning results show that, among two possible variants of the so-called backpropagation training method for sigmoidal nets, both of which variants are used in practice, one is a better generalization of the older perceptron training algorithm than the other. The representation results show that nets consisting of sigmoidal neurons have at least twice the representational capabilities of nets that use classical threshold neurons, at least when this increase is quantified in terms of classification power. On the other hand, threshold nets are shown to be more useful when approximating implicit functions, as illustrated with an application to a typical control problem.

BibTeX:
@INCOLLECTION{MR1114761,
AUTHOR = {E.D. Sontag},
BOOKTITLE = {Neural networks (New Brunswick, NJ, 1990)},
PUBLISHER = {Academic Press},
TITLE = {Capabilities and training of feedforward nets},
YEAR = {1991},
ADDRESS = {Boston, MA},
PAGES = {303--321},
KEYWORDS = {neural networks},
PDF = {../../FTPDIR/90caip.pdf},
ABSTRACT = { This paper surveys recent work by the author on
learning and representational capabilities of feedforward nets. The
learning results show that, among two possible variants of the
so-called backpropagation training method for sigmoidal nets, both of
which variants are used in practice, one is a better generalization
of the older perceptron training algorithm than the other. The
representation results show that nets consisting of sigmoidal neurons
have at least twice the representational capabilities of nets that
use classical threshold neurons, at least when this increase is
quantified in terms of classification power. On the other hand,
threshold nets are shown to be more useful when approximating
implicit functions, as illustrated with an application to a typical
control problem. }
}
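For orientation, here is a minimal Python sketch (assuming NumPy; the function names and toy data are illustrative choices, not constructions from the paper) of the standard objects the abstract contrasts: a classical threshold neuron, a sigmoidal neuron, and one step of the classical perceptron training rule that the backpropagation variants discussed in the paper generalize.

import numpy as np

def threshold_unit(w, b, x):
    # Classical threshold (Heaviside) neuron: hard 0/1 output.
    return 1.0 if np.dot(w, x) + b > 0 else 0.0

def sigmoid_unit(w, b, x):
    # Sigmoidal neuron: smooth, graded output in (0, 1).
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

def perceptron_step(w, b, x, y, lr=1.0):
    # One update of the classical perceptron rule for a labeled example
    # (y in {0, 1}); the weights change only when x is misclassified.
    err = y - threshold_unit(w, b, x)
    return w + lr * err * x, b + lr * err

# Toy usage: learn the (linearly separable) OR function on {0, 1}^2.
w, b = np.zeros(2), 0.0
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
for _ in range(10):
    for x, y in data:
        w, b = perceptron_step(w, b, np.array(x, dtype=float), y)

The two unit types differ only in their activation; the paper's comparison of classification power concerns what networks built from each type of unit can represent.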