On the complexity of training neural networks with continuous activation functions. DasGupta, B., Siegelmann, H., & Sontag, E. IEEE Trans. Neural Networks, 6:1490–1504, 1995.
Blum and Rivest showed that any possible neural net learning algorithm based on fixed architectures faces severe computational barriers. This paper extends their NP-completeness result, which applied only to nets with hard-threshold activations, to nets that employ a particular continuous activation. In view of neural network practice, this result is more relevant to understanding the limitations of backpropagation and related techniques.
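The contrast driving the result is between hard-threshold and continuous activations. As a minimal illustrative sketch in Python (the abstract names only "a particular continuous activation"; the saturating-linear function below is an assumed stand-in for illustration, not necessarily the activation analyzed in the paper):

    # Hard-threshold activation of the kind covered by Blum and Rivest's result.
    def hard_threshold(x):
        return 1.0 if x >= 0 else 0.0

    # A continuous, piecewise-linear activation: identity on [0, 1],
    # clipped (saturated) outside that interval.
    def saturating_linear(x):
        return max(0.0, min(1.0, x))

    if __name__ == "__main__":
        for x in (-0.5, 0.25, 1.5):
            print(x, hard_threshold(x), saturating_linear(x))

The point of the paper is that moving from the first kind of activation to the second, which is closer to what gradient-based training actually uses, does not remove the NP-completeness barrier.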
@ARTICLE{DasGupta-TNN,
   AUTHOR       = {B. DasGupta and H.T. Siegelmann and E.D. Sontag},
   JOURNAL      = {IEEE Trans. Neural Networks},
   TITLE        = {On the complexity of training neural networks with 
      continuous activation functions},
   YEAR         = {1995},
   PAGES        = {1490--1504},
   VOLUME       = {6},
   KEYWORDS     = {neural networks, analog computing, theory of computing, 
      computational complexity, machine learning},
   PDF          = {../../FTPDIR/complexity-training.pdf},
   ABSTRACT     = { Blum and Rivest showed that any possible neural net 
      learning algorithm based on fixed architectures faces severe 
      computational barriers. This paper extends their NP-completeness 
      result, which applied only to nets with hard-threshold 
      activations, to nets that employ a particular continuous 
      activation. In view of neural network practice, this result is 
      more relevant to understanding the limitations of backpropagation 
      and related techniques. }
}
