Generative linguistics and neural networks at 60: Foundation, friction, and fusion. Pater, J. Language, 2019.
The birthdate of both generative linguistics and neural networks can be taken as 1957, the year of the publication of foundational work by both Noam Chomsky and Frank Rosenblatt. This paper traces the development of these two approaches to cognitive science, from their largely autonomous early development in their first thirty years, through their collision in the 1980s around the past tense debate (Rumelhart and McClelland 1986, Pinker and Prince 1988), and their integration in much subsequent work up to the present, 2017. Although this integration has produced a considerable body of results, the continued general gulf between these two lines of research is likely impeding progress in both: on learning in generative linguistics, and on the representation of language in neural modeling. The paper concludes with a brief argument that generative linguistics is unlikely to fulfill its promise of accounting for language learning if it continues to maintain its distance from neural and other statistical approaches to learning.
@article{Pater2019,
author = {Pater, Joe},
doi = {10.1353/lan.2019.0005},
issn = {1535-0665},
journal = {Language},
keywords = {position,survey},
number = {1},
title = {{Generative linguistics and neural networks at 60: Foundation, friction, and fusion}},
url = {https://muse.jhu.edu/article/718444},
volume = {95},
year = {2019}
}
