Production without Rules: Using an Instance Memory Model to Exploit Structure in Natural Language. Johns, B. T., Jamieson, R. K., Crump, M. J. C., Jones, M. N., & Mewhort, D. J. K. Journal of Memory and Language, Accepted.
@article{JohnsetalJML,
  title = {Production without Rules: Using an Instance Memory Model to Exploit Structure in Natural Language},
  abstract = {Recent research in artificial grammar learning has shown that a simple instance model of memory can account for a wide variety of artificial grammar results (Jamieson \& Mewhort, 2009, 2010, 2011), indicating that language processing may have more in common with episodic memory than previously thought. These results have been used to develop new instance models of natural language processing, including a model of sentence comprehension (Johns \& Jones, 2015) and semantic memory (Jamieson, Avery, Johns, \& Jones, 2018). The foundations of the models lie in the storage and retrieval of episodic traces of linguistic experience. The current research extends the idea to account for natural language sentence production. We show that the structure of language itself provides sufficient information to generate syntactically correct sentences, even with no higher-level information (such as knowledge of a grammar) available to the model. This work provides insight into the highly structured nature of natural language, and into how instance memory models can exploit this structure. Additionally, it demonstrates the utility of using the formalisms developed in episodic memory research to understand performance in other domains, such as language processing.},
  author = {Johns, Brendan T. and Jamieson, Randall K. and Crump, Matthew J. C. and Jones, Michael N. and Mewhort, D. J. K.},
  journal = {Journal of Memory and Language},
  year = {Accepted}
}