Distributional semantics using neural networks. Svoboda, L. Technical Report DCSE/TR-2016-0, University of West Bohemia in Pilsen, Brno, June, 2016.
In recent years, neural networks have shown significant improvements in capturing the semantics of words and sentences. They have also improved language modeling, which is crucial for many tasks in Natural Language Processing (NLP). One of the most widely used Artificial Neural Network (ANN) architectures in NLP is the Recurrent Neural Network (RNN), which is not restricted to a fixed-size context. Through recurrent connections, information can cycle inside these networks for an arbitrarily long time. This thesis summarizes the state-of-the-art approaches to distributional semantics. The thesis also focuses on the further use of ANNs for NLP problems.
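The recurrent mechanism the abstract describes can be sketched in a few lines. This is a minimal illustrative example, not code from the thesis: a single tanh recurrent cell whose hidden state is fed back at every step, so the same cell can consume a sequence of any length rather than a fixed-size context window. All names and dimensions here are arbitrary choices for the sketch.

```python
import math
import random

def rnn_step(x, h, W_xh, W_hh, b):
    """One recurrent step: h' = tanh(W_xh x + W_hh h + b), in plain Python."""
    return [
        math.tanh(
            sum(W_xh[i][j] * x[i] for i in range(len(x)))      # input contribution
            + sum(W_hh[k][j] * h[k] for k in range(len(h)))    # recurrent contribution
            + b[j]
        )
        for j in range(len(h))
    ]

random.seed(0)
d_in, d_hid = 3, 5  # toy sizes, chosen only for the sketch
W_xh = [[random.uniform(-0.1, 0.1) for _ in range(d_hid)] for _ in range(d_in)]
W_hh = [[random.uniform(-0.1, 0.1) for _ in range(d_hid)] for _ in range(d_hid)]
b = [0.0] * d_hid

h = [0.0] * d_hid
for _ in range(10):  # sequence length is arbitrary; the same cell is reused each step
    x_t = [random.uniform(-1.0, 1.0) for _ in range(d_in)]
    h = rnn_step(x_t, h, W_xh, W_hh, b)

print(len(h))  # hidden state keeps a fixed size (5) regardless of sequence length
```

Because the hidden state is reused across steps, information from early inputs can, in principle, influence the state arbitrarily many steps later, which is the property the abstract contrasts with fixed-context models.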
