Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks. Adi, Y., Kermany, E., Belinkov, Y., Lavi, O., & Goldberg, Y. In International Conference on Learning Representations, 2017. Paper: http://arxiv.org/abs/1608.04207

Abstract: There is a lot of research interest in encoding variable-length sentences into fixed-length vectors in a way that preserves sentence meaning. Two common methods are representations based on averaging word vectors, and representations based on the hidden states of recurrent neural networks such as LSTMs. The sentence vectors are used as features for subsequent machine-learning tasks or for pre-training in the context of deep learning. However, not much is known about the properties encoded in these sentence representations or about the linguistic information they capture. We propose a framework that facilitates better understanding of the encoded representations. We define prediction tasks around isolated aspects of sentence structure (namely sentence length, word content, and word order), and score representations by the ability to train a classifier to solve each prediction task when using the representation as input. We demonstrate the potential contribution of the approach by analyzing different sentence representation mechanisms. The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low-level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.
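The probing setup the abstract describes is simple enough to sketch in a few lines. The following is a minimal, illustrative sketch, not the authors' code: random word vectors stand in for a trained encoder (the paper probes CBOW and LSTM auto-encoder embeddings), a CBOW-style average produces the sentence embedding, and a small scikit-learn classifier is trained on the auxiliary length-prediction task. The corpus, bin edges, and classifier settings are assumptions made for the example.

```python
# Minimal sketch of the auxiliary-prediction ("diagnostic classifier") setup.
# Everything here is illustrative: random vectors stand in for a trained
# encoder, and the length bins are assumed, not the paper's configuration.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
vocab_size, dim = 1000, 300
word_vectors = rng.normal(size=(vocab_size, dim))

def embed(sentence):
    """CBOW-style sentence embedding: the average of its word vectors."""
    return word_vectors[sentence].mean(axis=0)

# Synthetic corpus: word-id sequences of varying length. The auxiliary task
# is to recover the (binned) sentence length from the embedding alone.
sentences = [rng.integers(0, vocab_size, size=rng.integers(3, 30))
             for _ in range(2000)]
X = np.stack([embed(s) for s in sentences])
y = np.digitize([len(s) for s in sentences], bins=[5, 10, 15, 20, 25])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
# The paper scores representations with a small feed-forward classifier over
# the fixed embedding; scikit-learn's one-hidden-layer MLP plays that role.
clf = MLPClassifier(hidden_layer_sizes=(100,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"length-prediction accuracy: {clf.score(X_test, y_test):.2f}")
```

The other two tasks follow the same pattern with different labels and inputs: word content asks whether a given word occurs in the sentence (classifier input is the sentence embedding concatenated with the word's vector), and word order asks which of two words appears first.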
@inproceedings{Adi2017,
abstract = {There is a lot of research interest in encoding variable length sentences into fixed length vectors, in a way that preserves the sentence meanings. Two common methods include representations based on averaging word vectors, and representations based on the hidden states of recurrent neural networks such as LSTMs. The sentence vectors are used as features for subsequent machine learning tasks or for pre-training in the context of deep learning. However, not much is known about the properties that are encoded in these sentence representations and about the language information they capture. We propose a framework that facilitates better understanding of the encoded representations. We define prediction tasks around isolated aspects of sentence structure (namely sentence length, word content, and word order), and score representations by the ability to train a classifier to solve each prediction task when using the representation as input. We demonstrate the potential contribution of the approach by analyzing different sentence representation mechanisms. The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.},
archivePrefix = {arXiv},
arxivId = {1608.04207},
author = {Adi, Yossi and Kermany, Einat and Belinkov, Yonatan and Lavi, Ofer and Goldberg, Yoav},
booktitle = {International Conference on Learning Representations},
eprint = {1608.04207},
keywords = {method: diagnostic classifier},
title = {{Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks}},
url = {http://arxiv.org/abs/1608.04207},
year = {2017}
}
{"_id":"webiZXDdk4JPaNBXS","bibbaseid":"adi-kermany-belinkov-lavi-goldberg-finegrainedanalysisofsentenceembeddingsusingauxiliarypredictiontasks-2017","authorIDs":[],"author_short":["Adi, Y.","Kermany, E.","Belinkov, Y.","Lavi, O.","Goldberg, Y."],"bibdata":{"bibtype":"inproceedings","type":"inproceedings","abstract":"There is a lot of research interest in encoding variable length sentences into fixed length vectors, in a way that preserves the sentence meanings. Two common methods include representations based on averaging word vectors, and representations based on the hidden states of recurrent neural networks such as LSTMs. The sentence vectors are used as features for subsequent machine learning tasks or for pre-training in the context of deep learning. However, not much is known about the properties that are encoded in these sentence representations and about the language information they capture. We propose a framework that facilitates better understanding of the encoded representations. We define prediction tasks around isolated aspects of sentence structure (namely sentence length, word content, and word order), and score representations by the ability to train a classifier to solve each prediction task when using the representation as input. We demonstrate the potential contribution of the approach by analyzing different sentence representation mechanisms. The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.","archiveprefix":"arXiv","arxivid":"1608.04207","author":[{"propositions":[],"lastnames":["Adi"],"firstnames":["Yossi"],"suffixes":[]},{"propositions":[],"lastnames":["Kermany"],"firstnames":["Einat"],"suffixes":[]},{"propositions":[],"lastnames":["Belinkov"],"firstnames":["Yonatan"],"suffixes":[]},{"propositions":[],"lastnames":["Lavi"],"firstnames":["Ofer"],"suffixes":[]},{"propositions":[],"lastnames":["Goldberg"],"firstnames":["Yoav"],"suffixes":[]}],"booktitle":"International Conference on Learning Representations","eprint":"1608.04207","file":":Users/shanest/Documents/Library/Adi et al/International Conference on Learning Representations/Adi et al. - 2017 - Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks.pdf:pdf","keywords":"method: diagnostic classifier","title":"Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks","url":"http://arxiv.org/abs/1608.04207","year":"2017","bibtex":"@inproceedings{Adi2017,\nabstract = {There is a lot of research interest in encoding variable length sentences into fixed length vectors, in a way that preserves the sentence meanings. Two common methods include representations based on averaging word vectors, and representations based on the hidden states of recurrent neural networks such as LSTMs. The sentence vectors are used as features for subsequent machine learning tasks or for pre-training in the context of deep learning. However, not much is known about the properties that are encoded in these sentence representations and about the language information they capture. We propose a framework that facilitates better understanding of the encoded representations. 
We define prediction tasks around isolated aspects of sentence structure (namely sentence length, word content, and word order), and score representations by the ability to train a classifier to solve each prediction task when using the representation as input. We demonstrate the potential contribution of the approach by analyzing different sentence representation mechanisms. The analysis sheds light on the relative strengths of different sentence embedding methods with respect to these low level prediction tasks, and on the effect of the encoded vector's dimensionality on the resulting representations.},\narchivePrefix = {arXiv},\narxivId = {1608.04207},\nauthor = {Adi, Yossi and Kermany, Einat and Belinkov, Yonatan and Lavi, Ofer and Goldberg, Yoav},\nbooktitle = {International Conference on Learning Representations},\neprint = {1608.04207},\nfile = {:Users/shanest/Documents/Library/Adi et al/International Conference on Learning Representations/Adi et al. - 2017 - Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks.pdf:pdf},\nkeywords = {method: diagnostic classifier},\ntitle = {{Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks}},\nurl = {http://arxiv.org/abs/1608.04207},\nyear = {2017}\n}\n","author_short":["Adi, Y.","Kermany, E.","Belinkov, Y.","Lavi, O.","Goldberg, Y."],"key":"Adi2017","id":"Adi2017","bibbaseid":"adi-kermany-belinkov-lavi-goldberg-finegrainedanalysisofsentenceembeddingsusingauxiliarypredictiontasks-2017","role":"author","urls":{"Paper":"http://arxiv.org/abs/1608.04207"},"keyword":["method: diagnostic classifier"],"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"inproceedings","biburl":"https://www.shane.st/teaching/575/win20/MachineLearning-interpretability.bib","creationDate":"2020-01-09T18:15:21.652Z","downloads":0,"keywords":["method: diagnostic classifier"],"search_terms":["fine","grained","analysis","sentence","embeddings","using","auxiliary","prediction","tasks","adi","kermany","belinkov","lavi","goldberg"],"title":"Fine-grained Analysis of Sentence Embeddings Using Auxiliary Prediction Tasks","year":2017,"dataSources":["okYcdTpf4JJ2zkj7A","znj7izS5PeehdLR3G"]}