Learning concept embeddings for query expansion by quantum entropy minimization. Sordoni, A., Bengio, Y., & Nie, J.-Y. In Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence (AAAI'14), pages 1586–1592, Québec City, Québec, Canada, July 2014. AAAI Press.
In web search, users' queries are formulated using only a few terms, and term-matching retrieval functions can fail to retrieve relevant documents. Given a user query, the technique of query expansion (QE) consists of selecting related terms that could enhance the likelihood of retrieving relevant documents. Selecting such expansion terms is challenging and requires a computational framework capable of encoding complex semantic relationships. In this paper, we propose a novel method for learning, in a supervised way, semantic representations for words and phrases. By embedding queries and documents in special matrices, our model has greater representational power than existing approaches that adopt a vector representation. We show that our model produces high-quality query expansion terms. Our expansion increases IR measures beyond expansions from current word-embedding models and well-established traditional QE methods.
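
A rough sketch of the matrix machinery the abstract alludes to (generic density-matrix notation assumed here, not the paper's exact parameterization): each term i is mapped to a unit vector u_i, and a query q is represented as a density matrix, i.e. a mixture of rank-one projectors,

\rho_q = \sum_{i \in q} p_i \, u_i u_i^{\top}, \qquad p_i \ge 0, \quad \sum_{i \in q} p_i = 1,

which is positive semidefinite with unit trace. Under this setup, a candidate expansion term t can be scored by the trace rule \operatorname{tr}(\rho_q \, u_t u_t^{\top}) = \sum_{i \in q} p_i \, (u_i^{\top} u_t)^2, and a plausible reading of the title's "quantum entropy minimization" is fitting such representations by minimizing a von Neumann (quantum relative) entropy between query and document matrices, S(\rho_q \,\|\, \rho_d) = \operatorname{tr}\!\big(\rho_q (\log \rho_q - \log \rho_d)\big); see the paper for the actual training objective.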
@inproceedings{sordoni_learning_2014,
	address = {Québec City, Québec, Canada},
	series = {{AAAI}'14},
	title = {Learning concept embeddings for query expansion by quantum entropy minimization},
	abstract = {In web search, users' queries are formulated using only a few terms, and term-matching retrieval functions can fail to retrieve relevant documents. Given a user query, the technique of query expansion (QE) consists of selecting related terms that could enhance the likelihood of retrieving relevant documents. Selecting such expansion terms is challenging and requires a computational framework capable of encoding complex semantic relationships. In this paper, we propose a novel method for learning, in a supervised way, semantic representations for words and phrases. By embedding queries and documents in special matrices, our model has greater representational power than existing approaches that adopt a vector representation. We show that our model produces high-quality query expansion terms. Our expansion increases IR measures beyond expansions from current word-embedding models and well-established traditional QE methods.},
	urldate = {2020-04-06},
	booktitle = {Proceedings of the {Twenty}-{Eighth} {AAAI} {Conference} on {Artificial} {Intelligence}},
	publisher = {AAAI Press},
	author = {Sordoni, Alessandro and Bengio, Yoshua and Nie, Jian-Yun},
	month = jul,
	year = {2014},
	pages = {1586--1592}
}
