Latent Dirichlet Allocation. Blei, D. M., Ng, A. Y., & Jordan, M. I. J. Mach. Learn. Res., 3:993–1022, 2003.

Abstract: We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.
@article{blei_latent_2003,
	title = {Latent {Dirichlet} {Allocation}},
	volume = {3},
	issn = {1532-4435},
	url = {http://dl.acm.org/citation.cfm?id=944919.944937},
	abstract = {We describe latent Dirichlet allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. LDA is a three-level hierarchical Bayesian model, in which each item of a collection is modeled as a finite mixture over an underlying set of topics. Each topic is, in turn, modeled as an infinite mixture over an underlying set of topic probabilities. In the context of text modeling, the topic probabilities provide an explicit representation of a document. We present efficient approximate inference techniques based on variational methods and an EM algorithm for empirical Bayes parameter estimation. We report results in document modeling, text classification, and collaborative filtering, comparing to a mixture of unigrams model and the probabilistic LSI model.},
	language = {en},
	urldate = {2014-03-29},
	journal = {J. Mach. Learn. Res.},
	author = {Blei, David M. and Ng, Andrew Y. and Jordan, Michael I.},
	year = {2003},
	pages = {993--1022},
}