Expectation–maximization algorithm. February 2025. Page Version ID: 1275108110
In statistics, an expectation–maximization (EM) algorithm is an iterative method to find (local) maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. The EM iteration alternates between performing an expectation (E) step, which creates a function for the expectation of the log-likelihood evaluated using the current estimate for the parameters, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step. It can be used, for example, to estimate a mixture of Gaussians, or to solve the multiple linear regression problem.
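The E-step/M-step alternation described in the abstract can be sketched for the mixture-of-Gaussians example it mentions. The following is a minimal illustrative implementation for a two-component 1-D Gaussian mixture; the function name, the median-split initialization, and the fixed iteration count are choices made here for the sketch, not details from the cited article.

```python
import math
import random

def em_gmm_1d(xs, n_iter=50):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (weights, means, variances). Initialization is a crude
    median split of the sorted data, purely for illustration.
    """
    xs = sorted(xs)
    half = len(xs) // 2
    mu = [sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(n_iter):
        # E step: compute responsibilities r[i][k] = P(component k | x_i)
        # under the current parameter estimates (the expectation over the
        # latent component assignments).
        r = []
        for x in xs:
            p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            r.append([pk / s for pk in p])
        # M step: re-estimate weights, means, and variances to maximize
        # the expected log-likelihood computed in the E step.
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            w[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2 for ri, x in zip(r, xs)) / nk
    return w, mu, var

# Example: data drawn from two well-separated Gaussians.
random.seed(0)
data = [random.gauss(-4, 1) for _ in range(200)] + \
       [random.gauss(4, 1) for _ in range(200)]
weights, means, variances = em_gmm_1d(data)
```

On well-separated data like this, the recovered means settle near the true component means and the mixture weights near 0.5 each; with overlapping components, EM still converges, but only to a local maximum of the likelihood, as the abstract's "(local)" qualifier notes.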
@misc{noauthor_expectationmaximization_2025,
title = {Expectation–maximization algorithm},
copyright = {Creative Commons Attribution-ShareAlike License},
url = {https://en.wikipedia.org/w/index.php?title=Expectation%E2%80%93maximization_algorithm&oldid=1275108110},
abstract = {In statistics, an expectation–maximization (EM) algorithm is an iterative method to find (local) maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models, where the model depends on unobserved latent variables. The EM iteration alternates between performing an expectation (E) step, which creates a function for the expectation of the log-likelihood evaluated using the current estimate for the parameters, and a maximization (M) step, which computes parameters maximizing the expected log-likelihood found in the E step. These parameter estimates are then used to determine the distribution of the latent variables in the next E step. It can be used, for example, to estimate a mixture of Gaussians, or to solve the multiple linear regression problem.},
language = {en},
urldate = {2025-04-01},
journal = {Wikipedia},
month = feb,
year = {2025},
note = {Page Version ID: 1275108110},
}