Inside-Outside and Forward-Backward Algorithms Are Just Backprop (Tutorial Paper). Eisner, J. EMNLP 2016.

Abstract: A probabilistic or weighted grammar implies a posterior probability distribution over possible parses of a given input sentence. One often needs to extract information from this distribution, by computing the expected counts (in the unknown parse) of various grammar.
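The paper's central observation can be sketched in a few lines of JAX (an assumption; any autodiff framework would do): run the forward algorithm of an HMM to compute the log-partition function, then backprop through it. The gradient of log Z with respect to the transition log-potentials is exactly the matrix of posterior expected transition counts that the forward-backward algorithm computes. The tiny 2-state, 3-observation HMM below is a hypothetical example.

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp

def log_Z(log_trans, log_emit_seq, log_init):
    """Forward algorithm in log space: returns log p(observations)."""
    alpha = log_init + log_emit_seq[0]  # log alpha_1(s)
    for log_emit in log_emit_seq[1:]:
        # alpha_t(s') = logsumexp_s [ alpha_{t-1}(s) + log A[s,s'] ] + log b_{s'}(o_t)
        alpha = logsumexp(alpha[:, None] + log_trans, axis=0) + log_emit
    return logsumexp(alpha)

# Hypothetical parameters: 2 hidden states, 3 time steps.
log_trans = jnp.log(jnp.array([[0.7, 0.3], [0.4, 0.6]]))
log_init = jnp.log(jnp.array([0.5, 0.5]))
# Emission log-likelihoods of each observed symbol under each state.
log_emit_seq = jnp.log(jnp.array([[0.9, 0.2], [0.1, 0.8], [0.9, 0.2]]))

# Backprop through log Z w.r.t. the transition log-potentials yields the
# posterior expected count of each transition s -> s' -- the same quantity
# the backward pass of forward-backward produces.
expected_counts = jax.grad(log_Z)(log_trans, log_emit_seq, log_init)

# Sanity check: every path through T=3 observations uses T-1 = 2
# transitions, so the expected counts must sum to 2.
print(float(expected_counts.sum()))
```

The same identity underlies the inside-outside algorithm: differentiating the inside algorithm's log-partition with respect to rule log-weights yields expected rule counts.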
@Article{Eisner2016,
  author   = {Eisner, J.},
  title    = {Inside-Outside and Forward-Backward Algorithms Are Just Backprop (Tutorial Paper)},
  journal  = {EMNLP 2016},
  year     = {2016},
  abstract = {A probabilistic or weighted grammar implies a posterior probability distribution over possible parses of a given input sentence. One often needs to extract information from this distribution, by computing the expected counts (in the unknown parse) of various grammar.}
}