Huffman Coding. In Furht, B., editor, Encyclopedia of Multimedia, pages 288–289. Springer US, 2008.
Abstract: The most popular entropy-based encoding technique is the Huffman code [1]. It provides the least amount of information units (bits) per source symbol. This short article describes how it works. The first step in the Huffman algorithm consists in creating a series of source reductions, by sorting the probabilities of each symbol and combining the (two) least probable symbols into a single symbol, which will then be used in the next source reduction stage. Figure 1 shows an example of consecutive source reductions. The original source symbols appear on the left-hand side, sorted in decreasing order by their probability of occurrence. In the first reduction, the two least probable symbols (a3 with prob. = 0.06 and a5 with prob. = 0.04) are combined into a composite symbol, whose probability is 0.06 + 0.04 = 0.1. This composite symbol and its probability are copied onto the first source reduction column at the proper slot (so as to enforce the requirement that the probabilities are sorted i ...
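The source-reduction procedure sketched in the abstract can be illustrated with a short Python example. Only the probabilities of a3 (0.06) and a5 (0.04) are given in the abstract; the remaining symbols and their probabilities below are illustrative assumptions, not values from the article. The heap-based merge is equivalent to repeatedly combining the two least probable symbols:

```python
import heapq
from itertools import count

def huffman_codes(probs):
    """Build a Huffman code for a {symbol: probability} map.

    Each heap pop/pop/push mirrors one 'source reduction' step:
    the two least probable entries are merged into a composite
    symbol whose probability is their sum.
    """
    tiebreak = count()  # avoids comparing dicts when probabilities tie
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable
        p2, _, c2 = heapq.heappop(heap)  # second least probable
        # Prepend one bit to every code in each merged subtree.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

# a3 = 0.06 and a5 = 0.04 come from the abstract; the rest are assumed.
probs = {"a2": 0.4, "a6": 0.3, "a1": 0.1, "a4": 0.1, "a3": 0.06, "a5": 0.04}
codes = huffman_codes(probs)
```

With these assumed probabilities, the first merge combines a3 and a5 into a composite of probability 0.1, exactly as in the abstract's Figure 1. The resulting code is prefix-free, and the average code length equals the Huffman optimum for this distribution regardless of how ties are broken.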
@incollection{furht_huffman_2008,
title = {Huffman {Coding}},
copyright = {©2008 Springer-Verlag},
isbn = {978-0-387-74724-8 978-0-387-78414-4},
url = {http://link.springer.com/referenceworkentry/10.1007/978-0-387-78414-4_338},
abstract = {The most popular entropy-based encoding technique is the Huffman code [1]. It provides the least amount of information units (bits) per source symbol. This short article describes how it works.The first step in the Huffman algorithm consists in creating a series of source reductions, by sorting the probabilities of each symbol and combining the (two) least probable symbols into a single symbol, which will then be used in the next source reduction stage. Figure 1 shows an example of consecutive source reductions. The original source symbols appear on the left-hand side, sorted in decreasing order by their probability of occurrence. In the first reduction, the two least probable symbols (a3 with prob. = 0.06 and a5 with prob. = 0.04) are combined into a composite symbol, whose probability is 0.06 + 0.04 = 0.1. This composite symbol and its probability are copied onto the first source reduction column at the proper slot (so as to enforce the requirement that the probabilities are sorted i ...},
language = {en},
urldate = {2016-05-03},
booktitle = {Encyclopedia of {Multimedia}},
publisher = {Springer US},
editor = {Furht, Borko},
year = {2008},
pages = {288--289}
}