In *Encyclopedia of Multimedia*, pages 288–289. Springer US, 2008.


The most popular entropy-based encoding technique is the Huffman code [1]. It provides the least amount of information units (bits) per source symbol. This short article describes how it works. The first step in the Huffman algorithm consists in creating a series of source reductions, by sorting the probabilities of each symbol and combining the (two) least probable symbols into a single symbol, which will then be used in the next source reduction stage. Figure 1 shows an example of consecutive source reductions. The original source symbols appear on the left-hand side, sorted in decreasing order by their probability of occurrence. In the first reduction, the two least probable symbols (a3 with prob. = 0.06 and a5 with prob. = 0.04) are combined into a composite symbol, whose probability is 0.06 + 0.04 = 0.1. This composite symbol and its probability are copied onto the first source reduction column at the proper slot (so as to enforce the requirement that the probabilities are sorted i ...
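The source-reduction process described in the abstract can be sketched in a few lines of Python: repeatedly merge the two least probable entries until a single composite remains, assigning a 0/1 prefix at each merge. This is a minimal illustration, not the article's own code; the symbol probabilities below are hypothetical, chosen only so that a3 = 0.06 and a5 = 0.04 match the example in the abstract.

```python
import heapq

def huffman_code(probs):
    """Build a Huffman code via repeated source reduction:
    the two least probable groups are merged until one remains."""
    # Heap entries: (probability, tiebreaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate single-symbol source
        return {next(iter(probs)): "0"}
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable group
        p2, _, c2 = heapq.heappop(heap)  # second least probable group
        # Each merge prepends one bit to every codeword in the two groups
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        count += 1
        heapq.heappush(heap, (p1 + p2, count, merged))
    return heap[0][2]

# Hypothetical probabilities loosely modeled on the article's Figure 1
probs = {"a1": 0.4, "a2": 0.3, "a3": 0.06, "a4": 0.1, "a5": 0.04, "a6": 0.1}
code = huffman_code(probs)
```

With these probabilities the first merge combines a5 (0.04) and a3 (0.06) into a composite of probability 0.1, exactly as the abstract describes; the most probable symbol a1 ends up with the shortest codeword, and the resulting code is prefix-free.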

@incollection{furht_huffman_2008,
  title     = {Huffman {Coding}},
  copyright = {©2008 Springer-Verlag},
  isbn      = {978-0-387-74724-8 978-0-387-78414-4},
  url       = {http://link.springer.com/referenceworkentry/10.1007/978-0-387-78414-4_338},
  abstract  = {The most popular entropy-based encoding technique is the Huffman code [1]. It provides the least amount of information units (bits) per source symbol. This short article describes how it works. The first step in the Huffman algorithm consists in creating a series of source reductions, by sorting the probabilities of each symbol and combining the (two) least probable symbols into a single symbol, which will then be used in the next source reduction stage. Figure 1 shows an example of consecutive source reductions. The original source symbols appear on the left-hand side, sorted in decreasing order by their probability of occurrence. In the first reduction, the two least probable symbols (a3 with prob. = 0.06 and a5 with prob. = 0.04) are combined into a composite symbol, whose probability is 0.06 + 0.04 = 0.1. This composite symbol and its probability are copied onto the first source reduction column at the proper slot (so as to enforce the requirement that the probabilities are sorted i ...},
  language  = {en},
  urldate   = {2016-05-03},
  booktitle = {Encyclopedia of {Multimedia}},
  publisher = {Springer US},
  editor    = {Furht, Borko},
  year      = {2008},
  pages     = {288--289}
}
