Decay Makes Supervised Predictive Coding Generative. Wei Sun. Master's thesis, University of Waterloo.
@mastersthesis{wei_sun_decay_nodate,
title = {Decay {Makes} {Supervised} {Predictive} {Coding} {Generative}},
url = {https://uwspace.uwaterloo.ca/bitstream/handle/10012/16141/Sun_Wei.pdf?sequence=3&isAllowed=y},
abstract = {Predictive Coding is a hierarchical model of neural computation that approximates backpropagation using only local computations and local learning rules. An important aspect of Predictive Coding is the presence of feedback connections between layers. These feedback connections allow Predictive Coding networks to potentially be generative as well as discriminative. However, Predictive Coding networks trained on supervised classification tasks cannot generate accurate input samples close to the training inputs from the class vectors alone.
This problem arises from the fact that generating inputs from classes requires solving an under determined system, which contains an infinite number of solutions. Generating the correct inputs involves reaching a specific solution in that infinite solution space. But by imposing a minimum-norm constraint on the state nodes and the synaptic weights of aPredictive Coding network, the solution space collapses to a unique solution that is close to the training inputs. This minimum-norm constraint can be enforced by adding decay to the Predictive Coding equations.
Decay is implemented in the form of weight decay and activity decay. Analyses done on linear Predictive Coding networks show that applying weight decay during training helps the network learn weights that can generate the correct input samples from the class vectors, while applying activity decay during input generation helps to lower the variance in the network’s generated samples. Additionally, weight decay regularizes the values of the network weights, avoiding extreme values, and improves the rate at which the network converges to equilibrium by regularizing the eigenvalues of the Jacobian at the equilibrium.
Experiments on the MNIST dataset of handwritten digits provide evidence that decay makes Predictive Coding networks generative even when the network contains deep layers and uses nonlinear tanh activations. A Predictive Coding network equipped with weight and activity decay successfully generates images resembling MNIST digits from the class vectors alone.},
language = {en},
author = {Sun, Wei},
school = {University of Waterloo},
keywords = {computer science, neural networks},
}
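The abstract's core argument, that decay collapses an underdetermined generation problem to the minimum-norm solution, can be illustrated with a small linear example. The sketch below is not code from the thesis; the shapes, constants, and the `descend` helper are assumptions chosen to mirror the setting (10 classes, 784-dimensional MNIST-like inputs). It shows that gradient descent with an L2 activity-decay penalty lands near the pseudoinverse (minimum-norm) solution, while descent without decay keeps whatever null-space component the initialization happened to have.

```python
import numpy as np

# Generating an input x from a class vector y through a wide linear map W is
# underdetermined: W x = y has infinitely many solutions. An L2 penalty on x
# (the analogue of activity decay) biases gradient descent toward the unique
# minimum-norm solution. Shapes below are hypothetical MNIST-like choices.
rng = np.random.default_rng(0)
W = rng.standard_normal((10, 784)) / np.sqrt(784)  # class <- input map
y = np.eye(10)[3]                                  # one-hot class vector

def descend(lam, steps=20000, lr=0.1):
    """Gradient descent on ||W x - y||^2 / 2 + lam * ||x||^2 / 2."""
    x = rng.standard_normal(784)  # random start: large null-space component
    for _ in range(steps):
        x -= lr * (W.T @ (W @ x - y) + lam * x)
    return x

# Minimum-norm solution via the pseudoinverse: x* = W^T (W W^T)^{-1} y.
x_star = W.T @ np.linalg.solve(W @ W.T, y)

print(np.linalg.norm(descend(lam=1e-2) - x_star))  # ~1e-2: decay recovers x*
print(np.linalg.norm(descend(lam=0.0) - x_star))   # ~28: init's null-space part survives
```

This is the same selection mechanism the thesis formalizes for the state nodes and synaptic weights of a linear Predictive Coding network: without decay, which solution the network reaches depends on initialization; with decay, the dynamics single out the minimum-norm solution close to the training inputs.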
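As for where decay enters the Predictive Coding equations themselves, the sketch below writes out one supervised training step in a common Whittington-and-Bogacz-style formulation with decay terms added. The layer conventions, step sizes, and the `pc_settle_and_learn` name are assumptions for illustration, not the thesis's exact equations: activity decay (`gamma`) appears in the state-relaxation dynamics, and weight decay (`lam`) in the local Hebbian weight update.

```python
import numpy as np

def pc_settle_and_learn(x_in, y, Ws, f, df, T=50, dt=0.1,
                        eta=1e-3, gamma=0.01, lam=1e-4):
    """One supervised Predictive Coding step with decay (conventions assumed).
    Layer l+1 is predicted from layer l: eps[l] = xs[l+1] - Ws[l] @ f(xs[l]).
    gamma: activity decay on state nodes; lam: weight decay on synapses."""
    L = len(Ws)
    xs = [np.asarray(x_in, dtype=float)]           # clamp input at the bottom
    xs += [np.zeros(W.shape[0]) for W in Ws]
    xs[-1] = np.asarray(y, dtype=float)            # clamp class vector on top

    # Inference: relax hidden states toward equilibrium with activity decay.
    for _ in range(T):
        eps = [xs[l + 1] - Ws[l] @ f(xs[l]) for l in range(L)]
        for l in range(1, L):                      # input and output stay clamped
            dx = -eps[l - 1] + df(xs[l]) * (Ws[l].T @ eps[l]) - gamma * xs[l]
            xs[l] = xs[l] + dt * dx

    # Learning: local Hebbian update on each weight matrix, with weight decay.
    eps = [xs[l + 1] - Ws[l] @ f(xs[l]) for l in range(L)]
    for l in range(L):
        Ws[l] += eta * (np.outer(eps[l], f(xs[l])) - lam * Ws[l])
    return Ws

# Example usage with tanh activations, as in the thesis's MNIST experiments
# (layer widths here are hypothetical):
rng = np.random.default_rng(0)
sizes = [784, 256, 64, 10]
Ws = [rng.standard_normal((sizes[i + 1], sizes[i])) * 0.05
      for i in range(len(sizes) - 1)]
f, df = np.tanh, lambda v: 1.0 - np.tanh(v) ** 2
Ws = pc_settle_and_learn(rng.random(784), np.eye(10)[3], Ws, f, df)
```

For generation, the same relaxation would be run with the class vector clamped at the top and the input layer left free to settle; the activity-decay term then plays the variance-reducing role the abstract describes.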
{"_id":"ixoTXSKcHYniAzfRg","bibbaseid":"weisun-decaymakessupervisedpredictivecodinggenerative","authorIDs":[],"author_short":["Wei Sun"],"bibdata":{"bibtype":"mastersthesis","type":"mastersthesis","title":"Decay Makes Supervised Predictive Coding Generative","url":"https://uwspace.uwaterloo.ca/bitstream/handle/10012/16141/Sun_Wei.pdf?sequence=3&isAllowed=y","abstract":"Predictive Coding is a hierarchical model of neural computation that approximates backpropagation using only local computations and local learning rules. An important aspect of Predictive Coding is the presence of feedback connections between layers. These feedback connections allow Predictive Coding networks to potentially be generative as well as discriminative. However, Predictive Coding networks trained on supervised classification tasks cannot generate accurate input samples close to the training inputs from the class vectors alone. This problem arises from the fact that generating inputs from classes requires solving an under determined system, which contains an infinite number of solutions. Generating the correct inputs involves reaching a specific solution in that infinite solution space. But by imposing a minimum-norm constraint on the state nodes and the synaptic weights of aPredictive Coding network, the solution space collapses to a unique solution that is close to the training inputs. This minimum-norm constraint can be enforced by adding decay to the Predictive Coding equations. Decay is implemented in the form of weight decay and activity decay. Analyses done on linear Predictive Coding networks show that applying weight decay during training helps the network learn weights that can generate the correct input samples from the class vectors, while applying activity decay during input generation helps to lower the variance in the network’s generated samples. Additionally, weight decay regularizes the values of the network weights, avoiding extreme values, and improves the rate at which the network converges to equilibrium by regularizing the eigenvalues of the Jacobian at the equilibrium. Experiments on the MNIST dataset of handwritten digits provide evidence that decay makes Predictive Coding networks generative even when the network contains deep layers and uses nonlinear tanh activations. A Predictive Coding network equipped with weight and activity decay successfully generates images resembling MNIST digits from the class vectors alone.","language":"en","author":[{"firstnames":[],"propositions":[],"lastnames":["Wei Sun"],"suffixes":[]}],"keywords":"computer science, neural networks","bibtex":"@mastersthesis{wei_sun_decay_nodate,\n\ttitle = {Decay {Makes} {Supervised} {Predictive} {Coding} {Generative}},\n\turl = {https://uwspace.uwaterloo.ca/bitstream/handle/10012/16141/Sun_Wei.pdf?sequence=3&isAllowed=y},\n\tabstract = {Predictive Coding is a hierarchical model of neural computation that approximates backpropagation using only local computations and local learning rules. An important aspect of Predictive Coding is the presence of feedback connections between layers. These feedback connections allow Predictive Coding networks to potentially be generative as well as discriminative. 
However, Predictive Coding networks trained on supervised classification tasks cannot generate accurate input samples close to the training inputs from the class vectors alone.\n\nThis problem arises from the fact that generating inputs from classes requires solving an under determined system, which contains an infinite number of solutions. Generating the correct inputs involves reaching a specific solution in that infinite solution space. But by imposing a minimum-norm constraint on the state nodes and the synaptic weights of aPredictive Coding network, the solution space collapses to a unique solution that is close to the training inputs. This minimum-norm constraint can be enforced by adding decay to the Predictive Coding equations.\n\nDecay is implemented in the form of weight decay and activity decay. Analyses done on linear Predictive Coding networks show that applying weight decay during training helps the network learn weights that can generate the correct input samples from the class vectors, while applying activity decay during input generation helps to lower the variance in the network’s generated samples. Additionally, weight decay regularizes the values of the network weights, avoiding extreme values, and improves the rate at which the network converges to equilibrium by regularizing the eigenvalues of the Jacobian at the equilibrium.\n\nExperiments on the MNIST dataset of handwritten digits provide evidence that decay makes Predictive Coding networks generative even when the network contains deep layers and uses nonlinear tanh activations. A Predictive Coding network equipped with weight and activity decay successfully generates images resembling MNIST digits from the class vectors alone.},\n\tlanguage = {en},\n\tauthor = {{Wei Sun}},\n\tkeywords = {computer science, neural networks},\n}\n\n\n\n","author_short":["Wei Sun"],"key":"wei_sun_decay_nodate","id":"wei_sun_decay_nodate","bibbaseid":"weisun-decaymakessupervisedpredictivecodinggenerative","role":"author","urls":{"Paper":"https://uwspace.uwaterloo.ca/bitstream/handle/10012/16141/Sun_Wei.pdf?sequence=3&isAllowed=y"},"keyword":["computer science","neural networks"],"metadata":{"authorlinks":{}},"downloads":0},"bibtype":"mastersthesis","biburl":"https://bibbase.org/zotero-group/nicoguaro/525293","creationDate":"2020-09-27T15:53:39.381Z","downloads":0,"keywords":["computer science","neural networks"],"search_terms":["decay","makes","supervised","predictive","coding","generative","wei sun"],"title":"Decay Makes Supervised Predictive Coding Generative","year":null,"dataSources":["YtBDXPDiQEyhyEDZC","fhHfrQgj3AaGp7e9E","qzbMjEJf5d9Lk78vE","45tA9RFoXA9XeH4MM","MeSgs2KDKZo3bEbxH","nSXCrcahhCNfzvXEY","ecatNAsyr4f2iQyGq","tpWeaaCgFjPTYCjg3"]}