Building fast Bayesian computing machines out of intentionally stochastic, digital parts. Mansinghka, V. & Jonas, E. arXiv:1402.4914 [cs, stat], February 2014.
The brain interprets ambiguous sensory information faster and more reliably than modern computers, using neurons that are slower and less reliable than logic gates. But Bayesian inference, which underpins many computational models of perception and cognition, appears computationally challenging even given modern transistor speeds and energy budgets. The computational principles and structures needed to narrow this gap are unknown. Here we show how to build fast Bayesian computing machines using intentionally stochastic, digital parts, narrowing this efficiency gap by multiple orders of magnitude. We find that by connecting stochastic digital components according to simple mathematical rules, one can build massively parallel, low precision circuits that solve Bayesian inference problems and are compatible with the Poisson firing statistics of cortical neurons. We evaluate circuits for depth and motion perception, perceptual learning and causal reasoning, each performing inference over 10,000+ latent variables in real time - a 1,000x speed advantage over commodity microprocessors. These results suggest a new role for randomness in the engineering and reverse-engineering of intelligent computation.
@article{mansinghka_building_2014,
	title = {Building fast {Bayesian} computing machines out of intentionally stochastic, digital parts},
	url = {http://arxiv.org/abs/1402.4914},
	abstract = {The brain interprets ambiguous sensory information faster and more reliably than modern computers, using neurons that are slower and less reliable than logic gates. But Bayesian inference, which underpins many computational models of perception and cognition, appears computationally challenging even given modern transistor speeds and energy budgets. The computational principles and structures needed to narrow this gap are unknown. Here we show how to build fast Bayesian computing machines using intentionally stochastic, digital parts, narrowing this efficiency gap by multiple orders of magnitude. We find that by connecting stochastic digital components according to simple mathematical rules, one can build massively parallel, low precision circuits that solve Bayesian inference problems and are compatible with the Poisson firing statistics of cortical neurons. We evaluate circuits for depth and motion perception, perceptual learning and causal reasoning, each performing inference over 10,000+ latent variables in real time - a 1,000x speed advantage over commodity microprocessors. These results suggest a new role for randomness in the engineering and reverse-engineering of intelligent computation.},
	urldate = {2018-02-04},
	journal = {arXiv:1402.4914 [cs, stat]},
	author = {Mansinghka, Vikash and Jonas, Eric},
	month = feb,
	year = {2014},
	note = {arXiv: 1402.4914}
}
