Dynamic analyses of information encoding in neural ensembles. Barbieri, R., Frank, L. M., Nguyen, D. P., Quirk, M. C., Solo, V., Wilson, M. A., & Brown, E. N. Neural Comput., 16(2):277--307, February 2004.
Neural spike train decoding algorithms and techniques to compute Shannon mutual information are important methods for analyzing how neural systems represent biological signals. Decoding algorithms are also one of several strategies being used to design controls for brain-machine interfaces. Developing optimal strategies to design decoding algorithms and compute mutual information are therefore important problems in computational neuroscience. We present a general recursive filter decoding algorithm based on a point process model of individual neuron spiking activity and a linear stochastic state-space model of the biological signal. We derive from the algorithm new instantaneous estimates of the entropy, entropy rate, and the mutual information between the signal and the ensemble spiking activity. We assess the accuracy of the algorithm by computing, along with the decoding error, the true coverage probability of the approximate 0.95 confidence regions for the individual signal estimates. We illustrate the new algorithm by reanalyzing the position and ensemble neural spiking activity of CA1 hippocampal neurons from two rats foraging in an open circular environment. We compare the performance of this algorithm with a linear filter constructed by the widely used reverse correlation method. The median decoding error for Animal 1 (2) during 10 minutes of open foraging was 5.9 (5.5) cm, the median entropy was 6.9 (7.0) bits, the median information was 9.4 (9.4) bits, and the true coverage probability for 0.95 confidence regions was 0.67 (0.75) using 34 (32) neurons. These findings improve significantly on our previous results and suggest an integrated approach to dynamically reading neural codes, measuring their properties, and quantifying the accuracy with which encoded information is extracted.
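The decoder described in the abstract combines a point process observation model of each neuron's spiking with a linear stochastic state-space model of the signal. As a rough illustration of that idea (not the authors' implementation), the sketch below runs a one-dimensional point-process recursive filter with a Gaussian posterior approximation; the Gaussian place fields, random-walk state model, and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Simulated encoding model (illustrative stand-in for place cells) ---
# Each neuron fires as an inhomogeneous Poisson process whose rate is a
# Gaussian "place field" of the 1-D position x:
#   lambda_i(x) = r_max * exp(-(x - mu_i)^2 / (2 * sig^2))
n_neurons = 30
mu = np.linspace(0.0, 10.0, n_neurons)  # field centers tile the track
sig = 0.8                               # field width (position units)
r_max = 15.0                            # peak rate, Hz

dt = 0.01                               # 10 ms bins
T = 1000                                # 10 s of data
t = np.arange(T) * dt
x_true = 5.0 + 4.0 * np.sin(2 * np.pi * t / 10.0)  # assumed trajectory

rates = r_max * np.exp(-(x_true[:, None] - mu[None, :]) ** 2 / (2 * sig**2))
spikes = rng.poisson(rates * dt)        # (T, n_neurons) spike counts

# --- Point-process recursive filter (Gaussian approximation) ---
# Prediction step assumes a random-walk state model x_t = x_{t-1} + w_t,
# w_t ~ N(0, Q); the update folds in each neuron's spike count through the
# gradient and curvature of log lambda_i at the predicted state.
Q = 1e-3
x_post, W_post = 5.0, 1.0               # initial posterior mean / variance
x_est = np.empty(T)
for k in range(T):
    # predict
    x_pred, W_pred = x_post, W_post + Q
    # rates and derivatives of the log-rate at the predicted state
    lam = r_max * np.exp(-(x_pred - mu) ** 2 / (2 * sig**2))
    dlog = -(x_pred - mu) / sig**2      # d/dx log lambda_i
    d2log = -1.0 / sig**2               # d^2/dx^2 log lambda_i
    n = spikes[k]
    # posterior precision and mean (Gaussian approximation)
    W_inv = 1.0 / W_pred + np.sum(dlog**2 * lam * dt - (n - lam * dt) * d2log)
    W_post = 1.0 / max(W_inv, 1e-6)     # guard against a non-positive precision
    x_post = x_pred + W_post * np.sum(dlog * (n - lam * dt))
    x_est[k] = x_post

rmse = np.sqrt(np.mean((x_est[100:] - x_true[100:]) ** 2))
print(f"decoding RMSE after burn-in: {rmse:.2f}")
```

With an ensemble of well-tuned simulated neurons, the filter tracks the trajectory to within a fraction of a place-field width, which is the same qualitative behavior the paper quantifies with decoding error and coverage probability on real CA1 data.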
@article{Barb_2004_277,
  abstract = {Neural spike train decoding algorithms and techniques to compute Shannon
	mutual information are important methods for analyzing how neural
	systems represent biological signals. Decoding algorithms are also
	one of several strategies being used to design controls for brain-machine
	interfaces. Developing optimal strategies to design decoding algorithms
	and compute mutual information are therefore important problems in
	computational neuroscience. We present a general recursive filter
	decoding algorithm based on a point process model of individual neuron
	spiking activity and a linear stochastic state-space model of the
	biological signal. We derive from the algorithm new instantaneous
	estimates of the entropy, entropy rate, and the mutual information
	between the signal and the ensemble spiking activity. We assess the
	accuracy of the algorithm by computing, along with the decoding error,
	the true coverage probability of the approximate 0.95 confidence
	regions for the individual signal estimates. We illustrate the new
	algorithm by reanalyzing the position and ensemble neural spiking
	activity of CA1 hippocampal neurons from two rats foraging in an
	open circular environment. We compare the performance of this algorithm
	with a linear filter constructed by the widely used reverse correlation
	method. The median decoding error for Animal 1 (2) during 10 minutes
	of open foraging was 5.9 (5.5) cm, the median entropy was 6.9 (7.0)
	bits, the median information was 9.4 (9.4) bits, and the true coverage
	probability for 0.95 confidence regions was 0.67 (0.75) using 34
	(32) neurons. These findings improve significantly on our previous
	results and suggest an integrated approach to dynamically reading
	neural codes, measuring their properties, and quantifying the accuracy
	with which encoded information is extracted.},
  added-at = {2009-06-03T11:20:58.000+0200},
  author = {Barbieri, Riccardo and Frank, Loren M and Nguyen, David P and Quirk, Michael C and Solo, Victor and Wilson, Matthew A and Brown, Emery N},
  biburl = {http://www.bibsonomy.org/bibtex/255b36022e10a07e42f0339963a3ba074/hake},
  description = {The whole bibliography file I use.},
  doi = {10.1162/089976604322742038},
  file = {Barb_2004_277.pdf:Barb_2004_277.pdf:PDF},
  interhash = {068c5ef6cfc1453132764740b244f6a5},
  intrahash = {55b36022e10a07e42f0339963a3ba074},
  journal = {Neural Comput.},
  keywords = {Action Potentials, Algorithms, Animals, Comparative Study, Exploratory Behavior, Hippocampus, Nerve Net, Neural Networks (Computer), Neurons, Rats, Rats Long-Evans, Reaction Time, Reproducibility of Results, Research Support Non-U.S. Gov't, Research Support U.S. Gov't Non-P.H.S., Research Support U.S. Gov't P.H.S., Signal Processing Computer-Assisted, Stochastic Processes, Synaptic Transmission},
  month = {February},
  number = {2},
  pages = {277--307},
  pmid = {15006097},
  timestamp = {2006.07.05},
  title = {Dynamic analyses of information encoding in neural ensembles.},
  url = {http://dx.doi.org/10.1162/089976604322742038},
  volume = {16},
  year = {2004}
}