Inferring eye position from populations of lateral intraparietal neurons. Graf, A. B. & Andersen, R. A. eLife, 2014.
@article{graf2014inferring,
abstract = {Understanding how the brain computes eye position is essential to unraveling high-level visual functions such as eye movement planning, coordinate transformations and stability of spatial awareness. The lateral intraparietal area (LIP) is essential for this process. However, despite decades of research, its contribution to the eye position signal remains controversial. LIP neurons have recently been reported to inaccurately represent eye position during a saccadic eye movement, and to be too slow to support a role in high-level visual functions. We addressed this issue by predicting eye position and saccade direction from the responses of populations of LIP neurons. We found that both signals were accurately predicted before, during and after a saccade. Also, the dynamics of these signals support their contribution to visual functions. These findings provide a principled understanding of the coding of information in populations of neurons within an important node of the cortical network for visual-motor behaviors. DOI: http://dx.doi.org/10.7554/eLife.02813.001
Whenever we reach towards an object, we automatically use visual information to guide our movements and make any adjustments required. Visual feedback helps us to learn new motor skills, and ensures that our view of the world remains stable despite the fact that every eye movement causes the image on the retina to shift dramatically. However, such visual feedback is only useful because it can be compared with information on the position of the eyes, which is stored by the brain at all times.
It is thought that one important structure where information on eye position is stored is an area towards the back of the brain called the lateral intraparietal cortex, but the exact contribution of this region has long been controversial. Graf and Andersen have now clarified the role of this area by studying monkeys as they performed an eye-movement task.
Rhesus monkeys were trained to fixate on a particular location on a grid. A visual target was then flashed briefly in another location and, after a short delay, the monkeys moved their eyes to the new location to earn a reward. As the monkeys performed the task, a group of electrodes recorded signals from multiple neurons within the lateral intraparietal cortex. This meant that Graf and Andersen could compare the responses of populations of neurons before, during, and after the movement.
By studying neural populations, it was possible to accurately predict the direction in which a monkey was about to move his eyes, as well as the initial and final eye positions. After a movement had occurred, the neurons also signaled the direction in which the monkey's eyes had been facing beforehand. Thus, the lateral intraparietal area stores both retrospective and forward-looking information about eye position and movement.
The work of Graf and Andersen confirms that the LIP has a central role in eye movement functions, and also contributes more generally to our understanding of how behaviors are encoded at the level of populations of neurons. Such information could ultimately aid the development of neural prostheses to help patients with paralysis resulting from injury or neurodegeneration. DOI: http://dx.doi.org/10.7554/eLife.02813.002},
author = {Graf, Arnulf B.A. and Andersen, Richard A.},
doi = {10.7554/eLife.02813},
issn = {2050-084X},
journal = {eLife},
pages = {e02813},
pmid = {24844707},
title = {{Inferring eye position from populations of lateral intraparietal neurons}},
volume = {3},
year = {2014}
}
