Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention. Itti, L., Dhavale, N., & Pighin, F. In Bosacchi, B., Fogel, D. B., & Bezdek, J. C., editors, Proc. SPIE 48th Annual International Symposium on Optical Science and Technology, volume 5200, pages 64-78, Bellingham, WA, Aug, 2003. SPIE Press.
We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up (image-based) attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving Rhesus monkeys. The system is successful at autonomously saccading towards and tracking salient targets in a variety of video clips, including synthetic stimuli, real outdoors scenes and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.
@inproceedings{Itti_etal03spienn,
  author = {L. Itti and N. Dhavale and F. Pighin},
  title = {Realistic Avatar Eye and Head Animation Using a
Neurobiological Model of Visual Attention},
  abstract = {We describe a neurobiological model of visual attention and
eye/head movements in primates, and its application to the automatic
animation of a realistic virtual human head watching an unconstrained
variety of visual inputs. The bottom-up (image-based) attention model
is based on the known neurophysiology of visual processing along the
occipito-parietal pathway of the primate brain, while the eye/head
movement model is derived from recordings in freely behaving Rhesus
monkeys. The system is successful at autonomously saccading towards
and tracking salient targets in a variety of video clips, including
synthetic stimuli, real outdoors scenes and gaming console outputs.
The resulting virtual human eye/head animation yields realistic
rendering of the simulation results, both suggesting applicability of
this approach to avatar animation and reinforcing the plausibility of
the neural model.},
  booktitle = {Proc. SPIE 48th Annual International Symposium on
Optical Science and Technology},
  editor = {B. Bosacchi and D. B. Fogel and J. C. Bezdek},
  volume = {5200},
  publisher = {SPIE Press},
  address = {Bellingham, WA},
  type = {bu;mod;cv},
  month = {Aug},
  pages = {64--78},
  year = {2003},
  file = {http://iLab.usc.edu/publications/doc/Itti_etal03spienn.pdf},
  review = {abs/conf}
}
