Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention. Itti, L., Dhavale, N., & Pighin, F. In Bosacchi, B., Fogel, D. B., & Bezdek, J. C., editors, Proc. SPIE 48th Annual International Symposium on Optical Science and Technology, volume 5200, pages 64-78, Bellingham, WA, Aug 2003. SPIE Press.

Abstract: We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up (image-based) attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving Rhesus monkeys. The system is successful at autonomously saccading towards and tracking salient targets in a variety of video clips, including synthetic stimuli, real outdoor scenes, and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.
@inproceedings{Itti_etal03spienn,
author = {L. Itti and N. Dhavale and F. Pighin},
title = {Realistic Avatar Eye and Head Animation Using a
Neurobiological Model of Visual Attention},
abstract = {We describe a neurobiological model of visual attention and
eye/head movements in primates, and its application to the automatic
animation of a realistic virtual human head watching an unconstrained
variety of visual inputs. The bottom-up (image-based) attention model
is based on the known neurophysiology of visual processing along the
occipito-parietal pathway of the primate brain, while the eye/head
movement model is derived from recordings in freely behaving Rhesus
monkeys. The system is successful at autonomously saccading towards
and tracking salient targets in a variety of video clips, including
synthetic stimuli, real outdoor scenes and gaming console outputs.
The resulting virtual human eye/head animation yields realistic
rendering of the simulation results, both suggesting applicability of
this approach to avatar animation and reinforcing the plausibility of
the neural model.},
booktitle = {Proc. SPIE 48th Annual International Symposium on
Optical Science and Technology},
editor = {B. Bosacchi and D. B. Fogel and J. C. Bezdek},
volume = {5200},
publisher = {SPIE Press},
address = {Bellingham, WA},
type = {bu;mod;cv},
month = {Aug},
pages = {64--78},
year = {2003},
file = {http://iLab.usc.edu/publications/doc/Itti_etal03spienn.pdf},
review = {abs/conf}
}
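
The entry above carries no code, but as a rough illustration of the pipeline the abstract describes, the Python sketch below computes a toy bottom-up saliency map from a grayscale video frame via center-surround (difference-of-Gaussians) contrast, takes the most salient location as the next gaze target, and splits a gaze shift between eye and head. This is not the authors' iLab implementation: the channel choice, filter scales, the 20-degree head-recruitment threshold, and the eye/head gain are illustrative assumptions, whereas the paper's model uses multiple feature channels and an eye/head movement model derived from Rhesus monkey recordings.

# Hedged sketch: toy bottom-up saliency map plus an eye/head gaze split.
# Not the authors' iLab implementation; all constants below are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_map(frame_gray, center_sigmas=(2.0, 4.0, 8.0), surround_ratio=4.0):
    """Bottom-up saliency from center-surround (difference-of-Gaussians) contrast.

    frame_gray: 2-D float array in [0, 1], i.e. one grayscale video frame.
    """
    frame_gray = frame_gray.astype(float)
    sal = np.zeros_like(frame_gray)
    for sigma in center_sigmas:
        center = gaussian_filter(frame_gray, sigma)
        surround = gaussian_filter(frame_gray, sigma * surround_ratio)
        sal += np.abs(center - surround)          # center-surround contrast
    return sal / (sal.max() + 1e-12)              # normalize to [0, 1]

def next_gaze_target(sal):
    """Most salient pixel, standing in for a winner-take-all network."""
    return np.unravel_index(np.argmax(sal), sal.shape)  # (row, col)

def split_gaze_shift(gaze_deg, head_threshold_deg=20.0, head_gain=0.6):
    """Split a gaze shift into eye and head contributions.

    Small shifts are executed by the eyes alone; larger shifts recruit the
    head. The threshold and gain are assumed values for illustration, not
    the parameters the paper derives from monkey recordings.
    """
    head = max(0.0, head_gain * (gaze_deg - head_threshold_deg))
    head = min(head, gaze_deg)
    return gaze_deg - head, head                  # (eye_deg, head_deg)

if __name__ == "__main__":
    frame = np.random.rand(240, 320)              # stand-in for a video frame
    sal = saliency_map(frame)
    print("gaze target (row, col):", next_gaze_target(sal))
    print("eye/head split for a 35 deg shift:", split_gaze_shift(35.0))

In the full system, the selected target would drive a saccade or smooth-pursuit command for the virtual head's eyes, with the head contribution animated only for larger gaze shifts.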