Adaptive Object Tracking by Learning Background Context. Borji, A., Frintrop, S., Sihite, D. N., & Itti, L. In Proc. IEEE CVPR 2012, Egocentric Vision workshop, Providence, Rhode Island, pages 1-8, Jun 2012.

Abstract: One challenge when tracking objects is to adapt the object representation to the scene context, to account for changes in illumination, coloring, scaling, etc. Here, we present a solution based on our earlier approach to object tracking using particle filters and component-based descriptors. We extend the approach to handle changing backgrounds by adding a quick training phase, involving user interaction, at the beginning of an image sequence. During this phase, several background clusters are learned along with object representations for those clusters. Then, for the rest of the sequence, the best-fitting background cluster is determined for each frame and the corresponding object representation is used for tracking. Experiments show that a particle filter adapting to background changes can efficiently track objects and persons in natural scenes, and yields higher tracking accuracy than the basic approach. Additionally, using an object tracker to follow the main character in video games, we explained a larger share of eye fixations than other saliency models in terms of NSS score, suggesting that tracking is an important top-down attention component.
@inproceedings{ Borji_etal12ego,
author = {A. Borji and S. Frintrop and D. N. Sihite and L. Itti},
title = {Adaptive Object Tracking by Learning Background Context},
booktitle = {Proc. IEEE CVPR 2012, Egocentric Vision workshop, Providence, Rhode Island},
abstract = {One challenge when tracking objects is to adapt the object representation to the scene context, to
account for changes in illumination, coloring, scaling, etc. Here, we present a solution
based on our earlier approach to object tracking using particle filters and component-based
descriptors. We extend the approach to handle changing backgrounds by adding a quick training phase,
involving user interaction, at the beginning of an image sequence. During this phase, several background
clusters are learned along with object representations for those clusters. Then, for the rest of the
sequence, the best-fitting background cluster is determined for each frame and the corresponding object
representation is used for tracking. Experiments show that a particle filter adapting to background changes
can efficiently track objects and persons in natural scenes, and yields higher tracking accuracy
than the basic approach. Additionally, using an object tracker to follow the main character in video
games, we explained a larger share of eye fixations than other saliency models in
terms of NSS score, suggesting that tracking is an important top-down attention component.},
pages = {1-8},
month = {Jun},
year = {2012},
review = {full/wkshp},
type = {bu;td;mod;cv},
file = {http://ilab.usc.edu/publications/doc/Borji_etal12ego.pdf}
}
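The per-frame selection step described in the abstract — pick the best-fitting background cluster, then use the object representation learned for that cluster — can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: it assumes coarse color histograms as background descriptors and one cluster center per training frame; the actual system uses particle filters with component-based descriptors. All function names here are hypothetical.

```python
import numpy as np

def frame_histogram(frame, bins=8):
    # Coarse intensity histogram as a stand-in background descriptor
    # (the paper's descriptors are richer; this is only illustrative).
    hist, _ = np.histogram(frame, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def learn_background_clusters(training_frames, object_templates):
    # Training phase: one descriptor per annotated training frame serves as
    # a cluster center, paired with the object representation learned there.
    centers = np.stack([frame_histogram(f) for f in training_frames])
    return centers, list(object_templates)

def select_representation(frame, centers, templates):
    # Tracking phase: find the nearest background cluster for the current
    # frame and return the object representation associated with it.
    h = frame_histogram(frame)
    distances = np.linalg.norm(centers - h, axis=1)
    return templates[int(np.argmin(distances))]
```

Usage follows the two-phase structure of the paper: call `learn_background_clusters` once on the user-annotated opening frames, then call `select_representation` per frame before running the tracker with the returned representation.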