Storing and Recalling Information for Vision Localization. Siagian, C. & Itti, L. In IEEE International Conference on Robotics and Automation (ICRA), Pasadena, California, May, 2008.
In implementing a vision localization system, a crucial issue is how to efficiently store and recall the necessary information, so that the robot not only localizes itself accurately but also does so in a timely manner. In the presented system, we discuss a strategy to minimize the amount of stored data by analyzing the strengths and weaknesses of several cooperating recognition modules and by using them through a prioritization scheme that orders the data entries from most likely to match to least likely. We validate the system through a series of experiments in three large-scale outdoor environments: a building complex (126x180ft. area, 3583 testing images), a vegetation-filled park (270x360ft. area, 6006 testing images), and an open-field area (450x585ft. area, 8823 testing images), each with its own set of challenges. Not only is the system able to localize in these environments (3.46ft., 6.55ft., and 12.96ft. average error, respectively), it does so while searching through only 7.35%, 3.50%, and 6.12% of all the stored information, respectively.
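
The abstract's central idea is a prioritization scheme that orders stored entries from most to least likely to match, so that localization can stop after examining only a small fraction of the database. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; names such as Entry, prioritized_match, match_score, and accept_threshold are assumptions introduced here for illustration.

    # Hypothetical sketch (not the authors' code): prioritized search over stored
    # landmark entries. Entries are sorted by an externally supplied prior score
    # (how likely each entry is to match the current view) and matching stops
    # early once a confident match is found, so only a fraction of the database
    # is ever examined.
    from dataclasses import dataclass
    from typing import Callable, List, Optional, Tuple

    @dataclass
    class Entry:
        entry_id: int
        descriptor: object   # stored landmark signature (placeholder type)
        prior: float         # estimated likelihood of matching; higher = try first

    def prioritized_match(
        query: object,
        database: List[Entry],
        match_score: Callable[[object, object], float],
        accept_threshold: float = 0.8,
    ) -> Tuple[Optional[Entry], float]:
        """Search entries from most to least likely; return (best entry, fraction searched)."""
        ordered = sorted(database, key=lambda e: e.prior, reverse=True)
        best, best_score = None, float("-inf")
        for i, entry in enumerate(ordered, start=1):
            score = match_score(query, entry.descriptor)
            if score > best_score:
                best, best_score = entry, score
            if best_score >= accept_threshold:   # early exit: confident match
                return best, i / len(ordered)
        return best, 1.0                         # exhausted the whole database

    # Toy usage: with the prior favoring entry 1, it is tried first and the
    # search stops after touching half the database.
    # db = [Entry(0, 0.9, prior=0.2), Entry(1, 0.5, prior=0.8)]
    # best, frac = prioritized_match(0.55, db, lambda q, d: 1.0 - abs(q - d))
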
@inproceedings{Siagian_Itti08icra,
  author = {C. Siagian and L. Itti},
  title = {Storing and Recalling Information for Vision Localization},
  abstract = {In implementing a vision localization system, a crucial
                  issue to consider is how to efficiently store and
                  recall the necessary information, so that the robot
                  is not only able to accurately localize itself, but
                  does so in a timely manner. In the presented system,
                  we discuss a strategy to minimize the amount of
                  stored data by analyzing the strengths and
                  weaknesses of several cooperating recognition
                  modules and by using them through a prioritization
                  scheme which orders the data entries from the most
                  likely to match to the least likely. We validate the
                  system through a series of experiments in three
                  large scale outdoor environments: a building complex
                  (126x180ft. area, 3583 testing images), a
                  vegetation-filled park (270x360ft.  area, 6006
                  testing images), and an open-field area
                  (450x585ft. area, 8823 testing images) - each with
                  its own set of challenges.  Not only is the system
                  able to localize in these environments (on average
                  3.46ft., 6.55ft., 12.96ft. of error, respectively),
                  it does so while searching through only 7.35%,
                  3.50%, and 6.12% of all the stored information,
                  respectively.},
  year = {2008},
  month = {May},
  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
  address = {Pasadena, California},
  type = {bu; sc},
  file = {http://ilab.usc.edu/publications/doc/Siagian_Itti08icra.pdf},
  review = {full/conf},
  if = {2008 acceptance rate: 43%}
}
