Visual Search for an Object in a 3D Environment using a Mobile Robot


January 28, 2009


John K. Tsotsos


York University, Toronto, Canada


Consider the problem of using a mobile robot to visually find an object in a
mostly unknown space. Clearly, no practical system can examine all possible
views and images. Visual attention is a complex phenomenon; we view it as
a mechanism that optimizes the search processes inherent in vision. Here, we describe
a particular example of a practical robotic vision system that employs some
of these attentive processes. We cast the task as an optimization problem:
maximize the probability of finding the target given a fixed cost limit on the
total number of robotic actions required to find the visual target. Because this
problem is inherently intractable, we present an approximate solution and investigate
its performance and properties. We conclude that our approach suffices to solve
this problem and has additional desirable empirical characteristics.
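To make the optimization concrete, the following is a minimal illustrative sketch, not the paper's actual algorithm: it assumes each candidate sensing action independently detects the target with a known probability, and greedily selects actions under a fixed budget. The action names and probabilities are hypothetical.

```python
# Illustrative sketch (an assumption, not the method described above):
# greedy selection of sensing actions under a fixed action budget,
# assuming action `a` independently detects the target with probability p[a].

def greedy_search_plan(p, budget):
    """Choose up to `budget` actions, each time picking the action that
    most increases the overall probability of having found the target."""
    remaining = dict(p)   # candidate actions -> detection probability
    p_missed = 1.0        # probability the target is still unfound
    plan = []
    for _ in range(budget):
        if not remaining:
            break
        # Under independence, the action with the highest detection
        # probability yields the largest reduction in p_missed.
        best = max(remaining, key=remaining.get)
        p_missed *= 1.0 - remaining.pop(best)
        plan.append(best)
    return plan, 1.0 - p_missed  # chosen actions, probability of success

# Hypothetical candidate viewpoints and their detection probabilities.
actions = {"view_A": 0.6, "view_B": 0.3, "view_C": 0.5}
plan, p_find = greedy_search_plan(actions, budget=2)
```

With a budget of two actions this sketch selects the two most promising views; the true problem is harder because action costs, viewpoint geometry, and dependencies between views must all be accounted for, which is what motivates the approximate solution studied in the paper.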


John K. Tsotsos

John K. Tsotsos was born in Windsor, ON, Canada. He received the B.S., M.S., and Ph.D. degrees in computer science, all from the University of Toronto, Toronto, ON, Canada, in 1974, 1976, and 1980, respectively. From 1980 to 1999 he was a Professor in the Department of Computer Science and the Department of Medicine, University of Toronto. He is currently Professor of Computer Science at York University, Toronto, and Director of the York Center for Vision Research. He has served on numerous conference committees and the editorial boards of Computer Vision and Image Understanding, Image and Vision Computing Journal, Computational Intelligence, and AI & Medicine. His research focuses on biologically plausible models of visual attention, the development of visually guided robots to assist physically disabled children, and perceptually guided robot control mechanisms. Dr. Tsotsos was the General Chair of the 7th IEEE International Conference on Computer Vision, Corfu, Greece, in 1999. More information is available at