Publication

Vision-Based Distance and Position Estimation of Nearby Objects for Mobile Spatial Interaction

Citation:

C. Holzmann, M. Hochgatterer, "Vision-Based Distance and Position Estimation of Nearby Objects for Mobile Spatial Interaction", Proceedings of the 16th International Conference on Intelligent User Interfaces (IUI 2011), Stanford University, Palo Alto, USA, 2011

Abstract:

New mobile phone technologies are enablers for the emerging field of mobile spatial interaction, which refers to the direct access and manipulation of spatially related information and services. Typical applications include the visualization of information about historical buildings or the discovery and selection of surrounding devices, simply by pointing at the real-world objects of interest. However, a major drawback of existing approaches is that they require either an augmentation of the objects or prior knowledge about the environment in order to determine which object the user is actually aiming at. We address this issue by estimating the distance and position of arbitrary objects within a mobile phone's line of sight, based solely on the information provided by its on-board sensors. This new approach uses stereo vision to estimate the distance to nearby objects, inertial sensors to measure the displacement of the camera between successive images, and GPS together with a digital compass to obtain the phone's absolute position and orientation. In this paper, we focus on the vision-based estimation of distances and present the results of an experiment which demonstrates its accuracy and performance.
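The distance estimation described above rests on standard stereo triangulation: two images taken from positions a known baseline apart (here derived from the inertial sensors) yield a pixel disparity for each object, from which depth follows. The sketch below illustrates only this underlying relation, Z = f · B / d; the function name and parameter values are illustrative, not taken from the paper.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate the distance Z (in meters) to a point from its stereo disparity.

    Pinhole-camera triangulation: Z = f * B / d, where
      f = focal length in pixels,
      B = baseline between the two camera positions in meters
          (in the paper's approach, measured via inertial sensors),
      d = horizontal disparity of the point between the two images, in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Illustrative numbers: an 800 px focal length, a 10 cm baseline,
# and a 16 px disparity give an estimated distance of 5 m.
print(depth_from_disparity(800.0, 0.10, 16.0))
```

Note the trade-off this formula implies: for a fixed baseline, disparity (and thus depth resolution) shrinks with distance, which is why the approach targets *nearby* objects.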