A Visual Sensor for Domestic Service Robots
DOI: https://doi.org/10.12962/j25796216.v2.i1.37

Abstract
In this study, we present a visual sensor for domestic service robots that captures both color and three-dimensional information in real time by calibrating a time-of-flight (TOF) camera with two CCD cameras. The problem of occlusion is handled by the proposed occlusion detection algorithm. Because the sensor uses two CCD cameras, color information missing at pixels occluded in one camera is compensated by the other camera. We conduct several evaluations to validate the proposed sensor, including an object recognition task in occluded scenes. The results confirm the effectiveness of the proposed visual sensor.

Keywords: Time-of-flight camera, visual sensor, camera calibration, occlusion detection, object recognition.
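To make the occlusion handling described above concrete, the sketch below illustrates one plausible scheme, not the authors' implementation: each TOF point is projected into a CCD image, a z-buffer test flags points hidden behind closer geometry in that view, and points occluded in one camera take their color from the other. It is written in Python with NumPy; the function names, camera parameters, and the depth tolerance are illustrative assumptions.

# Hedged sketch (not the paper's code): coloring TOF depth points with two CCD
# cameras, using a simple z-buffer test to detect per-camera occlusions.
import numpy as np

def project(points_cam, K):
    """Pinhole projection of Nx3 camera-frame points with intrinsics K."""
    uv = (K @ points_cam.T).T            # Nx3 homogeneous image points
    return uv[:, :2] / uv[:, 2:3]        # Nx2 pixel coordinates

def occlusion_mask(points_world, R, t, K, image_size, depth_eps=0.02):
    """Return a boolean mask that is True where a 3D point is occluded in this CCD view.

    A point counts as occluded if another point projects to the same pixel but lies
    closer to the camera (z-buffer test with tolerance depth_eps, in meters).
    """
    h, w = image_size
    pts_cam = (R @ points_world.T).T + t               # world -> CCD camera frame
    uv = np.round(project(pts_cam, K)).astype(int)
    z = pts_cam[:, 2]

    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h) & (z > 0)
    zbuf = np.full((h, w), np.inf)
    for (u, v), zi in zip(uv[inside], z[inside]):
        zbuf[v, u] = min(zbuf[v, u], zi)               # keep the nearest depth per pixel

    occluded = np.ones(len(points_world), dtype=bool)  # out-of-view points treated as occluded
    idx = np.where(inside)[0]
    occluded[idx] = z[idx] > zbuf[uv[idx, 1], uv[idx, 0]] + depth_eps
    return occluded, uv

def colorize(points_world, cams):
    """Assign each TOF point a color from the first CCD camera that sees it unoccluded."""
    colors = np.zeros((len(points_world), 3), dtype=np.uint8)
    assigned = np.zeros(len(points_world), dtype=bool)
    for R, t, K, image in cams:                        # e.g. left CCD first, then right CCD
        occ, uv = occlusion_mask(points_world, R, t, K, image.shape[:2])
        visible = ~occ & ~assigned
        colors[visible] = image[uv[visible, 1], uv[visible, 0]]
        assigned |= visible
    return colors, assigned

In this sketch, a point left unassigned by both cameras corresponds to a pixel whose color cannot be recovered; with two CCD cameras placed on either side of the TOF camera, such cases become rare, which is the compensation effect the abstract refers to.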


