Localization of furniture parts by integrating range and intensity data robust against depths with low signal-to-noise ratio

Pascal Meissner*, Sven R. Schmidt-Rohr, Martin Loesch, Rainer Jaekel, Ruediger Dillmann

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

In this article we present an approach for localizing planar parts of furniture in depth data from range cameras. It estimates both their six-degree-of-freedom poses and their dimensions. The system has been designed to enable robots to autonomously manipulate furniture. Range cameras are a promising sensor category for this application, but many of them deliver data with considerable noise and distortion, which complicates object detection with canonical methods for range data segmentation or feature extraction. In contrast, our approach overcomes these issues by combining concepts from 2D and 3D computer vision and by integrating intensity and range information in multiple steps of its processing chain. It can therefore be employed on range sensors with both low and high signal-to-noise ratios, and in particular on time-of-flight cameras. The concept can be adapted to various object shapes; as a proof of concept, it has been implemented for object parts whose shapes resemble ellipses. For this, a state-of-the-art ellipse detection method has been extended for our application. (C) 2012 Elsevier B.V. All rights reserved.
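To make the general idea concrete, the following Python/OpenCV sketch illustrates one way of combining a 2D ellipse detection in the intensity image with a 3D plane fit on the corresponding range pixels to obtain a pose and metric dimensions. It is not the authors' pipeline; the camera intrinsics, thresholds, and function names are illustrative assumptions.

import cv2
import numpy as np

def backproject(us, vs, zs, fx, fy, cx, cy):
    # Convert pixel coordinates plus depth (metres) to 3D camera-frame points.
    xs = (us - cx) * zs / fx
    ys = (vs - cy) * zs / fy
    return np.stack([xs, ys, zs], axis=1)

def localize_elliptic_parts(intensity, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    # 1) 2D step: find ellipse candidates in the intensity image,
    #    which is typically less noisy than the range image.
    edges = cv2.Canny(intensity, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    results = []
    for cnt in contours:
        if len(cnt) < 50:                # skip tiny contours (fitEllipse needs >= 5 points)
            continue
        ellipse = cv2.fitEllipse(cnt)    # ((u0, v0), (axis lengths in px), angle in deg)

        # 2) Collect valid depth pixels inside the fitted ellipse.
        mask = np.zeros(depth.shape, dtype=np.uint8)
        cv2.ellipse(mask, ellipse, 255, thickness=-1)
        vs, us = np.nonzero((mask > 0) & (depth > 0))
        if len(us) < 100:
            continue
        pts = backproject(us.astype(np.float64), vs.astype(np.float64),
                          depth[vs, us].astype(np.float64), fx, fy, cx, cy)

        # 3) 3D step: least-squares plane fit (SVD) yields the surface normal,
        #    the centroid gives the position, and the ellipse axes scaled by
        #    depth give rough metric dimensions.
        centroid = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - centroid)
        normal = vt[-1]                  # direction of smallest variance = plane normal
        if normal[2] > 0:                # orient the normal towards the camera
            normal = -normal
        (_, _), (ax_px_a, ax_px_b), _ = ellipse
        z0 = centroid[2]
        axes_m = (ax_px_a * z0 / fx, ax_px_b * z0 / fx)
        results.append({"position": centroid, "normal": normal, "axes_m": axes_m})
    return results

A more faithful reproduction of the paper's method would replace the simple Canny/fitEllipse stage with the enhanced ellipse detector described in the article and fuse intensity and range cues at several stages rather than only once.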

Original language: English
Pages (from-to): 25-37
Number of pages: 13
Journal: Robotics and Autonomous Systems
Volume: 62
Issue number: 1
Early online date: 8 Aug 2012
DOIs
Publication status: Published - Jan 2014

Keywords

  • 3D computer vision
  • Localization
  • Range and intensity data
  • Furniture
  • Kinect
  • Time-of-flight camera
