Robust localization of furniture parts by integrating depth and intensity data suitable for range sensors with varying image quality

Pascal Meißner*, Sven R. Schmidt-Rohr, Martin Lösch, Rainer Jäkel, Rüdiger Dillmann

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Published conference contribution

3 Citations (Scopus)

Abstract

In this paper we present an approach for localizing planar furniture parts in 3D range camera data for autonomous robot manipulation; it estimates both their six-degree-of-freedom (DoF) poses and their dimensions. Range cameras are a promising sensor category for mobile robotics. Unfortunately, many of them exhibit considerable measurement noise, which causes difficulties when detecting objects or their parts, e.g. with canonical methods for range image segmentation. Our approach overcomes these issues by combining concepts from 2D and 3D computer vision and by integrating intensity and depth data at several levels of abstraction. It is therefore not restricted to range sensors with high image quality and also scales to cameras with lower image quality. The concept is generic and has been implemented for elliptical object parts as a proof of concept.
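The paper itself provides no code; the following is a minimal sketch of the general idea of combining a 2D detection in the intensity image with a 3D fit in the depth data, not the authors' pipeline. It assumes a co-registered grayscale intensity image and a metric depth image with known pinhole intrinsics (fx, fy, cx, cy); the function name, parameters, and thresholds are illustrative only (Python with OpenCV 4 and NumPy).

    # Illustrative sketch only (not the authors' implementation): find an
    # elliptical part in the intensity image, then estimate its supporting
    # plane and rough metric size from the co-registered depth image.
    import cv2
    import numpy as np

    def localize_elliptical_part(intensity, depth, fx, fy, cx, cy):
        """Return (center_3d, plane_normal, axes_m) for the largest elliptical
        contour, or None if no suitable contour is found. 'intensity' is an
        8-bit grayscale image, 'depth' a metric depth image of the same size."""
        # 2D stage: edge detection and ellipse fitting on the intensity image.
        edges = cv2.Canny(intensity, 50, 150)
        contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
        contours = [c for c in contours if len(c) >= 20]
        if not contours:
            return None
        ellipse = cv2.fitEllipse(max(contours, key=cv2.contourArea))
        (u0, v0), (ax1_px, ax2_px), angle = ellipse

        # 3D stage: back-project valid depth pixels inside the ellipse.
        mask = np.zeros(depth.shape, np.uint8)
        cv2.ellipse(mask, ellipse, 255, -1)
        v, u = np.nonzero((mask > 0) & (depth > 0))
        if len(v) < 3:
            return None
        z = depth[v, u].astype(np.float64)
        pts = np.stack([(u - cx) * z / fx, (v - cy) * z / fy, z], axis=1)

        # Least-squares plane fit: the normal is the right singular vector
        # belonging to the smallest singular value of the centered points.
        center = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - center, full_matrices=False)
        normal = vt[-1]

        # Rough metric axis lengths from the pixel axes and the mean depth;
        # the ellipse angle could additionally fix the in-plane rotation.
        f_mean = 0.5 * (fx + fy)
        axes_m = (ax1_px * z.mean() / f_mean, ax2_px * z.mean() / f_mean)
        return center, normal, axes_m

Note that this sketch recovers the part's center, plane normal, and approximate dimensions; the full 6-DoF pose and the robustness measures described in the paper (integration at several levels of abstraction) are beyond its scope.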

Original language: English
Title of host publication: IEEE 15th International Conference on Advanced Robotics
Subtitle of host publication: New Boundaries for Robotics, ICAR 2011
Pages: 48-54
Number of pages: 7
DOIs
Publication status: Published - 28 Dec 2011
Event: IEEE 15th International Conference on Advanced Robotics: New Boundaries for Robotics, ICAR 2011 - Tallinn, Estonia
Duration: 20 Jun 2011 - 23 Jun 2011

Publication series

Name: IEEE 15th International Conference on Advanced Robotics: New Boundaries for Robotics, ICAR 2011

Conference

Conference: IEEE 15th International Conference on Advanced Robotics: New Boundaries for Robotics, ICAR 2011
Country/Territory: Estonia
City: Tallinn
Period: 20/06/11 - 23/06/11
