Adapting Everyday Manipulation Skills to Varied Scenarios

Pawel Gajewski, Paulo Ferreira, Georg Bartels, Chaozheng Wang, Frank Guerin, Bipin Indurkhya, Michael Beetz, Bartlomiej Sniezynski

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We address the problem of executing tool-using manipulation skills in scenarios where the objects to be used may vary. We assume that point clouds of the tool and target object can be obtained, but no interpretation or further knowledge about these objects is provided. The system must interpret the point clouds and decide how to use the tool to complete a manipulation task with a target object; this means it must adjust motion trajectories appropriately to complete the task. We tackle three everyday manipulations: scraping material from a tool into a container, cutting, and scooping from a container. Our solution encodes these manipulation skills in a generic way, with parameters that can be filled in at run-time via queries to a robot perception module; the perception module abstracts the functional parts for the tool and extracts key parameters that are needed for the task. The approach is evaluated in simulation and with selected examples on a PR2 robot.
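
The paper describes this pattern only at the level above: a generic skill encoding whose free parameters are filled in at run time by querying a perception module. The following is a rough, purely illustrative Python sketch of that interface; every class, function, and parameter name here (PerceptionModule, ToolModel, scooping_skill, and so on) is invented for illustration and is not taken from the authors' system.

# Hypothetical sketch of a generic manipulation skill whose parameters are
# filled at run time from a perception module (all names are illustrative).
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class ToolModel:
    """Functional-part abstraction a perception module might produce."""
    functional_part_center: Point   # e.g. centre of a spoon's bowl or a knife's edge
    handle_axis: Point              # grasp / approach direction
    functional_width: float         # key scalar parameter, e.g. blade length or bowl width

class PerceptionModule:
    """Stand-in for the perception module: maps a raw point cloud to a ToolModel."""
    def query_tool(self, point_cloud: List[Point]) -> ToolModel:
        # A real implementation would segment the cloud and fit functional parts;
        # here fixed values are returned so the sketch runs on its own.
        return ToolModel((0.0, 0.0, 0.10), (1.0, 0.0, 0.0), 0.05)

def scooping_skill(tool: ToolModel, container_center: Point, depth: float) -> List[Point]:
    """Generic scooping trajectory with its gaps filled from perceived parameters."""
    cx, cy, cz = container_center
    # Approach above the container, dip to the requested depth, drag sideways by the
    # perceived functional width of the tool, then lift back out.
    return [
        (cx, cy, cz + 0.10),
        (cx, cy, cz - depth),
        (cx + tool.functional_width, cy, cz - depth),
        (cx + tool.functional_width, cy, cz + 0.10),
    ]

if __name__ == "__main__":
    perception = PerceptionModule()
    tool = perception.query_tool(point_cloud=[])   # the point cloud is omitted in this sketch
    waypoints = scooping_skill(tool, container_center=(0.5, 0.0, 0.0), depth=0.03)
    print(waypoints)

The actual system derives such quantities from point clouds of the tool and target object and adapts the motion trajectory accordingly; the sketch above only mirrors that query-and-fill structure, not the authors' implementation.
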
Original language: English
Title of host publication: Proceedings 2019 International Conference on Robotics and Automation (ICRA)
Publisher: IEEE Xplore
Publication status: Accepted/In press - 26 Jan 2019
Event: 2019 International Conference on Robotics and Automation (ICRA) - Palais des congrès de Montréal, Montreal, Canada
Duration: 20 May 2019 - 24 May 2019

Conference

Conference: 2019 International Conference on Robotics and Automation (ICRA)
Country: Canada
City: Montreal
Period: 20/05/19 - 24/05/19

Keywords

  • cs.RO

Cite this

Gajewski, P., Ferreira, P., Bartels, G., Wang, C., Guerin, F., Indurkhya, B., ... Sniezynski, B. (Accepted/In press). Adapting Everyday Manipulation Skills to Varied Scenarios. In Proceedings 2019 International Conference on Robotics and Automation (ICRA) [1245]. IEEE Xplore.

@inproceedings{16cd11878ea64ec6af9cb2345a7d94d6,
title = "Adapting Everyday Manipulation Skills to Varied Scenarios",
abstract = "We address the problem of executing tool-using manipulation skills in scenarios where the objects to be used may vary. We assume that point clouds of the tool and target object can be obtained, but no interpretation or further knowledge about these objects is provided. The system must interpret the point clouds and decide how to use the tool to complete a manipulation task with a target object; this means it must adjust motion trajectories appropriately to complete the task. We tackle three everyday manipulations: scraping material from a tool into a container, cutting, and scooping from a container. Our solution encodes these manipulation skills in a generic way, with parameters that can be filled in at run-time via queries to a robot perception module; the perception module abstracts the functional parts for the tool and extracts key parameters that are needed for the task. The approach is evaluated in simulation and with selected examples on a PR2 robot.",
keywords = "cs.RO",
author = "Pawel Gajewski and Paulo Ferreira and Georg Bartels and Chaozheng Wang and Frank Guerin and Bipin Indurkhya and Michael Beetz and Bartlomiej Sniezynski",
note = "This work is partially funded by: (1) AGH University of Science and Technology, grant No 15.11.230.318. (2) Deutsche Forschungsgemeinschaft (DFG) through the Collaborative Research Center 1320, EASE. (3) Elphinstone Scholarship from University of Aberdeen.",
year = "2019",
month = "1",
day = "26",
language = "English",
booktitle = "Proceedings 2019 International Conference on Robotics and Automation (ICRA)",
publisher = "IEEE Xplore",
}

TY  - GEN
T1  - Adapting Everyday Manipulation Skills to Varied Scenarios
AU  - Gajewski, Pawel
AU  - Ferreira, Paulo
AU  - Bartels, Georg
AU  - Wang, Chaozheng
AU  - Guerin, Frank
AU  - Indurkhya, Bipin
AU  - Beetz, Michael
AU  - Sniezynski, Bartlomiej
N1  - This work is partially funded by: (1) AGH University of Science and Technology, grant No 15.11.230.318. (2) Deutsche Forschungsgemeinschaft (DFG) through the Collaborative Research Center 1320, EASE. (3) Elphinstone Scholarship from University of Aberdeen.
PY  - 2019/1/26
Y1  - 2019/1/26
AB  - We address the problem of executing tool-using manipulation skills in scenarios where the objects to be used may vary. We assume that point clouds of the tool and target object can be obtained, but no interpretation or further knowledge about these objects is provided. The system must interpret the point clouds and decide how to use the tool to complete a manipulation task with a target object; this means it must adjust motion trajectories appropriately to complete the task. We tackle three everyday manipulations: scraping material from a tool into a container, cutting, and scooping from a container. Our solution encodes these manipulation skills in a generic way, with parameters that can be filled in at run-time via queries to a robot perception module; the perception module abstracts the functional parts for the tool and extracts key parameters that are needed for the task. The approach is evaluated in simulation and with selected examples on a PR2 robot.
KW  - cs.RO
UR  - http://www.scopus.com/inward/record.url?scp=85071456745&partnerID=8YFLogxK
M3  - Conference contribution
BT  - Proceedings 2019 International Conference on Robotics and Automation (ICRA)
PB  - IEEE Xplore
ER  -