Transfer of Tool Affordance and Manipulation Cues with 3D Vision Data

Paulo Abelha, Frank Guerin

Research output: Working paper


Abstract

Future service robots working in human environments, such as kitchens, will face situations where they need to improvise. The usual tool for a given task might not be available, and the robot will have to use some substitute tool. The robot needs to select an appropriate alternative tool from the candidates available, and also needs to know where to grasp it, how to orient it and what part to use as the end-effector. We present a system which takes as input a candidate tool's point cloud and weight, and outputs a score for how effective that tool is for a task, along with how to use it. Our key novelty is in taking a task-driven approach, where the task exerts a top-down influence on how low-level vision data is interpreted. This facilitates the type of 'everyday creativity' where an object such as a wine bottle could be used as a rolling pin, because the interpretation of the object is not fixed in advance, but rather results from the interaction between bottom-up and top-down pressures at run-time. The top-down influence is implemented by transfer: prior knowledge of the geometric features that make a tool good for a task is used to seek similar features in a candidate tool. The prior knowledge is learned by simulating Web models performing the tasks. We evaluate on a set of fifty household objects and five tasks. We compare our system with the closest one in the literature and show that we achieve significantly better results.
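The task-driven scoring idea described above can be illustrated with a minimal sketch. This is not the paper's actual pipeline: the feature names (`elongation`, `flatness`, `mass`), the task priors, and the Gaussian similarity are all hypothetical stand-ins for the learned geometric priors and the fitting procedure the abstract describes. The point is only to show how a task prior, applied top-down, selects which part of a segmented object serves as the end-effector.

```python
import math

# Hypothetical learned priors: for each task, "ideal" geometric features of
# the action part (names and numbers are illustrative, not from the paper).
TASK_PRIORS = {
    "rolling": {"elongation": 3.0, "flatness": 0.10, "mass": 0.5},
    "hammering": {"elongation": 1.2, "flatness": 0.30, "mass": 0.8},
}


def part_score(features, prior):
    """Gaussian similarity between a candidate part and the task prior."""
    sq_dist = sum((features[k] - prior[k]) ** 2 for k in prior)
    return math.exp(-sq_dist)


def fit_tool_to_task(parts, task):
    """Interpret the object top-down: pick the part that best fits the task.

    `parts` maps a part name to its extracted geometric features; returns
    the best part name and its score (the tool's effectiveness estimate).
    """
    prior = TASK_PRIORS[task]
    name, feats = max(parts.items(), key=lambda kv: part_score(kv[1], prior))
    return name, part_score(feats, prior)


# A wine bottle segmented into two hypothetical parts: for "rolling", the
# cylindrical body should win, so the bottle is read as a rolling pin.
bottle = {
    "body": {"elongation": 2.8, "flatness": 0.12, "mass": 0.6},
    "neck": {"elongation": 4.0, "flatness": 0.05, "mass": 0.1},
}
part, score = fit_tool_to_task(bottle, "rolling")
```

Under these made-up numbers the body part scores higher than the neck for the rolling task, which is the sense in which the interpretation emerges at run-time from the interaction of the task prior (top-down) and the extracted geometry (bottom-up).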
Original language: English
Publisher: ArXiv
Publication status: Submitted - 13 Oct 2017

Keywords

• cs.RO