This paper introduces a novel approach to representing and learning
tool affordances by a robot. The tool representation described here is behavior-based:
it grounds the tool affordances in the
behavioral repertoire of the robot. The representation is learned
during a behavioral babbling stage in which the robot randomly chooses
different exploratory behaviors, applies them to the tool, and observes
their effects on environmental objects. The paper shows how the autonomously learned
affordance representation can be used to solve tool-using tasks by dynamically sequencing
the exploratory behaviors based on their expected outcomes.
The quality of the learned representation was tested on extension-of-reach tool-using
tasks.
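
The sketch below (in Python) illustrates the two ideas summarized above: an affordance table built by random behavioral babbling, and task solving by greedily sequencing behaviors according to their expected outcomes. It is only an illustrative toy, not the paper's implementation; the particular exploratory behaviors, the 2-D point simulation of the object, and all numerical values are assumptions made for this example.

import random

# Hypothetical exploratory behaviors: each maps to the *true* (hidden)
# object displacement it produces when the tool contacts the object.
# These behaviors and displacements are illustrative assumptions.
TRUE_EFFECTS = {
    "push_forward": (0.0, 1.0),
    "pull_back":    (0.0, -1.0),
    "slide_left":   (-1.0, 0.0),
    "slide_right":  (1.0, 0.0),
}


def babble(num_trials=200, noise=0.1):
    """Behavioral babbling: choose behaviors at random, observe the noisy
    object displacement, and keep a per-behavior average outcome."""
    table = {b: {"sum": [0.0, 0.0], "count": 0} for b in TRUE_EFFECTS}
    for _ in range(num_trials):
        behavior = random.choice(list(TRUE_EFFECTS))
        dx, dy = TRUE_EFFECTS[behavior]
        # Observed effect = true effect plus sensing/actuation noise.
        obs = (dx + random.gauss(0, noise), dy + random.gauss(0, noise))
        entry = table[behavior]
        entry["sum"][0] += obs[0]
        entry["sum"][1] += obs[1]
        entry["count"] += 1
    # Affordance representation: expected object displacement per behavior.
    return {
        b: (e["sum"][0] / e["count"], e["sum"][1] / e["count"])
        for b, e in table.items() if e["count"] > 0
    }


def solve(affordances, obj, goal, max_steps=50, tol=0.5):
    """Dynamically sequence behaviors: at each step execute the behavior
    whose expected outcome brings the object closest to the goal."""
    obj = list(obj)
    plan = []
    for _ in range(max_steps):
        if (obj[0] - goal[0]) ** 2 + (obj[1] - goal[1]) ** 2 <= tol ** 2:
            break  # goal reached
        best = min(
            affordances,
            key=lambda b: (obj[0] + affordances[b][0] - goal[0]) ** 2
                        + (obj[1] + affordances[b][1] - goal[1]) ** 2,
        )
        dx, dy = TRUE_EFFECTS[best]  # executing the chosen behavior
        obj[0] += dx
        obj[1] += dy
        plan.append(best)
    return plan, tuple(obj)


if __name__ == "__main__":
    affordances = babble()
    plan, final = solve(affordances, obj=(0.0, 0.0), goal=(3.0, -2.0))
    print("behavior sequence:", plan)
    print("final object position:", final)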
@InProceedings{Stoytchev2005,
  author    = {Alexander Stoytchev},
  title     = {Behavior-Grounded Representation of Tool Affordances},
  booktitle = {Proceedings of IEEE International Conference on Robotics and Automation (ICRA)},
  year      = {2005},
  pages     = {3071--3076},
}
The movies show testing trials for an extension-of-reach task with different tools.
Demonstration of autonomous adaptation under uncertainty: learning is performed with a T-Hook, and the movies show subsequent testing trials performed with an L-Hook. The robot "believes" it is still using the original T-Hook tool.