In this paper we present a series of haptic exploratory procedures, or EPs, implemented for a multi-fingered, articulated, sensate robot hand. These EPs are designed to extract specific tactile and kinesthetic information from an object when purposively invoked by an intelligent robotic system. Taken together, they form an active robotic touch perception system to be used both in extracting information about the environment for internal representation and in acquiring grasps for manipulation. The theory and structure of this robotic haptic system are based upon models of human haptic exploration and information processing.
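The purposive invocation described above can be illustrated with a minimal sketch: a reasoning system requests specific object properties, and only the EPs that yield those properties are executed. All names and data structures below are hypothetical illustrations, not the paper's implementation; the EP names follow the human haptics literature.

```python
# Hypothetical sketch of property-driven EP dispatch.
# A reasoning system asks for object properties; the haptic system
# invokes only the exploratory procedures that measure them.
from dataclasses import dataclass


@dataclass
class SimObject:
    """Stand-in for a physical object with ground-truth properties."""
    texture: str
    hardness: float
    global_shape: str


def lateral_motion(obj):
    """EP: rub the fingertip across the surface -> texture."""
    return {"texture": obj.texture}


def pressure(obj):
    """EP: press into the surface -> hardness/compliance."""
    return {"hardness": obj.hardness}


def enclosure(obj):
    """EP: mold the fingers around the object -> coarse global shape."""
    return {"global_shape": obj.global_shape}


# Mapping from desired property to the EP that extracts it.
EP_FOR_PROPERTY = {
    "texture": lateral_motion,
    "hardness": pressure,
    "global_shape": enclosure,
}


def explore(obj, wanted_properties):
    """Purposively invoke only the EPs needed for the wanted properties."""
    percept = {}
    for prop in wanted_properties:
        percept.update(EP_FOR_PROPERTY[prop](obj))
    return percept
```

For example, a request for texture and global shape would trigger lateral motion and enclosure while leaving the pressure EP uninvoked, mirroring the selective, task-directed character of active touch.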
The haptic system presented utilizes an integrated robotic system consisting of a PUMA 560 robot arm, a JPL/Stanford robot hand with joint torque sensing in the fingers, a wrist force/torque sensor, and a 256-element, spatially resolved fingertip tactile array. We describe the EPs implemented for this system and provide experimental results that illustrate how they function and how the information they extract may be used. In addition to the sensate hand and arm, the robot system includes structured-lighting vision and a Prolog-based reasoning system capable of grasp generation and object categorization. We present a set of simple tasks which show how both grasping and recognition may be enhanced by the addition of active touch perception.