This paper presents a neural model that solves
the visual-tactile-motor coordination problem in robotic applications. The proposed neural
controller is based on the VAM (Vector Associative Map) model.
This biologically inspired algorithm learns the mapping
between spatial and motor coordinates. The spatial
inputs combine visual and force parameters. The LINCE
stereohead performs the visual detection, locating the positions
of the object and of the manipulator. Artificial tactile
skins mounted on the two fingers of the gripper measure
the force distribution when an object is touched.
The neural controller has been implemented for robotic reaching and
grasping operations. The reaching process operates in closed loop,
minimizing the Difference Vector (DV) between the visual projections
of the object and the manipulator.
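To make the DV-driven reaching concrete, the sketch below shows one way such a closed loop can be realized: a linear map from a visual displacement to a joint increment is learned during a motor-babbling phase and then used to drive the DV to zero. The two-link kinematics, the delta-rule learning, and all gains are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of a VAM-style reaching loop (illustrative assumptions:
# toy kinematics, normalized delta rule, hand-picked gains).

rng = np.random.default_rng(0)

def forward_kinematics(theta):
    """Toy 2-link planar arm standing in for the real manipulator."""
    l1, l2 = 1.0, 0.8
    return np.array([l1 * np.cos(theta[0]) + l2 * np.cos(theta[0] + theta[1]),
                     l1 * np.sin(theta[0]) + l2 * np.sin(theta[0] + theta[1])])

# Babbling phase: random motor commands and their observed spatial effects
# train a linear map W from spatial displacement to joint increment.
W = np.zeros((2, 2))
theta0 = np.array([0.3, 0.5])
for _ in range(3000):
    d_theta = rng.normal(scale=0.05, size=2)          # random motor command
    d_x = forward_kinematics(theta0 + d_theta) - forward_kinematics(theta0)
    err = d_theta - W @ d_x                           # delta-rule error
    W += 0.5 * np.outer(err, d_x) / (d_x @ d_x + 1e-9)

# Reaching phase: feed the DV back through W until the projections coincide.
theta = theta0.copy()
target = np.array([1.3, 0.7])
for _ in range(200):
    dv = target - forward_kinematics(theta)           # Difference Vector
    if np.linalg.norm(dv) < 1e-4:
        break
    theta += 0.2 * (W @ dv)                           # gated motor command
```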
The stable grasping task processes the force distribution maps sensed at
the two gripper surfaces in order to guide
the object into the robotic fingers.
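The grasping step admits a similarly simple sketch: each tactile force map is reduced to a pressure centroid and a total load, from which a centring correction can be derived. The map shapes, contact threshold, and gain here are assumed for illustration.

```python
import numpy as np

def force_centroid(force_map):
    """Pressure centroid (row, col) and total load of one tactile map."""
    total = force_map.sum()
    if total < 1e-6:                      # assumed no-contact threshold
        return None, 0.0
    rows, cols = np.indices(force_map.shape)
    centroid = np.array([(rows * force_map).sum(),
                         (cols * force_map).sum()]) / total
    return centroid, total

def grasp_correction(map_left, map_right, gain=0.1):
    """Correction signal that re-centres the contact on each skin
    and balances the loads between the two fingers."""
    c_l, f_l = force_centroid(map_left)
    c_r, f_r = force_centroid(map_right)
    if c_l is None or c_r is None:
        return None                       # one finger not yet in contact
    centre = (np.array(map_left.shape) - 1) / 2.0
    offset = ((c_l - centre) + (c_r - centre)) / 2.0   # mean centroid drift
    imbalance = f_l - f_r                              # unequal finger loads
    return gain * np.append(offset, imbalance)
```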
Experimental results have demonstrated the robustness of the model and the accuracy of the
final pick-and-place process.