We consider the problem of sensor-based motion planning for a three-dimensional robot arm manipulator operating among unknown obstacles, where every point of the robot body is subject to potential collision. The corresponding planning system must include four basic components: sensor hardware; real-time hardware and software for processing sensory data; a local step-planning subsystem that operates at the basic sample rate of the arm; and a subsystem for global planning. The arm sensor system developed at Yale University features a proximity-sensitive skin that covers the whole body of the arm and consists of an array of discrete active infrared sensors that detect obstacles by processing reflected light. The sensor data then undergo low-level processing in the step-planning procedure, which converts sensor information into local normals at the contact points in the robot's configuration space. This paper presents preliminary results on the fourth component, a real-time algorithm that realizes the upper, global level of planning. Based on the current collection of local normals, the algorithm generates preferred directions of motion around obstacles so as to guarantee reaching the target position whenever it is reachable. Experimental results from testing the developed system are also discussed.
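To make the role of the local normals concrete, the following is a minimal illustrative sketch, not the paper's algorithm: it assumes each sensed contact yields a unit normal pointing into free space, and it biases a motion step toward the goal while projecting out any component that would push into a sensed obstacle. All function names and parameters here are hypothetical.

```python
import numpy as np

def local_normal(arm_point, obstacle_point):
    """Unit normal at a sensed contact, pointing from the obstacle
    toward the arm point, i.e. into free space (illustrative)."""
    v = np.asarray(arm_point, dtype=float) - np.asarray(obstacle_point, dtype=float)
    return v / np.linalg.norm(v)

def preferred_step(goal_dir, normals, step=0.05):
    """Return a small motion step toward the goal, with any component
    that penetrates a sensed obstacle surface projected out, so the
    step slides along the obstacle instead (illustrative sketch)."""
    d = np.asarray(goal_dir, dtype=float)
    d = d / np.linalg.norm(d)
    for n in normals:
        if np.dot(d, n) < 0.0:        # this step would push into the obstacle
            d = d - np.dot(d, n) * n  # remove the penetrating component
    norm = np.linalg.norm(d)
    # If the goal direction is fully blocked, stop (a real planner would
    # invoke its global strategy here to move around the obstacle).
    return step * d / norm if norm > 1e-9 else np.zeros_like(d)
```

For example, with a single contact whose normal is perpendicular to the goal direction, the step passes through unchanged; with an oblique normal, the returned step slides along the sensed surface.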