This paper presents a mobile robot platform that performs both indoor and outdoor localization based on an intelligent, low-cost depth–inertial fusion approach. The proposed sensor fusion approach uses depth-based localization data to enhance the accuracy of the pose estimates obtained from the inertial measurement unit (IMU). The fusion is based on feedforward cascade correlation networks (CCNs), and its aim is to correct the drift inherent in IMU-based localization by using a depth camera. The approach also has the advantage of retaining the high update rate of the IMU together with the accuracy of the depth camera. The mobile robot dynamic states estimated by the proposed approach are deployed and evaluated in real-time autonomous navigation. It is shown that, using both the planned path and the continuous localization approach, the robot successfully controls its movement toward the destination. Several tests were conducted with different numbers of layers and training-set percentages. The best performance is obtained with 12 layers and 80% of the pose data used as the training set for the CCN. The proposed framework is then compared against a solution that fuses data from an XSens IMU–GPS sensor with the Kobuki robot's built-in odometry. As the results demonstrate, the CCN achieves an enhanced performance with an average Euclidean error of 0.091 m, which is 56% lower than the error achieved by a standard artificial neural network.
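To make the fusion idea concrete, the sketch below illustrates the drift-correction principle on a toy 1-D trajectory: a high-rate but drifting IMU position stream is corrected using sparse, accurate depth-camera fixes. The paper's method learns this correction with a cascade correlation network; here a plain least-squares drift fit stands in for the learned model, and all sensor rates and drift values are invented for illustration.

```python
def fit_drift(times, imu_pos, depth_pos):
    """Least-squares fit of the IMU error model err = a*t + b,
    using depth-camera positions as the accurate reference."""
    errs = [i - d for i, d in zip(imu_pos, depth_pos)]
    n = len(times)
    mean_t = sum(times) / n
    mean_e = sum(errs) / n
    cov = sum((t - mean_t) * (e - mean_e) for t, e in zip(times, errs))
    var = sum((t - mean_t) ** 2 for t in times)
    a = cov / var
    b = mean_e - a * mean_t
    return a, b

def correct(t, imu_p, a, b):
    """Correct a high-rate IMU position with the fitted drift model."""
    return imu_p - (a * t + b)

# Simulated trajectory: true position moves at 0.5 m/s; the IMU
# estimate accumulates drift at 0.02 m/s on top of it (made-up values).
times = [i * 0.1 for i in range(100)]        # 10 s at 10 Hz (IMU rate)
true_pos = [0.5 * t for t in times]
imu_pos = [p + 0.02 * t for p, t in zip(true_pos, times)]

# Depth camera provides accurate fixes at only 1 Hz; the drift model
# is trained on those sparse samples, then applied at the full IMU rate.
a, b = fit_drift(times[::10], imu_pos[::10], true_pos[::10])
fused = [correct(t, p, a, b) for t, p in zip(times, imu_pos)]
err = sum(abs(f - g) for f, g in zip(fused, true_pos)) / len(fused)
print(round(err, 6))
```

The corrected stream keeps the IMU's 10 Hz rate while the residual error collapses to near zero; the CCN in the paper plays the role of this correction model but can capture nonlinear, time-varying drift that a linear fit cannot.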