Knowledge of the physical properties of the terrain surrounding a planetary exploration rover can enable the rover system to fully exploit its mobility capabilities. Terrain classification methods provide semantic descriptions of the physical nature of a given terrain region. These descriptions can be associated with nominal numerical physical parameters and/or nominal traversability estimates to improve mobility prediction accuracy. Here we study the performance of multisensor classification methods in the context of Mars surface exploration. The performance of two classification algorithms for color, texture, and range features, based on maximum likelihood estimation and support vector machines, is presented. In addition, a classification method based on vibration features derived from rover wheel–terrain interaction is briefly described. Two techniques for merging the results of these "low-level" classifiers, relying on Bayesian fusion and meta-classifier fusion, are presented. The performance of these algorithms is studied using images from NASA's Mars Exploration Rover mission and through experiments on a four-wheeled test-bed rover operating in Mars-analog terrain. A novel approach to terrain sensing based on fused tactile and visual features is also presented. It is shown that accurate terrain classification can be achieved via classifier fusion of visual and tactile features.
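The Bayesian fusion of "low-level" classifier outputs mentioned above can be sketched as a naive-Bayes combination of per-sensor posteriors under a conditional-independence assumption. This is a minimal illustrative sketch, not the paper's implementation; the function name, terrain classes, and probability values below are hypothetical.

```python
import numpy as np

def bayes_fuse(posteriors, prior):
    """Naive-Bayes fusion of per-classifier posterior distributions.

    posteriors: (n_classifiers, n_classes) array, row i is P(class | sensor_i)
    prior:      (n_classes,) array of prior class probabilities P(class)

    Assuming the sensor channels are conditionally independent given the
    terrain class, P(c | all sensors) is proportional to
    prior(c)**(1 - n) * prod_i P(c | sensor_i).
    """
    posteriors = np.asarray(posteriors, dtype=float)
    n = posteriors.shape[0]
    fused = prior ** (1 - n) * np.prod(posteriors, axis=0)
    return fused / fused.sum()  # normalize to a proper distribution

# Hypothetical example: a visual and a vibration classifier over three
# terrain classes (e.g., rock, sand, bedrock) with a uniform prior.
visual = [0.6, 0.3, 0.1]
vibration = [0.5, 0.4, 0.1]
prior = np.full(3, 1.0 / 3.0)
fused = bayes_fuse([visual, vibration], prior)
print(fused.argmax())  # index of the most likely terrain class
```

Because both channels favor the first class here, the fused posterior concentrates on it more sharply than either individual classifier does; a meta-classifier fusion scheme would instead learn the combination rule from labeled data.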