It is well known that outliers can cause the chain-ladder method to mis-estimate (underestimate or overestimate) the overall reserve when the method is formulated as a linear regression model whose coefficients are assumed fixed and identical across observations. Relaxing this assumption and fitting a regression with randomly varying coefficients does not remove the problem: the overall reserve is still mis-estimated in the presence of outliers. This lack of robustness of the random-coefficient regression estimators of incremental payments motivates the present paper, which applies robust statistical procedures to loss reserving estimation when the regression coefficients are random. Numerical results of the proposed method are illustrated and compared with those obtained by linear regression with fixed coefficients.
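To make the opening claim concrete, the following is a minimal sketch of the classical chain-ladder reserve on a hypothetical cumulative run-off triangle, showing how a single outlying cell inflates the estimated reserve. The triangle values, the function name, and the perturbed cell are illustrative assumptions, not data from the paper.

```python
def chain_ladder_reserve(triangle):
    """Total chain-ladder reserve for a cumulative run-off triangle.

    `triangle` is a list of rows (accident years); row i holds the
    observed cumulative payments, so later rows are shorter.
    """
    n = len(triangle)
    # Volume-weighted development factors f_j = sum_i C[i][j+1] / sum_i C[i][j],
    # taken over the rows where both columns j and j+1 are observed.
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    # Project each row to ultimate and accumulate (ultimate - latest observed).
    reserve = 0.0
    for row in triangle:
        ultimate = row[-1]
        for j in range(len(row) - 1, n - 1):
            ultimate *= factors[j]
        reserve += ultimate - row[-1]
    return reserve


# A small hypothetical cumulative triangle (rows = accident years).
clean = [
    [100.0, 150.0, 175.0, 180.0],
    [110.0, 165.0, 190.0],
    [120.0, 180.0],
    [130.0],
]

# The same triangle with one cell replaced by an outlier.
contaminated = [row[:] for row in clean]
contaminated[2][1] = 400.0

r_clean = chain_ladder_reserve(clean)         # about 142.4
r_outlier = chain_ladder_reserve(contaminated)  # roughly doubled
```

Because every development factor is a ratio of column sums, the single outlier propagates into all later projections, which is exactly the non-robustness the paper addresses.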