We study a time-delay regularization, proposed by Nitzberg and Shiota [IEEE Trans. Pattern Anal. Mach. Intell. 14 (1992) 826–835], of the anisotropic diffusion model for image denoising of Perona and Malik [IEEE Trans. Pattern Anal. Mach. Intell. 12 (1990) 629–639].
In the two-dimensional case, we prove the convergence of a numerical
approximation and the existence of a weak solution. Finally, we present some
numerical experiments on images.
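To make the model concrete, the following is a minimal Python sketch, not the authors' scheme, of one explicit time step of Perona–Malik diffusion combined with a Nitzberg–Shiota-style time delay: the edge detector v relaxes toward |∇u|² with relaxation time tau instead of being evaluated instantaneously. The diffusivity g, the contrast parameter k, the step size dt, and tau are all illustrative assumptions.

```python
import numpy as np

def pm_delay_step(u, v, dt=0.1, tau=1.0, k=10.0):
    """One explicit step of Perona-Malik diffusion with a
    time-delayed edge detector (illustrative sketch).

    u : 2-D image array, v : delayed estimate of |grad u|^2.
    """
    # Forward differences with Neumann (no-flux) boundary handling.
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    grad2 = ux**2 + uy**2
    # Time delay: v relaxes toward |grad u|^2, dv/dt = (|grad u|^2 - v)/tau.
    v = v + dt * (grad2 - v) / tau
    # Perona-Malik diffusivity evaluated on the delayed edge detector.
    g = 1.0 / (1.0 + v / k**2)
    # Divergence of g * grad u via backward differences of the fluxes.
    fx = g * ux
    fy = g * uy
    div = (np.diff(fx, axis=1, prepend=fx[:, :1])
           + np.diff(fy, axis=0, prepend=fy[:1, :]))
    # Explicit Euler update of the image.
    u = u + dt * div
    return u, v
```

A constant image is a steady state of this scheme: its gradient vanishes, so both the image and the delayed edge detector are left unchanged up to the relaxation of v toward zero.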