The aim of this paper is to prove an inequality between relative entropy and the sum of average conditional relative entropies of the following form: for a fixed probability measure $q$ on $\Omega^n$ ($\Omega$ is a finite set) and any probability measure $p$ on $\Omega^n$,
$$D(p\,\|\,q)\;\le\;C\cdot\sum_{i=1}^{n}\mathbb{E}_p\,D\bigl(p_i(\cdot\mid Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)\,\big\|\,q_i(\cdot\mid Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)\bigr),\tag{$*$}$$
where $p_i(\cdot\mid y_1,\dots,y_{i-1},y_{i+1},\dots,y_n)$ and $q_i(\cdot\mid y_1,\dots,y_{i-1},y_{i+1},\dots,y_n)$ denote the local specifications for $p$ and $q$, respectively, that is, the conditional distributions of the $i$th coordinate, given the other coordinates. The constant $C$ depends on (the local specifications of) $q$.
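For concreteness, the quantities in $(*)$ are understood in the standard way (spelled out here with sums, since $\Omega$ is finite): writing $y=(y_1,\dots,y_n)\in\Omega^n$,
$$D(p\,\|\,q)\;=\;\sum_{y\in\Omega^n}p(y)\,\log\frac{p(y)}{q(y)},$$
and the local specification of $p$ at coordinate $i$ is
$$p_i(z\mid y_1,\dots,y_{i-1},y_{i+1},\dots,y_n)\;=\;\frac{p(y_1,\dots,y_{i-1},z,y_{i+1},\dots,y_n)}{\sum_{w\in\Omega}p(y_1,\dots,y_{i-1},w,y_{i+1},\dots,y_n)},\qquad z\in\Omega,$$
and analogously for $q_i$; in $(*)$, the expectation $\mathbb{E}_p$ is taken over $(Y_1,\dots,Y_n)$ distributed according to $p$.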
The inequality $(*)$ is meaningful in product spaces, in both the discrete and the continuous case, and can be used to prove a logarithmic Sobolev inequality for $q$, provided uniform logarithmic Sobolev inequalities are available for $q_i(\cdot\mid x_1,\dots,x_{i-1},x_{i+1},\dots,x_n)$, for all fixed $i$ and fixed $(x_1,\dots,x_{i-1},x_{i+1},\dots,x_n)$. Inequality $(*)$ directly implies that the Gibbs sampler associated with $q$ is a contraction for relative entropy, as sketched below.
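To see the contraction claim, write $Y_{\neq i}=(Y_1,\dots,Y_{i-1},Y_{i+1},\dots,Y_n)$, let $P_i$ be the kernel that resamples the $i$th coordinate from $q_i(\cdot\mid Y_{\neq i})$, and let $P=\frac1n\sum_{i=1}^n P_i$ be the random-scan Gibbs sampler. The following one-step computation is a sketch; in particular, the factor $1-1/(nC)$ is what this argument yields, not a constant quoted from the paper. Since $pP_i$ has the same marginal as $p$ on $Y_{\neq i}$ and conditional distribution $q_i$ at coordinate $i$, the chain rule for relative entropy gives
$$D(pP_i\,\|\,q)\;=\;D(p\,\|\,q)-\mathbb{E}_p\,D\bigl(p_i(\cdot\mid Y_{\neq i})\,\big\|\,q_i(\cdot\mid Y_{\neq i})\bigr),$$
so by convexity of $D(\cdot\,\|\,q)$ in its first argument and then $(*)$,
$$D(pP\,\|\,q)\;\le\;\frac1n\sum_{i=1}^n D(pP_i\,\|\,q)\;=\;D(p\,\|\,q)-\frac1n\sum_{i=1}^n\mathbb{E}_p\,D(p_i\,\|\,q_i)\;\le\;\Bigl(1-\frac{1}{nC}\Bigr)D(p\,\|\,q).$$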
In this paper we derive inequality (*), and thereby a logarithmic Sobolev inequality, in discrete product spaces, by proving inequalities for an appropriate Wasserstein-like distance.
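For orientation, a logarithmic Sobolev inequality for $q$ in this discrete setting is typically stated relative to the Dirichlet form $\mathcal{E}$ of the Gibbs sampler, in a form such as
$$\operatorname{Ent}_q(f^2)\;:=\;\mathbb{E}_q\Bigl[f^2\log\frac{f^2}{\mathbb{E}_q f^2}\Bigr]\;\le\;C'\,\mathcal{E}(f,f),\qquad f:\Omega^n\to\mathbb{R};$$
the notation $\mathcal{E}$, $C'$ is introduced here for illustration, and this display records one standard formulation rather than the precise variant established in the paper, with $C'$ governed by the constant $C$ in $(*)$ together with the uniform logarithmic Sobolev constants of the local specifications.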