Concentration inequalities are fundamental tools in probabilistic combinatorics and theoretical computer science for proving that functions of random variables are typically near their means. Of particular importance is the case where $f(X)$ is a function of independent random variables $X = (X_1, \ldots, X_n)$. Here the well-known bounded differences inequality (also called McDiarmid's inequality or the Hoeffding–Azuma inequality) establishes sharp concentration if the function $f$ does not depend too much on any of the variables. One attractive feature is that it relies on a very simple Lipschitz condition (L): it suffices to show that $|f(X) - f(X')| \le c_k$ whenever $X, X'$ differ only in $X_k$. While this is easy to check, the main disadvantage is that it considers worst-case changes $c_k$, which often makes the resulting bounds too weak to be useful.
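For reference, the standard form of the bounded differences inequality (a classical statement, not specific to this paper) reads as follows:

```latex
% McDiarmid's bounded differences inequality (standard form).
% Assume X_1, ..., X_n are independent and that
% |f(x) - f(x')| <= c_k whenever x, x' differ only in the k-th
% coordinate. Then for all t > 0:
\[
  \Pr\bigl( |f(X) - \mathbb{E} f(X)| \ge t \bigr)
  \;\le\; 2 \exp\!\left( - \frac{2 t^2}{\sum_{k=1}^{n} c_k^2} \right).
\]
```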
In this paper we prove a variant of the bounded differences inequality which can be used to establish concentration of functions $f(X)$ where (i) the typical changes are small, although (ii) the worst-case changes might be very large. One key aspect of this inequality is that it relies on a simple condition that (a) is easy to check and (b) coincides with heuristic considerations as to why concentration should hold. Indeed, given an event $\Gamma$ that holds with very high probability, we essentially relax the Lipschitz condition (L) to situations where $\Gamma$ occurs. The point is that the resulting typical changes $c_k$ are often much smaller than the worst-case ones.
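As an illustration of the gap between typical and worst-case changes (our example, not taken from the paper): for the triangle count in $G(n,p)$, resampling one potential edge $\{u,v\}$ changes the count by the number of common neighbours of $u$ and $v$, which is $n-2$ in the worst case but typically around $p^2(n-2)$. A minimal numpy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 500, 0.05
# Sample G(n, p) as a symmetric 0/1 adjacency matrix.
A = (rng.random((n, n)) < p).astype(int)
A = np.triu(A, 1)
A = A + A.T

# Resampling the potential edge {u, v} changes the triangle count by
# exactly the number of common neighbours of u and v.
u, v = 0, 1
typical_change = int(A[u] @ A[v])

print("worst-case change c_k in condition (L):", n - 2)
print("observed typical change:               ", typical_change)
print("heuristic prediction p^2 (n - 2):      ", p**2 * (n - 2))
```

Here the worst-case Lipschitz constant grows linearly in $n$, while the typical change stays bounded for $p = O(n^{-1/2})$, which is exactly the regime where a bound in terms of typical changes becomes much stronger.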
To illustrate its application we consider the reverse $H$-free process, where $H$ is 2-balanced. We prove that the final number of edges in this process is concentrated, and also determine its likely value up to constant factors. This answers a question of Bollobás and Erdős.
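For concreteness, here is a small simulation of the $H = K_3$ case under the step rule we assume for the reverse $H$-free process (remove, at each step, an edge chosen uniformly at random among all edges lying in at least one triangle, until the graph is triangle-free); this is an illustrative sketch, not code from the paper:

```python
import random

import numpy as np

def reverse_triangle_free_process(n, seed=0):
    """Run the reverse K_3-free process on K_n; return the final edge count.

    Step rule (our reading of the process, stated here as an assumption):
    while some edge lies in a triangle, remove one such edge chosen
    uniformly at random.
    """
    random.seed(seed)
    A = np.ones((n, n), dtype=int) - np.eye(n, dtype=int)  # start from K_n
    while True:
        # An edge {u, v} lies in a triangle iff u, v are adjacent and
        # have a common neighbour; common[u, v] counts common neighbours.
        common = A @ A
        us, vs = np.nonzero(np.triu(A, 1) * (common > 0))
        if len(us) == 0:
            break  # graph is triangle-free: the process terminates
        i = random.randrange(len(us))
        u, v = int(us[i]), int(vs[i])
        A[u, v] = A[v, u] = 0
    return int(np.triu(A, 1).sum())

if __name__ == "__main__":
    n = 40
    print(f"final number of edges for n = {n}:",
          reverse_triangle_free_process(n))
```

The final edge count of such a run is the random quantity whose concentration and order of magnitude are the subject of the application above.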