Let p and q be probability vectors with the same entropy h. Denote by B(p) the Bernoulli shift indexed by ℤ with marginal distribution p. Suppose that φ is a measure-preserving homomorphism from B(p) to B(q). We prove that if the coding length of φ has a finite 1/2-moment, then σ²_p = σ²_q, where σ²_p = ∑_i p_i(−log p_i − h)² is the informational variance of p. In this result, the 1/2-moment cannot be replaced by a lower moment. On the other hand, for any θ < 1, we exhibit probability vectors p and q that are not permutations of each other, such that there exists a finitary isomorphism Φ from B(p) to B(q) where the coding lengths of Φ and of its inverse have a finite θ-moment. We also present an extension to ergodic Markov chains.
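To make the invariant concrete, the following is a minimal sketch (not from the paper; the function names are ours, and natural logarithms are an assumption, since the abstract does not fix a base) that evaluates the entropy h and the informational variance σ²_p of a probability vector.

```python
import math

def entropy(p):
    """Shannon entropy h = -sum_i p_i log p_i (natural log assumed)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def informational_variance(p):
    """Informational variance sigma^2_p = sum_i p_i (-log p_i - h)^2,
    i.e. the variance of the self-information -log p_i under p."""
    h = entropy(p)
    return sum(pi * (-math.log(pi) - h) ** 2 for pi in p if pi > 0)

# Illustration: a vector and a permutation of it share both quantities,
# consistent with permutations being trivially isomorphic shifts.
p = [0.5, 0.25, 0.25]
q = [0.25, 0.5, 0.25]
print(entropy(p), informational_variance(p))
print(entropy(q), informational_variance(q))
```

In these terms, the first result says that a homomorphism from B(p) to B(q) whose coding length has a finite 1/2-moment forces informational_variance(p) == informational_variance(q), while the second result shows this rigidity fails if only a θ-moment with θ < 1 is required.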