Information Estimates and Markov Random Fields
1999, v.5, №3, 269-291
Consider a random field with values in some finite set $\Sigma\subset R$, indexed by a cube $\Lambda_n \subset Z^d$. We show that, in the vicinity (in the information-theoretic sense) of strongly mixing Markov fields, the distribution of sub-blocks of variables indexed by a smaller cube $\Lambda_m\subset \Lambda_n$ can be described precisely, even when the size of that cube grows with $n$. The general results are then applied to mean-field perturbations of Gibbs measures (in particular, mean-field perturbations of Ising models). The proofs use entropy arguments as well as known results on complete analyticity and mixing for the Ising model.
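For orientation, the information-theoretic distance alluded to above is relative entropy (Kullback-Leibler divergence). A standard definition is sketched below; the notation $P_n$, $Q_n$ for the two measures on the cube is illustrative and not taken from the paper:

```latex
% Relative entropy between two probability measures P_n and Q_n
% on the finite configuration space \Sigma^{\Lambda_n}:
H(P_n \mid Q_n) \;=\; \sum_{\sigma \in \Sigma^{\Lambda_n}}
    P_n(\sigma)\, \log \frac{P_n(\sigma)}{Q_n(\sigma)}.
```

"Vicinity in the information-theoretic sense" then refers to $H(P_n \mid Q_n)$ being small relative to the volume $|\Lambda_n|$, with $Q_n$ a strongly mixing Markov field.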
Keywords: information, relative entropy, Gibbs fields, Markov fields, weak mixing, complete analyticity, mean field