Information Inequalities and a Dependent Central Limit Theorem
2001, v.7, Issue 4, 627-645
We adapt arguments concerning information-theoretic convergence in the Central Limit Theorem to the case of dependent random variables under Rosenblatt mixing conditions. The key is to work with random variables perturbed by the addition of a normal random variable, giving us good control of the joint density and the mixing coefficient. We strengthen results of Takano and of Carlen and Soffer, establishing convergence in entropy rather than only weak convergence.
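For reference, the standard quantities underlying this approach (not spelled out in the abstract) are the relative entropy and Fisher information of a density $f$ with respect to the standard normal $\phi$, together with the normal perturbation used for smoothing:

\[
D(f \,\|\, \phi) \;=\; \int f \log\frac{f}{\phi},
\qquad
I(f) \;=\; \int \frac{(f')^2}{f},
\]

and, for $Z \sim N(0,1)$ independent of $X$, the perturbed variable $X_t = X + \sqrt{t}\,Z$, whose density is smooth for $t > 0$ and which links the two quantities through de Bruijn's identity $\frac{d}{dt}\, h(X_t) = \tfrac{1}{2}\, I(X_t)$, where $h$ denotes differential entropy.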
Keywords: normal convergence, entropy, Fisher information, mixing conditions