Entropy, Concentration of Probability and Conditional Limit Theorems

J.T. Lewis, C.-E. Pfister, W.G. Sullivan

1995, v.1, Issue 3, 319-386

ABSTRACT

We provide a framework in which a class of conditional limit theorems can be proved in a unified way. We introduce three concepts: a concentration set for a sequence of probability measures, generalizing the Weak Law of Large Numbers; conditioning with respect to a sequence of sets satisfying a regularity condition; and the asymptotic behaviour of the information gain of one sequence of probability measures with respect to another. These concepts are required for the statement of our main abstract result, Theorem 5.1, which describes the asymptotic behaviour of the information gain of a sequence of conditioned measures with respect to a sequence of tilted measures. Provided certain natural convexity assumptions are satisfied, it follows that conditional limit theorems are valid in great generality; this is the content of Theorem 6.1. We give several applications of the formalism, both for independent and for weakly dependent random variables, in all cases extending previously known results. For the empirical measure, we provide a conditional limit theorem and give an alternative proof of the Large Deviation Principle. We also discuss the problem of the equivalence of ensembles for lattice models in Statistical Mechanics.
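
For orientation, the two central quantities can be written in standard notation (these formulas are a gloss on the abstract's terminology, not quoted from the paper itself). The information gain of a probability measure $\nu$ relative to $\mu$ is the relative entropy
\[
  H(\nu \mid \mu) \;=\; \int \log\frac{d\nu}{d\mu}\, d\nu ,
\]
defined when $\nu$ is absolutely continuous with respect to $\mu$ (and taken to be $+\infty$ otherwise), and the tilting of $\mu$ by a function $f$ is the exponential modification
\[
  d\mu_f \;=\; \frac{e^{f}\, d\mu}{\int e^{f}\, d\mu} .
\]
In this notation, Theorem 5.1 concerns the asymptotics of $H(\mu_n(\,\cdot \mid A_n) \mid \tilde\mu_n)$, where $\mu_n(\,\cdot \mid A_n)$ are the conditioned measures and $\tilde\mu_n$ the tilted measures.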

Keywords: entropy, Large Deviation Principle, concentration, conditional limit theorem, LD-regularity, equivalence of ensembles
