Abstract

The topic of this paper is testing the assumption of exchangeability, which is the standard assumption in mainstream machine learning. The common approaches are online testing by betting (such as conformal testing) and the older batch testing using p-values (as in classical hypothesis testing). The approach of this paper is intermediate in that we are interested in batch testing by betting; as a result, p-values are replaced by e-values. As a first step in this direction, this paper concentrates on the Markov model as the alternative. The null hypothesis of exchangeability is formalized as a Kolmogorov-type compression model, and the Bayes mixture of the Markov model with respect to the uniform prior is taken as the simple alternative hypothesis. Using e-values instead of p-values leads to a computationally efficient testing procedure. Two appendixes discuss connections with the algorithmic theory of randomness; in particular, the test proposed in this paper can be interpreted as a poor man's version of Kolmogorov's deficiency of randomness.
This paper addresses the problem of testing exchangeability in statistical data using e-values instead of the more traditional p-values. The core idea is to formalize the null hypothesis of exchangeability as a Kolmogorov-type compression model and to use the Bayes mixture of the Markov model with a uniform prior as a simple alternative hypothesis. The author demonstrates that using e-values makes the testing procedure computationally more efficient.
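To make the construction concrete, here is a minimal sketch for binary sequences. It assumes one natural way to obtain a valid e-variable from these ingredients: take the Bayes-mixture probability of the observed sequence under Markov chains with independent uniform priors on the two transition probabilities, and multiply it by the number binom(N, k) of sequences with the same count k of ones (the compression-model null makes all such reorderings equally likely, so this product has null expectation at most 1). The paper's exact definition may differ in details; the function names below are illustrative.

```python
import math

def log_markov_mixture(bits):
    """Log-probability of a binary sequence under the Bayes mixture of
    Markov chains, with independent uniform priors on the transition
    probabilities p(1|0) and p(1|1); the first bit gets probability 1/2.
    Each uniform mixture integral evaluates to a Beta function."""
    # Transition counts: n[a][b] = number of a -> b transitions.
    n = [[0, 0], [0, 0]]
    for prev, cur in zip(bits, bits[1:]):
        n[prev][cur] += 1

    def log_beta(a, b):
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

    # Integral of p^{n01} (1-p)^{n00} dp = B(n01 + 1, n00 + 1), and
    # similarly for transitions out of state 1.
    return (math.log(0.5)
            + log_beta(n[0][1] + 1, n[0][0] + 1)
            + log_beta(n[1][1] + 1, n[1][0] + 1))

def exchangeability_e_value(bits):
    """E-value against exchangeability: binom(N, k) times the mixture
    probability, where k is the number of ones.  Under the uniform
    distribution on reorderings of the sequence (the compression-model
    null), its expectation is at most 1."""
    N, k = len(bits), sum(bits)
    log_e = math.log(math.comb(N, k)) + log_markov_mixture(bits)
    return math.exp(log_e)

# Illustrative check: a strongly Markov sequence (alternating bits)
# should produce a very large e-value, i.e. strong evidence against
# exchangeability, whereas a well-shuffled sequence should not.
print(exchangeability_e_value([0, 1] * 50))
```

The computation is a single pass over the sequence plus a few log-gamma evaluations, which illustrates the claimed computational efficiency of the e-value approach: no resampling over permutations is needed.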
Key innovations include:

* An intermediate approach of batch testing by betting, between online testing by betting (such as conformal testing) and classical batch testing with p-values.
* Replacing p-values with e-values, which leads to a computationally efficient testing procedure.
* Formalizing the null hypothesis of exchangeability as a Kolmogorov-type compression model, with the Bayes mixture of the Markov model under the uniform prior as the simple alternative.
* An interpretation of the proposed test as a poor man's version of Kolmogorov's deficiency of randomness.
Prior ingredients needed for this paper include:

* Online testing by betting, including conformal testing.
* Classical batch hypothesis testing with p-values, and the theory of e-values.
* Markov models and Bayes mixtures.
* The algorithmic theory of randomness, in particular Kolmogorov's deficiency of randomness.