# A Divergence Formula for Randomness and Dimension (Short Version)

Computer Science > Computational Complexity

Abstract: If $S$ is an infinite sequence over a finite alphabet $\Sigma$ and $\beta$ is a probability measure on $\Sigma$, then the {\it dimension} of $S$ with respect to $\beta$, written $\dim^\beta(S)$, is a constructive version of Billingsley dimension that coincides with the constructive Hausdorff dimension $\dim(S)$ when $\beta$ is the uniform probability measure. This paper shows that $\dim^\beta(S)$ and its dual $\Dim^\beta(S)$, the {\it strong dimension} of $S$ with respect to $\beta$, can be used in conjunction with randomness to measure the similarity of two probability measures $\alpha$ and $\beta$ on $\Sigma$. Specifically, we prove that the {\it divergence formula} $$\dim^\beta(R) = \Dim^\beta(R) = \frac{\mathcal{H}(\alpha)}{\mathcal{H}(\alpha) + \mathcal{D}(\alpha||\beta)}$$ holds whenever $\alpha$ and $\beta$ are computable, positive probability measures on $\Sigma$ and $R \in \Sigma^\infty$ is random with respect to $\alpha$. In this formula, $\mathcal{H}(\alpha)$ is the Shannon entropy of $\alpha$, and $\mathcal{D}(\alpha||\beta)$ is the Kullback-Leibler divergence between $\alpha$ and $\beta$.
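As a numerical illustration of the divergence formula, the sketch below (with hypothetical function names of my own choosing, not from the paper) computes $\mathcal{H}(\alpha)$, $\mathcal{D}(\alpha||\beta)$, and the resulting dimension value for finite probability vectors, using logarithms base 2. Note the sanity check it enables: when $\beta$ is uniform on a binary alphabet, $\mathcal{D}(\alpha||\beta) = 1 - \mathcal{H}(\alpha)$, so the formula reduces to the classical result that an $\alpha$-random sequence has dimension $\mathcal{H}(\alpha)$.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) of a probability vector, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(a, b):
    """Kullback-Leibler divergence D(a || b), in bits.

    Assumes b is positive wherever a is, as in the paper's
    hypothesis that the measures are positive.
    """
    return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)

def divergence_formula(a, b):
    """dim^beta(R) = H(alpha) / (H(alpha) + D(alpha || beta))."""
    h = shannon_entropy(a)
    return h / (h + kl_divergence(a, b))

# When alpha == beta, the divergence vanishes and the dimension is 1,
# matching the fact that R is then random with respect to beta itself.
print(divergence_formula((0.5, 0.5), (0.5, 0.5)))  # 1.0

# With beta uniform on {0,1}, the formula collapses to H(alpha).
print(divergence_formula((0.9, 0.1), (0.5, 0.5)))
```

The value returned is always in $(0, 1]$ for positive measures, which matches the interpretation of $\dim^\beta(R)$ as a similarity measure: it equals 1 exactly when $\mathcal{D}(\alpha||\beta) = 0$, i.e. when $\alpha = \beta$.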

Author: Jack H. Lutz

Source: https://arxiv.org/