A Lower Bound on the Complexity of Approximating the Entropy of a Markov Source
Computer Science > Information Theory

Abstract: Suppose that, for any $k \geq 1$, $\epsilon > 0$ and sufficiently large $\sigma$, we are given a black box that allows us to sample characters from a $k$th-order Markov source over the alphabet $\{0, \ldots, \sigma - 1\}$. Even if we know the source has entropy either 0 or at least $\log \sigma / k$, there is still no algorithm that, with probability bounded away from $1/2$, guesses the entropy correctly after sampling at most $(\sigma / k)^{k / 2 - \epsilon}$ characters.
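As an illustration of the black-box sampling model described above (this sketch is not from the paper itself; the function names and the particular zero-entropy construction are assumptions for exposition), the two kinds of sources the algorithm must distinguish can be written as follows: a deterministic $k$th-order source, whose entropy rate is 0 because each $k$-character context fixes the next character, and a uniform source, whose next character is uniform over the alphabet regardless of context.

```python
# Illustrative sketch (not the paper's construction): two kth-order Markov
# sources over the alphabet {0, ..., sigma - 1}, exposed through a black-box
# sampling interface.
import random

def make_deterministic_source(sigma, k, seed=0):
    """Entropy rate 0: each k-character context fixes the next character."""
    rng = random.Random(seed)
    table = {}  # context (tuple of k characters) -> next character, filled lazily
    def sample(context):
        if context not in table:
            table[context] = rng.randrange(sigma)
        return table[context]
    return sample

def make_uniform_source(sigma, seed=0):
    """High entropy: the next character is uniform regardless of context."""
    rng = random.Random(seed)
    def sample(context):
        return rng.randrange(sigma)
    return sample

def draw(sample, sigma, k, n, seed=0):
    """Draw n characters through the black-box interface, starting from an
    arbitrary initial context and sliding it forward one character at a time."""
    rng = random.Random(seed)
    context = tuple(rng.randrange(sigma) for _ in range(k))
    out = []
    for _ in range(n):
        c = sample(context)
        out.append(c)
        context = context[1:] + (c,)
    return out
```

An algorithm that only sees the output of `draw` cannot tell the two sources apart until it happens to revisit some context and observe inconsistent successors; the theorem above bounds how many samples that takes.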



Author: Travis Gagie

Source: https://arxiv.org/
