Algorithmic Polarization for Hidden Markov Models

10/03/2018
by Venkatesan Guruswami, et al.

Using a mild variant of polar codes, we design linear compression schemes for compressing Hidden Markov sources (where the source is a Markov chain whose state is not necessarily observable from its output) and for decoding from Hidden Markov channels (where the channel has a state and the error it introduces depends on that state). We give the first polynomial-time algorithms that compress and decompress (or encode and decode) at input lengths that are polynomial both in the gap to capacity and in the mixing time of the Markov chain. Prior work achieved capacity only asymptotically in the limit of large lengths, and polynomial bounds were not available with respect to either the gap to capacity or the mixing time. Our results operate in the setting where the source (or the channel) is known. If the source is unknown, then compression at such short lengths would yield effective algorithms for learning parity with noise; thus our results are the first to suggest a separation between the complexity of the problem when the source is known and when it is unknown.
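To illustrate the setting (not the paper's construction), the sketch below samples bits from a Hidden Markov source, i.e., a Markov chain whose hidden state sets the bias of each emitted bit, and then applies the textbook Arıkan polar transform to the output. The transition/emission parameters and function names are illustrative assumptions; the paper's scheme uses a mild variant of this transform together with analysis tied to the chain's mixing time.

```python
import numpy as np

def sample_hidden_markov_source(n, transition, emission, rng, init_state=0):
    """Sample n bits from a Hidden Markov source: a Markov chain over hidden
    states, where the current (unobserved) state sets the bias of each output
    bit. Parameters here are illustrative, not taken from the paper."""
    state = init_state
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        out[i] = rng.random() < emission[state]                   # biased bit from current hidden state
        state = rng.choice(len(transition), p=transition[state])  # hidden state evolves; output reveals it only indirectly
    return out

def polar_transform(u):
    """Standard Arikan polar transform over GF(2) for block length 2^k,
    i.e., multiplication by the Kronecker power of the [[1,0],[1,1]] kernel.
    The paper uses a mild variant; this is only the textbook transform."""
    n = len(u)
    if n == 1:
        return u.copy()
    half = n // 2
    top = polar_transform(u[:half] ^ u[half:])  # combine the two halves
    bot = polar_transform(u[half:])             # pass the second half through
    return np.concatenate([top, bot])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    transition = np.array([[0.9, 0.1],   # sticky hidden states: the chain
                           [0.2, 0.8]])  # mixes slowly, so outputs are correlated
    emission = np.array([0.05, 0.6])     # probability of emitting a 1 in each hidden state
    x = sample_hidden_markov_source(1 << 10, transition, emission, rng)
    y = polar_transform(x)
    print("source bits:     ", x[:16])
    print("transformed bits:", y[:16])
```

In this toy example the empirical entropy of the source is well below one bit per symbol because of the hidden-state correlations; polarization-based compression exploits exactly this gap, and the paper's contribution is making that work at block lengths polynomial in both the gap to capacity and the mixing time.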
