Forest Learning Universal Coding

08/01/2018
by Joe Suzuki, et al.

This paper considers structure learning from data with n samples of p variables, assuming that the structure is a forest, using the Chow-Liu algorithm. Specifically, for incomplete data, we construct two model selection algorithms that complete in O(p^2) steps: one obtains the forest with the maximum posterior probability given the data, and the other obtains a forest that converges to the true one as n grows. We show that the two forests generally differ when some values are missing. Additionally, we present estimates on benchmark data sets to demonstrate that both algorithms work in realistic situations. Moreover, we derive the conditional entropy given that no value is missing, and we evaluate the per-sample expected redundancy of the universal coding of incomplete data in terms of the number of non-missing samples.
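The paper's own posterior-based and consistency-based scores are not reproduced here, but the underlying Chow-Liu forest construction can be sketched: score every pair of variables by mutual information (O(p^2) pairs) and run Kruskal's algorithm, keeping only edges whose score clears a threshold so the result is a forest rather than a spanning tree. The sketch below is a minimal Python illustration on complete, discrete data; the function names, the plug-in mutual-information estimate, and the `threshold` parameter are assumptions for illustration, not the paper's method for incomplete data.

```python
import numpy as np
from itertools import combinations

def empirical_mutual_information(x, y):
    """Plug-in estimate of I(X; Y) from two discrete sample vectors."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            p_xy = np.mean((x == a) & (y == b))
            if p_xy > 0:
                p_x = np.mean(x == a)
                p_y = np.mean(y == b)
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

def chow_liu_forest(data, threshold=0.0):
    """Maximum-weight spanning forest over pairwise mutual information.

    data: (n, p) array of discrete observations (complete data assumed).
    threshold: hypothetical cutoff; dropping low-scoring edges is what
               yields a forest rather than a spanning tree.
    Returns a list of edges (i, j).
    """
    n, p = data.shape
    # Score every pair of variables; O(p^2) pairs in total.
    scored = []
    for i, j in combinations(range(p), 2):
        w = empirical_mutual_information(data[:, i], data[:, j])
        if w > threshold:
            scored.append((w, i, j))
    scored.sort(reverse=True)  # Kruskal: heaviest edges first

    # Union-find with path halving to avoid cycles.
    parent = list(range(p))
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    forest = []
    for w, i, j in scored:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            forest.append((i, j))
    return forest
```

With a plug-in estimate the empirical mutual information is almost always strictly positive for finite n, so in practice the threshold (or, in the paper's setting, a posterior or consistency criterion) is what determines which edges are pruned to leave a forest.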
