Estimation of Entropy in Constant Space with Improved Sample Complexity

05/19/2022
by Maryam Aliakbarpour, et al.

Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution 𝒟 over an alphabet of size k up to ±ϵ additive error by streaming over (k/ϵ^3) · polylog(1/ϵ) i.i.d. samples and using only O(1) words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to (k/ϵ^2) · polylog(1/ϵ). We conjecture that this is optimal up to polylog(1/ϵ) factors.
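To make the constant-memory setting concrete, below is a minimal, deliberately naive sketch (not the paper's scheme, and without the bias corrections needed to achieve the stated sample complexities) of a streaming entropy estimator built on the identity H(𝒟) = E_{x∼𝒟}[ln(1/p(x))]: each round reads one sample x from the stream and estimates p(x) by counting recurrences of x in a short follow-up window, keeping only O(1) words at any time. The distribution, round count, and window size are illustrative choices, not parameters from the paper.

import math
import random

def sample_stream(probs):
    """Endless i.i.d. samples from a known distribution; stands in for
    the paper's unknown source 𝒟 (the estimator never sees probs)."""
    symbols = list(range(len(probs)))
    while True:
        yield random.choices(symbols, weights=probs)[0]

def streaming_entropy_estimate(stream, num_rounds=1000, window=1000):
    """Naive constant-memory estimate of H(𝒟) in nats.

    Each round reads one sample x and estimates p(x) from the next
    `window` stream samples; only x, a hit counter, and a running sum
    are ever stored. (Illustrative only: the actual schemes discussed
    in the paper use careful bias corrections to reach their stated
    sample complexities.)
    """
    total = 0.0
    for _ in range(num_rounds):
        x = next(stream)          # one fresh sample x ~ 𝒟
        hits = 1                  # add-one smoothing keeps p_hat > 0
        for _ in range(window):
            if next(stream) == x:
                hits += 1
        p_hat = hits / (window + 1)
        total += -math.log(p_hat)  # single-round estimate of ln(1/p(x))
    return total / num_rounds

# Sanity check: the uniform distribution over k = 8 symbols has
# entropy ln 8 ≈ 2.079 nats.
est = streaming_entropy_estimate(sample_stream([1 / 8] * 8))
print(f"estimate ≈ {est:.3f}, truth = {math.log(8):.3f}")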

