Efficient Entropy Estimation for Stationary Time Series

04/11/2019
by   Alexander L Young, et al.

Entropy estimation, due in part to its connection with mutual information, has seen considerable use in the study of time series data, including causality detection and information flow. In many cases, the entropy is estimated with k-nearest-neighbor (Kozachenko-Leonenko) methods. However, analytic results on this estimator are limited to independent data. In this article, we prove rigorous bounds on the rate of decay of the bias as a function of the number of samples, N, assuming the samples are drawn from a stationary process satisfying a suitable mixing condition. Numerical examples are presented that demonstrate the efficiency of the estimator when applied to a Markov process with a Gaussian stationary density, and the results support the asymptotic rates derived in the theoretical work.
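For context, the sketch below shows a minimal Kozachenko-Leonenko k-nearest-neighbor entropy estimate and checks it against the closed-form entropy of the Gaussian stationary density of an AR(1) process. It assumes a Euclidean metric and SciPy's cKDTree; the function name, the choice k=4, and the AR(1) parameters are illustrative and not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats) for samples x of shape (N, d)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors because each point's nearest neighbor is itself.
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, -1]  # distance from each sample to its k-th true neighbor
    # Log-volume of the d-dimensional unit ball.
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

# Illustrative example (hypothetical parameters): AR(1) process with
# Gaussian stationary density, compared to the exact Gaussian entropy.
rng = np.random.default_rng(0)
phi, n = 0.5, 5000
z = np.empty(n)
z[0] = rng.normal(scale=1.0 / np.sqrt(1 - phi**2))  # start in stationarity
for t in range(1, n):
    z[t] = phi * z[t - 1] + rng.normal()
sigma2 = 1.0 / (1 - phi**2)                      # stationary variance
true_h = 0.5 * np.log(2 * np.pi * np.e * sigma2)  # exact Gaussian entropy
print(kl_entropy(z[:, None], k=4), true_h)
```

With dependent samples such as this AR(1) chain, the estimate still converges to the entropy of the stationary density; the paper's contribution is a rigorous bound on how fast the bias decays in N under a mixing condition.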
