Causal inference in time series in terms of Rényi transfer entropy
Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of an information-theoretic concept known as the Rényi information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Rényi transfer entropy. We show that by choosing the Rényi α parameter appropriately, we can control the information that is transferred only between selected parts of the underlying distributions. This, in turn, provides a particularly potent tool for quantifying causal interdependencies in time series, where knowledge of "black swan" events such as spikes or sudden jumps is of key importance. In this connection, we first prove that, for Gaussian variables, Granger causality and Rényi transfer entropy are entirely equivalent. Moreover, we partially extend this result to heavy-tailed α-Gaussian variables. These results allow us to establish a connection between autoregressive and Rényi-entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employ the Leonenko et al. entropy estimator and analyze the Rényi information flow between bivariate time series generated from two unidirectionally coupled Rössler systems. Notably, we find that Rényi transfer entropy not only allowed us to detect the threshold of synchronization but also provided non-trivial insight into the structure of the transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Rényi transfer entropy we could reliably infer the direction of coupling, and hence causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rössler systems were coupled but had not yet entered synchronization.
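To make the idea of direction-of-coupling inference concrete, the sketch below estimates a plug-in (histogram-binned) Rényi transfer entropy between two unidirectionally coupled chaotic maps. This is not the paper's method: coupled logistic maps stand in for the Rössler systems, a simple binned estimator stands in for the Leonenko et al. k-nearest-neighbour estimator, and the Shannon-style entropy-combination decomposition is used with Rényi entropies substituted, which is only an approximate form. All parameters (α, coupling strength, bin count, history length of one step) are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(samples, alpha, bins=8):
    """Plug-in Rényi entropy H_alpha = log(sum p^alpha) / (1 - alpha)
    from a d-dimensional histogram of the sample columns."""
    hist, _ = np.histogramdd(samples, bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_transfer_entropy(source, target, alpha, bins=8):
    """Approximate T_{source -> target} with one-step histories, using the
    Shannon-style decomposition with Rényi entropies plugged in:
    T = H(T_{t+1}, T_t) - H(T_t) + H(T_t, S_t) - H(T_{t+1}, T_t, S_t)."""
    t_next, t_now, s_now = target[1:], target[:-1], source[:-1]
    H = lambda *cols: renyi_entropy(np.column_stack(cols), alpha, bins)
    return H(t_next, t_now) - H(t_now) + H(t_now, s_now) - H(t_next, t_now, s_now)

# Two unidirectionally coupled logistic maps: x drives y with strength c.
rng = np.random.default_rng(0)
n, c = 20000, 0.4
f = lambda u: 4.0 * u * (1.0 - u)
x, y = np.empty(n), np.empty(n)
x[0], y[0] = rng.uniform(0.1, 0.9, 2)
for i in range(n - 1):
    x[i + 1] = f(x[i])
    y[i + 1] = f((1.0 - c) * y[i] + c * x[i])
x, y = x[100:], y[100:]  # discard transient

alpha = 0.8  # illustrative Rényi order, emphasising the bulk of the distribution
t_xy = renyi_transfer_entropy(x, y, alpha)  # driver -> driven
t_yx = renyi_transfer_entropy(y, x, alpha)  # driven -> driver
print(f"T_x->y = {t_xy:.3f}, T_y->x = {t_yx:.3f}")
```

Since the coupling is unidirectional (x drives y), the estimated flow in the driving direction should dominate, which is how the direction of coupling, and hence a causal arrow, is read off in practice.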