Information Theory for Signal Analysis (ITSA)
This toolbox illustrates the use of auto-mutual information (the mutual information between subsets of the same signal) and of the entropy rate as powerful tools to assess refined dependencies of any order in the temporal dynamics of a signal.
It shows how the two-point auto-mutual information and the entropy rate unveil information conveyed by higher-order statistics, and thus capture details of temporal dynamics that are overlooked by the (two-point) correlation function. Notably, it shows how the auto-mutual information (and the entropy rate) can discriminate between different non-Gaussian processes that have exactly the same marginal distribution and covariance function.
Further, this toolbox proposes a multi-point generalization of the auto-correlation that assesses higher-order statistics and unveils the global dependence structure.
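To make the two-point auto-mutual information concrete, here is a minimal sketch in Python (for illustration only; the toolbox itself is in MATLAB). It estimates I(x(t); x(t+tau)) with a plain histogram estimator; the bin count is an arbitrary assumption, and this is not the nearest-neighbor estimator used by the toolbox.

```python
import numpy as np

def auto_mutual_information(x, tau, bins=32):
    """Histogram estimate (in nats) of the mutual information
    between x(t) and x(t + tau)."""
    a, b = x[:-tau], x[tau:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()          # joint probabilities
    px = pxy.sum(axis=1)           # marginal of x(t)
    py = pxy.sum(axis=0)           # marginal of x(t + tau)
    nz = pxy > 0                   # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

rng = np.random.default_rng(0)
white = rng.standard_normal(100_000)
# For white noise, samples at different times are independent,
# so the auto-mutual information at any nonzero lag is close to zero.
print(auto_mutual_information(white, tau=1))
```

For a temporally correlated process (e.g. an AR(1) process), the same estimator returns a clearly positive value at lag 1, which is the kind of dependence the toolbox quantifies.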
This methodology has been developed at ENS de Lyon during the PhD of Carlos Granero Belinchon under the supervision of Stéphane G. Roux and Nicolas B. Garnier, and has been described and used in the following articles:
- C. Granero-Belinchon, S.G. Roux, P. Abry and N. B. Garnier. "Probing high order dependencies with information theory." Submitted to IEEE Signal Processing, October 2018.
- C. Granero-Belinchon, S.G. Roux and N. B. Garnier. "Kullback-Leibler divergence measure of intermittency: Application to turbulence." Phys. Rev. E, vol. 97 (1), 013107, 2018.
- C. Granero-Belinchon, S.G. Roux and N. B. Garnier. "Scaling of information in turbulence." EPL, vol. 115 (5), 58003, 2016.
- C. Granero-Belinchon, S.G. Roux, P. Abry, M. Doret and N. B. Garnier. "Information Theory to Probe Intrapartum Fetal Heart Rate Dynamics." Entropy, vol. 19 (12), 640, 2017.
This Matlab toolbox uses the function knnsearch, which requires the Matlab "statistics" toolbox to be installed. This toolbox is not suitable for analyzing large datasets; it is provided as a proof of concept.
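The nearest-neighbor searches (the role played by knnsearch) typically feed a Kozachenko-Leonenko entropy estimator. The following Python sketch shows that estimator under the assumption that this is the kind of estimator the toolbox builds on; it is an illustration, not the toolbox's actual implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=1):
    """Kozachenko-Leonenko k-nearest-neighbor estimate of the
    differential (Shannon) entropy, in nats."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor
    # (query k + 1 neighbors because each point is its own closest match).
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, k]
    # Log-volume of the d-dimensional unit Euclidean ball.
    log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

rng = np.random.default_rng(0)
x = rng.standard_normal(20_000)
# For a standard Gaussian the exact entropy is 0.5*log(2*pi*e) ~ 1.419 nats.
print(knn_entropy(x))
```

The estimator converges to the true differential entropy as the sample size grows, but the k-d tree query is what makes it costly on large datasets, consistent with the proof-of-concept caveat above.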
Content.
The toolbox contains the following scripts:
- analyse_whitenoise.m : computes the entropy rate and the auto-mutual information at different lags for Gaussian and log-normal white noise.
- analyse_invariantnoise.m : computes the entropy rate and the auto-mutual information at different lags for two log-normal processes with the same marginal distribution and the same autocovariance.
the following functions:
- compute_entropy.m : computes the Shannon entropy of a signal for different embedding dimensions and time lags.
- compute_AMI.m : computes the auto-mutual information of a signal for different embedding dimensions and time lags.
- embed.m : performs the time-delay embedding of a one-dimensional signal.
and a directory:
- data : sample realizations of the processes used by the scripts.
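The time-delay embedding performed by embed.m can be sketched as follows (a Python illustration; the exact argument order and conventions of embed.m are assumptions). Each row of the output stacks m samples of the signal separated by tau, which is the vector on which the entropy and auto-mutual information estimates operate.

```python
import numpy as np

def embed(x, m, tau):
    """Time-delay embedding of a 1-D signal x: row i of the output is
    (x[i], x[i + tau], ..., x[i + (m - 1) * tau])."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[j * tau : j * tau + n] for j in range(m)])

print(embed(np.arange(10), m=3, tau=2))
# shape (6, 3); first row: [0 2 4], last row: [5 7 9]
```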