
Negative Approximate Entropy

JAC28 opened this issue on Jun 28 '24 · 0 comments

Although the approximate entropy is always described as a non-negative quantity, the implemented method can return negative values in extreme cases, as the attached Python example shows.

import numpy as np
import EntropyHub as EH

# Set the seed to ensure reproducibility
np.random.seed(42)

# Generate an array of 100 random numbers
rnd_data = np.random.rand(100)

# ApEn of the full series (N = 100)
print(EH.ApEn(rnd_data, m=2, r=0.2*np.std(rnd_data))[0][-1])

# ApEn of only the first 12 samples (N = 12)
print(EH.ApEn(rnd_data[:12], m=2, r=0.2*np.std(rnd_data[:12]))[0][-1])

The result is:

0.6736246486917667
-0.09531017980432521

If I understand the algorithm correctly, this behaviour is mathematically correct and occurs if there are only self-matches: for $N=12$, for example, the elements $C_i^m$ are each $1/(N-m+1)$ and the $C_i^{m+1}$ are each $1/(N-(m+1)+1) = 1/(N-m)$. The corresponding logarithms are $-2.39789527$ and $-2.30258509$, making the difference $\phi^m - \phi^{m+1}$ negative. The implementation in the antropy package leads to the same results.
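For reference, a minimal hand computation of this self-match-only case ($N=12$, $m=2$); the variable names phi_m and phi_m1 are my own:

import numpy as np

N, m = 12, 2
# With only self-matches, every C_i^m equals 1/(N-m+1)
# and every C_i^{m+1} equals 1/(N-m).
phi_m = np.log(1 / (N - m + 1))   # -2.39789527...
phi_m1 = np.log(1 / (N - m))      # -2.30258509...
print(phi_m - phi_m1)             # -0.09531017..., the negative ApEn above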

This is a very niche edge case and, for the example data above, it no longer occurs from $N=15$ onward. Nevertheless, a note or warning to the user would be useful in these cases.
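As a sketch of what such a warning could look like, here is a hypothetical wrapper around EH.ApEn (the name apen_checked and the warning text are my own, not part of EntropyHub):

import warnings
import numpy as np
import EntropyHub as EH

def apen_checked(sig, m=2, r=None):
    # Default tolerance follows the convention used above: r = 0.2 * SD
    if r is None:
        r = 0.2 * np.std(sig)
    ap, _ = EH.ApEn(sig, m=m, r=r)
    if ap[-1] < 0:
        warnings.warn(
            "ApEn is negative: the series is likely too short for the "
            "chosen m and r (only self-matches found), so the estimate "
            "is unreliable."
        )
    return ap[-1]

np.random.seed(42)
rnd_data = np.random.rand(100)
print(apen_checked(rnd_data[:12]))  # reproduces the negative value and warns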
