
issue about small test set

Open · Mr-CD opened this issue 4 years ago · 1 comment

Thank you all for providing this pyEntropy project. I am trying to run sample_entropy as follows:

import numpy as np
import pandas as pd
from pyentrp import entropy as ent

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
sample_entropy = ent.sample_entropy(ts, 3, 0.2)    # situation 1: m+1=3, tolerance=0.2
sample_entropy1 = ent.sample_entropy(ts, 3, 0.21)  # situation 2: m+1=3, tolerance=0.21
print(sample_entropy)
print(sample_entropy1)

The result is:

[1.38629436 0.15415068 0.18232156]  # correct
[ 1.13497993 -0. -0. ]  # no result, which is a problem

However, the result I calculated by hand is 0.18232156 and 0.20067069 (situations 1 and 2). The algorithm I applied is the one described here: https://en.wikipedia.org/wiki/Sample_entropy, which is your reference too.

In addition, I ran the same test set through the code provided on the Wikipedia page; the results are 0.17185025692665928 and 0.10536051565782628 (situations 1 and 2).
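
For reference, a minimal brute-force sketch of that Wikipedia-style calculation might look like the following. The helper name sampen_wiki is made up for illustration, and implementations differ on how many length-m templates they compare and whether the distance check is strict or inclusive, so a short series like this one is sensitive to those conventions:

import numpy as np

def sampen_wiki(ts, m, r):
    # Brute-force SampEn = -ln(A / B) following the Wikipedia definition:
    # A counts matching template pairs of length m + 1, B of length m,
    # using the Chebyshev distance with tolerance r and excluding self-matches.
    ts = np.asarray(ts, dtype=float)
    n = len(ts)

    def count_pairs(length):
        # All overlapping templates of the given length.
        templates = [ts[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if np.max(np.abs(templates[i] - templates[j])) <= r:
                    count += 1
        return count

    return -np.log(count_pairs(m + 1) / count_pairs(m))

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
print(sampen_wiki(ts, 2, 0.2))   # m + 1 = 3, tolerance 0.2
print(sampen_wiki(ts, 2, 0.21))  # m + 1 = 3, tolerance 0.21

Comparing such a reference against ent.sample_entropy over a few tolerance values would show exactly where the two calculations diverge.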

Mr-CD · Dec 24 '20 07:12

There might be differences in calculation between the wiki and the implementation here. I've provided implementation details in the comments together with references. Please check them.

nikdon · Jun 24 '21 21:06

Seems to be fixed with #28

from pyentrp import entropy as ent

ts = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
se1 = ent.sample_entropy(ts, 3, 0.2)   # situation 1: m+1=3, tolerance=0.2
se2 = ent.sample_entropy(ts, 3, 0.21)  # situation 2: m+1=3, tolerance=0.21
print(se1)
print(se2)

>>> [1.38629436 0.15415068 0.18232156]
>>> [1.38629436 0.15415068 0.18232156]

nikdon · Jun 17 '23 08:06