Potential bug using setDatarecordDuration
Hello! I was experimenting with setDatarecordDuration() in EdfWriter.
The documentation says:
Sets the datarecord duration. The default value is 100000 which is 1 second. ATTENTION: the argument “duration” is expressed in units of 10 microSeconds!
But I tried running the following code:
import pyedflib

path = "test.edf"                  # output file path (not shown in the original snippet)
f = pyedflib.EdfWriter(path, 1)    # writer with a single channel
f.setDatarecordDuration(100000)    # 100000 * 10 us = 1 second, according to the docstring above
f.close()
and got the following error:
Is this a bug or am I misunderstanding how the function works?
Thank you!
Okay, there definitely seems to be a mismatch and some wrong documentation in our code base, see also #238.
The record duration is set here in seconds, not in units of 10 microseconds as the docstring claims.
Hopefully I (or somebody else) will get around to fixing this soon.
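To make the mismatch concrete, here is a small sketch; the 100000 is the value from the snippet above, and the seconds interpretation is the assumption stated in this reply:

duration_arg = 100000                     # the value passed in the snippet above
per_old_docstring = duration_arg * 1e-5   # 1.0 s if the unit really were 10 microseconds
per_implementation = float(duration_arg)  # 100000.0 s if the argument is taken as seconds
print(per_old_docstring, per_implementation)  # 1.0 100000.0 -- the latter is far above the documented 60-second maximum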
Thanks for the quick response @skjerns! Is this something I could help with? Also, just curious: does that mean that the record duration does not support values less than one second? Or does it support values from 0.001 to 60 like it says later in the documentation?
Mhh, I think this might be a bit more complicated, so unless you feel confident diving into the code base, there's not much to do right now. But feel free to look into it!
It supports values from 0.01 to 60 afaik. Do you know what the record_duration parameter does? It simply sets the block size of the EDF file, so it's just a technical term for EDF internals.
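As a rough illustration of what that block size means (the 256 Hz and 0.5 s figures are made-up example numbers; the relation itself is just the EDF layout, where each data record must hold a whole number of samples per channel):

sample_frequency = 256   # Hz, hypothetical channel
record_duration = 0.5    # seconds per data record (the "block" size)
samples_per_record = sample_frequency * record_duration
print(samples_per_record)               # 128.0 samples of this channel stored per data record
print(samples_per_record.is_integer())  # True; a duration like 0.3 s would break this (76.8 samples)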
Superseded by #268
My recommendation is not to use setDatarecordDuration, but if you need to, you can call it with a value in seconds, which is now also reflected in the docstring; a short usage sketch follows below it.
Sets the datarecord duration. The default value is 1 second.
The datarecord duration must be in the range 0.00001 to 60 seconds.
Usually, the datarecord duration is calculated automatically to
ensure that all sample frequencies are representable, nevertheless,
you can overwrite the datarecord duration manually. This can, however,
lead to unexpected side-effects in the sample frequency calculations.
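A minimal usage sketch under that assumption, i.e. a recent pyedflib where the argument is interpreted in seconds; the file name, the 0.5-second duration, and the channel settings are made-up example values, and setSamplefrequency is assumed to take the rate in Hz:

import numpy as np
import pyedflib

path = "example.edf"                # hypothetical output file
f = pyedflib.EdfWriter(path, 1)     # writer with one channel
f.setDatarecordDuration(0.5)        # value in seconds, inside the 0.00001..60 range
f.setLabel(0, "ch0")
f.setSamplefrequency(0, 256)        # 256 Hz -> 128 samples per 0.5 s data record
f.setPhysicalMaximum(0, 100.0)
f.setPhysicalMinimum(0, -100.0)
f.setDigitalMaximum(0, 32767)
f.setDigitalMinimum(0, -32768)
f.writeSamples([np.zeros(256)])     # one second of dummy data for the single channel
f.close()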