AtanasAtanasovIpsos

Results: 5 comments by AtanasAtanasovIpsos

Thanks for the reply. I will try playing with write_sav and see if I can produce such a file.

Here is an example of code that generates a file of about 84.6 MB which cannot be read back due to the same error.

```
import random
import pandas
...
```
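Since the snippet is cut off above, here is a hedged reconstruction of what such a generator might look like; the row count, column count, and string length are assumptions chosen only to push the output toward the reported size, not the original values:

```
import random
import string

import pandas as pd
import pyreadstat

# Assumed shape: wide string columns and enough rows to produce
# a .sav file in the tens of megabytes.
N_ROWS, N_COLS = 50_000, 100

def random_string(length=20):
    return ''.join(random.choices(string.ascii_letters, k=length))

df = pd.DataFrame(
    {f'var{i}': [random_string() for _ in range(N_ROWS)] for i in range(N_COLS)}
)

pyreadstat.write_sav(df, 'DataFile.sav')

# Reading the file back is where the reported error occurs.
df2, meta = pyreadstat.read_sav('DataFile.sav')
```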

The second option would be the best one for me. Or something like:

```
df, meta = pyreadstat.read_sav('DataFile.sav', metadataonly=True, safetylimits=False)
```
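For reference, a minimal sketch of the metadata-only call as it exists in pyreadstat today; the `safetylimits` argument above is the proposed addition, not a current parameter:

```
import pyreadstat

# metadataonly=True skips the data rows, so only variable names,
# labels, formats, and the header row count are loaded.
df, meta = pyreadstat.read_sav('DataFile.sav', metadataonly=True)

print(meta.column_names)   # variable names from the file
print(meta.number_rows)    # row count reported by the header
```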

You are right about the bigger files. Now that I think about it more, removing the limit also seems like a good solution :)

+1. Is there a timeline for when we can expect this feature?