bigstatsr
Question about memory
I was wondering if you could give me a sense of whether the memory use of a `big_spLinReg` run I am doing seems reasonable, and if not, whether there is a way to reduce it. My dataset is 300 individuals x 23,626,816 sites stored as a double FBM, and to run efficiently with many cores I need ~600 GB of memory. Does that seem right to you? I just want to know whether I am doing something wrong, or whether there are ways to reduce memory usage without sacrificing efficiency.
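
For reference, here is a minimal sketch of roughly what I am running; the file paths, the response vector `y`, and the `ncores` value are placeholders rather than my actual script:

```r
## Illustrative sketch only -- file paths, `y`, and `ncores` are placeholders.
library(bigstatsr)

# Attach the existing on-disk FBM (300 x 23,626,816 doubles,
# i.e. 300 * 23626816 * 8 bytes, roughly a 57 GB backing file).
X <- big_attach("genotypes.rds")   # placeholder path to the FBM's .rds file

y <- readRDS("phenotype.rds")      # placeholder: numeric response of length 300

# Cross-validated penalized linear regression on the FBM
mod <- big_spLinReg(X, y, ncores = 16)   # memory use grows with ncores
```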
Thanks!