svtyper
memory error
Hi, I've been experiencing a memory error in a couple of samples. The traceback points to:
File "/frazer01/software/speedseq-20170419/bin/svtyper", line 1806, in
I have tried the --max_reads argument (set to 2000) to limit the reads sampled in high-depth regions, but I noticed this left many sites ungenotyped (./.). Any ideas? One of the samples has a completely normal insert size distribution; the other is slightly skewed low (attached). How much memory should these jobs be consuming?
Thanks!
We've been using SVTyper on 30-50x human WGS with under 10 GB of memory (with no limit on --max_reads). However, we exclude certain regions of the genome from LUMPY in which we frequently see excessive read pileups.
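For reference, the exclude-regions approach looks roughly like this with speedseq. This is a sketch, not a definitive command: the `-x` exclude-BED flag and the file names here are assumptions based on typical speedseq sv usage, so check your installation's help text.

```shell
# Sketch: run speedseq sv with an exclude BED so LUMPY skips
# regions with excessive read pileups (all paths are illustrative).
speedseq sv \
    -B sample.bam \
    -S sample.splitters.bam \
    -D sample.discordants.bam \
    -R reference.fa \
    -x exclude_regions.bed \
    -o sample_sv
```

Excluding high-depth pileup regions up front reduces both spurious calls and the number of reads SVTyper has to hold in memory at those sites.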
You are correct that the insert size distribution is slightly skewed, but it does not look severe enough to cause major problems. How much RAM is this process using, and can you identify a particular variant that is crashing SVTyper?
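As a quick sanity check on whether an insert size distribution is meaningfully skewed, you can compare its mean and median. This is a minimal sketch with made-up insert sizes; in practice you would feed in values extracted from the BAM (e.g. via `samtools stats`).

```python
import statistics

# Hypothetical insert sizes sampled from a library (illustrative only).
insert_sizes = [310, 295, 305, 500, 280, 315, 290, 300, 450, 285]

mean = statistics.mean(insert_sizes)
median = statistics.median(insert_sizes)
stdev = statistics.stdev(insert_sizes)

# Pearson's second skewness coefficient: positive means a right tail,
# negative means the distribution is skewed low.
skew = 3 * (mean - median) / stdev

print(f"mean={mean:.1f} median={median} stdev={stdev:.1f} skew={skew:.2f}")
```

A skew coefficient near zero suggests the library is symmetric enough that SVTyper's insert-size model should behave normally.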
Thanks for the reply. I think the issue occurred on a heavily used cluster node: I had only been setting aside 4 GB of memory per job, which was generally successful. Making 32 GB available fixed the problem (although I'm sure it didn't use half of that).
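For anyone hitting the same thing, the fix amounts to raising the per-job memory reservation. A minimal sketch, assuming an SGE-style scheduler (the resource name `h_vmem` and the script name are assumptions; adjust for your cluster):

```shell
# Sketch: reserve 32 GB of virtual memory for the svtyper job
# so a busy node can't starve it (resource name is site-specific).
qsub -l h_vmem=32G run_svtyper.sh
```

On SLURM the equivalent would be `sbatch --mem=32G run_svtyper.sh`.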