cnv_facets
Cannot allocate memory error
Hi @dariober, I've been trying to troubleshoot cnv_facets.R (thank you, btw) but I always get stuck after the pileup files have been generated. I've posted on Biostars and people suggested `memory.limit()` (which doesn't work on Linux; I'm running this on an HPC) or `ulimit` (which doesn't work either). Here is a link to my post on Biostars: https://www.biostars.org/p/444452/#444462 (a similar issue, although there I was trying the original source from the MSKCC page).
I have 16 Gb x 16 Gb of RAM on each node.
Here is my command:

```bash
Rscript $FACETS \
    -n $ALIGNED_BAM_FILE_NORMAL \
    -t $ALIGNED_BAM_FILE_TUMOR \
    -vcf $DBSNP \
    -o $OUTDIR/$PATIENT_ID \
    --snp-nprocs 8 \
    -cv 25 400
```
Here is an example error log:

```
Finished in 408.650000 seconds.
Finished in 405.490000 seconds.
Finished in 609.840000 seconds.
Finished in 262.830000 seconds.
Finished in 543.280000 seconds.
Finished in 487.650000 seconds.
[2020-06-19 23:30:47] Loading file /exports/igmm/eddie/WGS/variants/cnv/facets/E13.csv.gz...
[2020-06-19 23:38:04] Plotting histogram of coverage...
Error: cannot allocate vector of size 4.1 Gb
Execution halted
```
Any suggestions?
This seems to be related to R rather than to facets. Here are some thoughts...

- Any chance you are running a 32-bit version of R? Could you send the output of `sessionInfo()` (making sure you use the same version of R that you use for the facets jobs)? There is a quick check in the first sketch after this list.

- When you say you have 16 Gb x 16 Gb of RAM, do you mean you have a node with 256 Gb? On Biostars you say you submit the job with 32 Gb, but are you sure you actually have that much? I'm wondering whether requesting more RAM than you can afford makes the scheduler fall back to some default memory allocation. After submitting the job, check that you have as much memory as you think (second sketch below).

- For the time being, you could skip the generation of the coverage plots with the `--no-cov-plot` option (shown below), but I suspect you will run into memory limits later in the analysis.

- How many SNPs do you have in your input VCF? It may be that you don't need that many. (This won't fix the issue though... see the last sketch below.)
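A quick way to check the first point, assuming the `Rscript` on your PATH is the same R the facets jobs use:

```bash
# Print the R session details; the "Platform:" line reports the
# architecture, e.g. "x86_64-pc-linux-gnu (64-bit)".
Rscript -e 'sessionInfo()'
```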
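For the memory check, you could drop a couple of diagnostic commands into the job script itself, just before the `Rscript` call. A sketch (exactly what limits are reported depends on your scheduler):

```bash
free -h                      # total and available memory on the node
ulimit -a | grep -i memory   # per-process memory limits applied to the job
```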
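Skipping the coverage plot is just your same command with the extra flag, i.e.:

```bash
Rscript $FACETS \
    -n $ALIGNED_BAM_FILE_NORMAL \
    -t $ALIGNED_BAM_FILE_TUMOR \
    -vcf $DBSNP \
    -o $OUTDIR/$PATIENT_ID \
    --snp-nprocs 8 \
    -cv 25 400 \
    --no-cov-plot
```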
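To count the SNPs, and to thin the VCF if it turns out to be very large, something like this could work (a sketch assuming a bgzipped VCF and that bcftools is available; the output filename is arbitrary):

```bash
# Count variant records in the dbSNP VCF:
zgrep -c -v '^#' "$DBSNP"

# Keep only biallelic SNPs to reduce the input size (adjust to taste):
bcftools view -v snps -m2 -M2 "$DBSNP" -Oz -o dbsnp.slim.vcf.gz
bcftools index -t dbsnp.slim.vcf.gz
```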