velocyto.R
show.velocity.on.embedding.cor error: libgomp: Out of memory allocating
Dear all,
I am trying to run velocity analysis on my sample, following the seurat-wrappers/velocyto.R tutorial.
Everything goes smoothly until the step where I plot the velocity vectors:
show.velocity.on.embedding.cor(
  emb = Embeddings(object = bm, reduction = "umap"),
  vel = Tool(object = bm, slot = "RunVelocity"),
  n = 200, scale = "sqrt",
  cell.colors = ac(x = cell.colors, alpha = 0.5),
  cex = 0.8, arrow.scale = 3, show.grid.flow = TRUE,
  min.grid.cell.mass = 0.5, grid.n = 40, arrow.lwd = 1,
  do.par = FALSE, cell.border.alpha = 0.1
)
The error I get is: delta projections ... sqrt libgomp: Out of memory allocating 463856469312 bytes
Can anyone help me solve this?
Many thanks.
Best, Moheb
p.s. I am running R in the Windows Subsystem for Linux; R version 3.6.3 (2020-02-29) -- "Holding the Windsock", Platform: x86_64-pc-linux-gnu (64-bit).
How much memory do you have? The calculation is requesting ~432 GB of memory. Either move the calculation to a bigger machine, or filter/subsample your dataset. An example of filtering can be seen here: http://pklab.med.harvard.edu/velocyto/notebooks/R/DG1.nb.html
emat <- emat[, colSums(emat) >= 1e3]
Subsampling could be performed on your cell names:
cell.names <- sapply(count.matrices, colnames) # or rownames, depending on your format
subsamples <- sample(cell.names, 3e4) # this extracts 30k cells randomly
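To make the filter-then-subsample idea concrete, here is a minimal, self-contained sketch. The toy matrices, the depth cutoff of 10, and the subsample size of 30 are all illustrative stand-ins (on real data you would use your own `emat`/`nmat`, the tutorial's 1e3 cutoff, and something like 3e4 cells):

```r
set.seed(42)  # make the random subsample reproducible

# Toy spliced/unspliced count matrices standing in for your emat/nmat
# (genes in rows, cells in columns)
emat <- matrix(rpois(5 * 100, 4), nrow = 5,
               dimnames = list(paste0("g", 1:5), paste0("cell", 1:100)))
nmat <- emat

# 1. Filter out low-depth cells (cutoff of 10 is illustrative;
#    the linked tutorial uses 1e3 on real data)
emat <- emat[, colSums(emat) >= 10]
nmat <- nmat[, colnames(emat)]  # keep the two matrices aligned

# 2. Randomly keep at most 30 of the remaining cells
#    (use e.g. 3e4 on a real dataset)
keep <- sample(colnames(emat), min(30, ncol(emat)))
emat <- emat[, keep]
nmat <- nmat[, keep]

ncol(emat)  # number of cells retained
```

Applying the same subset of cell names to both matrices matters: the downstream velocity estimation assumes the spliced and unspliced matrices share identical cell columns.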
@rrydbirk, indeed my computer's memory was the limitation. I have moved to a machine with more memory and my analysis worked.
Thanks a lot for your answer; the subsampling is indeed a very nice idea.