deepTools
computeMatrix time cost
Hi, do you have an estimate of how the run time of computeMatrix scales with the number of regions passed to -R? I have a BED file with 825,510 entries, and I could threshold it down to a smaller size. I would want it to run in a couple of days at most. Thanks! Aaron
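If thresholding turns out to be necessary, a minimal sketch of one way to do it, assuming a standard BED layout with the score in column 5 (the cutoff N and output name are illustrative):

```shell
# Hedged sketch: keep only the top-N highest-scoring BED entries.
# Assumes score is in column 5; N is an example cutoff, not a recommendation.
N=100000
bed=MA0139.1.mm10.bed
if [ -f "$bed" ]; then
    # -k5,5gr: general-numeric, reverse (highest score first) on column 5
    sort -k5,5gr "$bed" | head -n "$N" > "${bed%.bed}.top${N}.bed"
fi
```

The resulting file can be dropped in for -R unchanged, since computeMatrix only needs the region coordinates.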
(base) [aaron@c4-log1 aaron]$ deeptools --version
deeptools 3.5.1
(base) [aaron@c4-log1 aaron]$ python --version
Python 3.8.8
(base) [aaron@c4-log1 aaron]$ cat ../diazlab_code/slurm/mkmtx.sh
#!/bin/bash
#SBATCH --nodes=1
#SBATCH --ntasks=30
#SBATCH --mem=10G
#SBATCH --time=2-06:00:00
#SBATCH --output=%x-%j.out
/diazlab/data2/bin/deeptools/bin/computeMatrix scale-regions -S /diazlab/data2/aaron/SB28_ATRX-KO.bigWig -R /diazlab/data2/aaron/MA0139.1.mm10.bed --outFileName /diazlab/data2/aaron/SB28_ATRX-KO.mtx.gz -b 500 -a 500 -p 30
deepTools parallelizes over contig chunks, so that, rather than the raw region count, is the primary limiter. With 30 cores this will probably still take a while, but I expect a day to be more than enough.
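If you want a concrete estimate before committing a SLURM allocation, one hedged back-of-envelope approach is to time computeMatrix on a small subset of the BED and extrapolate, assuming run time grows roughly linearly with the number of regions (the subset size and timing below are made-up example values, not measurements):

```shell
# Hedged extrapolation sketch: assumes roughly linear scaling in region count.
# Measure t_sub yourself, e.g. with `time computeMatrix ... -R subset.bed`.
n_sub=10000      # regions in the timed subset (example value)
t_sub=120        # seconds the subset run took (example value)
n_full=825510    # regions in the full BED, from the post
est=$(( t_sub * n_full / n_sub ))
echo "estimated seconds for full BED: $est"
```

Because of the chunked parallelization, the true time is often better than a linear extrapolation, so this gives a conservative upper bound for the --time request.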