Failed to allocate sufficient memory. Please refer to the manual for instructions on memory usage.
diamond version 2.0.8

The command:

```
diamond blastx -b4 -c1 -d $db -q $s -a $base -t temp/
```

```
diamond v2.0.8.146 (C) Max Planck Society for the Advancement of Science
Documentation, support and updates available at http://www.diamondsearch.org

#CPU threads: 32
Scoring parameters: (Matrix=BLOSUM62 Lambda=0.267 K=0.041 Penalties=11/1)
Temporary directory: temp/
#Target sequences to report alignments for: 25
Opening the database...  [0.051s]
Database: /home/ls/tirza/DB/CO-ARBitrator/CO-ARBitrator_2.0_aa_reformatted_diamond.dmnd (type: Diamond database, sequences: 1286433, letters: 272826736)
Block size = 4000000000
Opening the input file...  [0.132s]
Opening the output file...  [0s]
Loading query sequences...  [0.745s]
Masking queries...  [0.494s]
Building query seed set...  [0.219s]
The host system is detected to have 270 GB of RAM. It is recommended to increase the block size for better performance using these parameters: -b8 -c1
Algorithm: Double-indexed
Building query histograms...  [0.095s]
Allocating buffers...  [0s]
Loading reference sequences...  [0.436s]
Masking reference...  [0.969s]
Initializing temporary storage...  [0s]
Building reference histograms...  [0.404s]
Allocating buffers...  [0s]
Processing query block 1, reference block 1/1, shape 1/2.
Building reference seed array...  [0.338s]
Building query seed array...  [0.068s]
Computing hash join...  [0.161s]
Building seed filter...  [0.005s]
Searching alignments...  [16432.3s]
Processing query block 1, reference block 1/1, shape 2/2.
Building reference seed array...  [0.353s]
Building query seed array...  [0.073s]
Computing hash join...  [0.151s]
Building seed filter...  [0.003s]
Searching alignments...  [22759.9s]
Deallocating buffers...  [0.009s]
Clearing query masking...  [0.019s]
Computing alignments...  [0.078s]
Failed to allocate sufficient memory. Please refer to the manual for instructions on memory usage.
```
I also tried with -b8 and -c1, and it failed with the same insufficient-memory error.
What am I doing wrong?
Thanks! Tirza
Please try a lower block size, for example -b2.
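For concreteness, a sketch of the adjusted invocation, assuming the same database, query, and output variables as in the original command:

```bash
# Same run as above, with the block size lowered from 4 to 2
# (the -b value is in billions of sequence letters per block,
# matching "Block size = 4000000000" for -b4 in the log above)
diamond blastx -b2 -c1 -d $db -q $s -a $base -t temp/
```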
Still same error:

```
#CPU threads: 32
Scoring parameters: (Matrix=BLOSUM62 Lambda=0.267 K=0.041 Penalties=11/1)
Temporary directory: temp/
#Target sequences to report alignments for: 25
Opening the database...  [0.058s]
Database: /home/ls/tirza/Oren_Levi/Natalie/DB/CO-ARBitrator/CO-ARBitrator_2.0_aa_reformatted_diamond.dmnd (type: Diamond database, sequences: 1286433, letters: 272826736)
Block size = 2000000000
Opening the input file...  [0.121s]
Opening the output file...  [0s]
Loading query sequences...  [1.188s]
Masking queries...  [0.805s]
Building query seed set...  [0.36s]
The host system is detected to have 270 GB of RAM. It is recommended to increase the block size for better performance using these parameters: -b8 -c1
Algorithm: Double-indexed
Building query histograms...  [0.205s]
Allocating buffers...  [0s]
Loading reference sequences...  [0.434s]
Masking reference...  [1.022s]
Initializing temporary storage...  [0s]
Building reference histograms...  [0.486s]
Allocating buffers...  [0s]
Processing query block 1, reference block 1/1, shape 1/2.
Building reference seed array...  [0.431s]
Building query seed array...  [0.148s]
Computing hash join...  [0.224s]
Building seed filter...  [0.006s]
Searching alignments...  [22081.8s]
Processing query block 1, reference block 1/1, shape 2/2.
Building reference seed array...  [0.425s]
Building query seed array...  [0.148s]
Computing hash join...  [0.211s]
Building seed filter...  [0.002s]
Searching alignments...  [32315.3s]
Deallocating buffers...  [0.006s]
Clearing query masking...  [0.026s]
Computing alignments...  [0.063s]
Failed to allocate sufficient memory. Please refer to the manual for instructions on memory usage.
```
Your data seem to be quite atypical in that the files are small but generate a very large number of hits. You can try the latest release that I posted today, which reduces memory use at this stage, and also try the option --bin 64 (or higher if that fails), which will divide the seed-hit data across more files.
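For illustration, a sketch of how that option might be combined with the lower block size suggested earlier; the --bin value of 64 is the one named above, combining it with -b2 is an assumption, and the remaining arguments mirror the original command:

```bash
# Keep the small block size and spread the seed-hit data across
# 64 temporary files; if allocation still fails, raise --bin
# further (e.g. --bin 128)
diamond blastx -b2 -c1 --bin 64 -d $db -q $s -a $base -t temp/
```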