"Folder not empty" error when using multiple libraries
Description of bug
Dear community
I'm running SPAdes with seven paired-end libraries. I have plenty of memory (500 GB), and I monitored SPAdes' memory usage closely; it never exceeded 220 GB. Nevertheless, after nearly 24 hours the run suddenly stopped, reporting that a folder was not empty. SPAdes also complains about memory, even though, as far as I can tell, I allocated more than enough.
I appreciate your help.
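For reference, this is roughly how the memory was tracked (a minimal sketch; the pgrep pattern and the 60-second interval are illustrative):

#!/usr/bin/env bash
# Poll the peak resident memory (VmHWM, the kernel's high-water mark) of all
# SPAdes-related processes. spades.py delegates the heavy work to children
# such as spades-hammer, hence the broad "spades" pattern. Note that resident
# usage can understate address-space demand when a tool makes large mmap()
# reservations.
while pgrep -f spades.py >/dev/null; do
    for pid in $(pgrep -f spades); do
        awk -v p="$pid" '/^VmHWM/ {print p, $2, $3}' "/proc/$pid/status" 2>/dev/null
    done
    sleep 60
done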
spades.log
Command line: /home/User/miniconda3/envs/spades/bin/spades.py --meta --threads 55 --memory 500 -o /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged --pe1-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq --pe1-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq --pe2-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq --pe2-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq --pe3-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq --pe3-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq --pe4-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq --pe4-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq --pe5-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq --pe5-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq --pe6-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq --pe6-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq --pe7-1 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq --pe7-2 /data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq
System information:
  SPAdes version: 4.2.0
  Python version: 3.13.7
  OS: Linux-4.18.0-553.5.1.el8_10.x86_64-x86_64-with-glibc2.28

Output dir: /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged
Mode: read error correction and assembling
Debug mode is turned OFF

Dataset parameters:
  Metagenomic mode
  Reads:
    Library number: 1, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 2, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 3, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 4, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 5, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 6, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
    Library number: 7, library type: paired-end
      orientation: fr
      left reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq']
      right reads: ['/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
Read error correction parameters:
  Iterations: 1
  PHRED offset will be auto-detected
  Corrected reads will be compressed
Assembly parameters:
  k: [21, 33, 55]
  Repeat resolution is enabled
  Mismatch careful mode is turned OFF
  MismatchCorrector will be SKIPPED
  Coverage cutoff is turned OFF
  Assembly graph output will use GFA v1.2 format
Other parameters:
  Dir for temp files: /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/tmp
  Threads: 55
  Memory limit (in Gb): 500
======= SPAdes pipeline started. Log can be found here: /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/spades.log
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq: max reads length: 150
/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq: max reads length: 150
Reads length: 150
===== Before start started.
===== Read error correction started.
== Running: /home/User/miniconda3/envs/spades/bin/spades-hammer /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/corrected/configs/config.info
0:00:00.000 1M / 20M INFO General (main.cpp : 76) Starting BayesHammer, built from N/A, git revision N/A
0:00:00.004 1M / 20M INFO General (main.cpp : 77) Loading config from "/data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/corrected/configs/config.info"
0:00:00.009 1M / 20M INFO General (main.cpp : 79) Maximum # of threads to use (adjusted due to OMP capabilities): 55
0:00:00.010 1M / 20M INFO General (memory_limit.cpp : 55) Memory limit set to 500 Gb
0:00:00.010 1M / 20M INFO General (main.cpp : 87) Trying to determine PHRED offset
0:00:00.011 1M / 20M INFO General (main.cpp : 93) Determined value is 33
0:00:00.011 1M / 20M INFO General (hammer_tools.cpp : 40) Hamming graph threshold tau=1, k=21, subkmer positions = [ 0 10 ]
0:00:00.011 1M / 20M INFO General (main.cpp : 114) Size of aux. kmer data 24 bytes
=== ITERATION 0 begins ===
0:00:00.011 1M / 20M INFO K-mer Counting (kmer_data.cpp : 284) Estimating k-mer count
0:00:00.221 881M / 926M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq"
0:04:27.309 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 76663388 reads
0:04:27.314 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq"
0:08:54.109 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 153326776 reads
0:08:54.110 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq"
0:14:05.132 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 241839330 reads
0:14:05.137 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq"
0:19:14.618 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 330351884 reads
0:19:14.620 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq"
0:24:27.322 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 419749993 reads
0:24:27.324 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq"
0:29:38.765 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 509148102 reads
0:29:38.767 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq"
0:33:40.687 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 578619029 reads
0:33:40.687 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq"
0:37:43.509 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 648089956 reads
0:37:43.509 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq"
0:42:11.780 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 724842943 reads
0:42:11.786 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq"
0:46:39.644 881M / 932M INFO K-mer Counting (kmer_data.cpp : 298) Processed 801595930 reads
0:46:39.646 881M / 932M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq"
0:50:55.878 881M / 938M INFO K-mer Counting (kmer_data.cpp : 298) Processed 875160648 reads
0:50:55.879 881M / 938M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq"
0:55:13.561 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 298) Processed 948725366 reads
0:55:13.563 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq"
0:59:55.728 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 298) Processed 1029223640 reads
0:59:55.733 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 289) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq"
1:04:39.401 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 298) Processed 1109721914 reads
1:04:39.406 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 303) Total 1109721914 reads processed
1:04:40.922 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 306) Estimated 41102073561 distinct kmers
1:04:40.924 1M / 1033M INFO K-mer Counting (kmer_data.cpp : 310) Filtering singleton k-mers
mimalloc: warning: unable to allocate aligned OS memory directly, fall back to over-allocation (173958758400 bytes, address: 0x7f0b11943000, alignment: 67108864, commit: 1)
1:06:07.195 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq"
1:47:35.051 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 76663388 reads
1:47:35.053 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq"
2:24:52.077 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 153326776 reads
2:24:52.091 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq"
3:03:39.688 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 241839330 reads
3:03:39.690 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq"
3:42:20.373 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 330351884 reads
3:42:20.375 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq"
4:19:46.312 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 419749993 reads
4:19:46.314 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq"
4:58:34.729 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 509148102 reads
4:58:34.731 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq"
5:27:50.928 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 578619029 reads
5:27:50.930 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq"
5:57:37.761 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 648089956 reads
5:57:37.763 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq"
6:29:33.591 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 724842943 reads
6:29:33.593 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq"
7:02:08.922 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 801595930 reads
7:02:08.928 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq"
7:32:42.616 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 875160648 reads
7:32:42.622 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq"
8:03:44.873 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 948725366 reads
8:03:44.874 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq"
8:37:06.670 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 1029223640 reads
8:37:06.671 163G / 163G INFO K-mer Counting (kmer_data.cpp : 316) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq"
9:11:00.252 163G / 163G INFO K-mer Counting (kmer_data.cpp : 325) Processed 1109721914 reads
9:11:00.254 163G / 163G INFO K-mer Counting (kmer_data.cpp : 330) Total 1109721914 reads processed
9:11:00.290 163G / 163G INFO General (kmer_index_builder.hpp : 308) Splitting kmer instances into 16 files using 55 threads. This might take a while.
9:11:00.306 163G / 163G INFO General (file_limit.hpp : 43) Open file limit set to 1024
9:11:00.306 163G / 163G INFO General (kmer_splitter.hpp : 96) Memory available for splitting buffers: 2.04766 Gb
9:11:00.306 163G / 163G INFO General (kmer_splitter.hpp : 104) Using cell size of 4194304
9:11:00.329 194G / 194G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_1.fastq"
mimalloc: warning: unable to allocate aligned OS memory directly, fall back to over-allocation (1790967808 bytes, address: 0x7f05a2000000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33ad578700:
mimalloc: warning: thread 0x7f33a6d6b700:
mimalloc: warning: thread 0x7f33a856e700:
mimalloc: warning: thread 0x7f33aa572700: unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f09cf000000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33add79700: unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f088e400000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33a7d6d700: unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f07b8400000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f0a3a000000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33acd77700:
mimalloc: warning: thread 0x7f33ac576700:
mimalloc: warning: thread 0x7f33a8d6f700:
mimalloc: warning: thread 0x7f33aad73700: unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f0823400000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f0aa5000000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33a756c700:
mimalloc: warning: thread 0x7f33ab574700:
mimalloc: warning: thread 0x7f33a9d71700: unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f060cc00000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f0aa5000000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1790967808 bytes, address: 0x7f074d800000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1790967808 bytes, address: 0x7f08f9400000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f0aa5000000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1795162112 bytes, address: 0x7f06e2800000, alignment: 67108864, commit: 1)
unable to allocate aligned OS memory directly, fall back to over-allocation (1790967808 bytes, address: 0x7f0677c00000, alignment: 67108864, commit: 1)
mimalloc: warning: thread 0x7f33abd75700: unable to allocate aligned OS memory directly, fall back to over-allocation (1790967808 bytes, address: 0x7f0a39400000, alignment: 67108864, commit: 1)
9:15:09.241 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 14452400 reads
mimalloc: warning: thread 0x7f33a9d71700: unable to allocate aligned OS memory directly, fall back to over-allocation (1799356416 bytes, address: 0x7f0aa4c00000, alignment: 67108864, commit: 1)
9:19:24.609 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 28986660 reads
9:23:44.252 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 43572992 reads
9:28:13.467 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 58103378 reads
9:32:46.855 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 72651718 reads
9:34:15.594 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 76663388 reads
9:34:15.599 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone01_2.fastq"
9:38:34.184 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 91553802 reads
9:42:56.970 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 106325877 reads
9:47:19.093 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 121033590 reads
9:51:43.273 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 136269246 reads
9:56:11.114 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 151695010 reads
9:56:50.358 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 153326776 reads
9:56:50.358 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_1.fastq"
10:01:13.822 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 167883934 reads
10:23:16.586 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone02_2.fastq"
10:31:59.877 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 271208763 reads
10:49:17.223 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_1.fastq"
11:16:07.364 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone03_2.fastq"
11:42:16.040 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_1.fastq"
11:51:00.294 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 538486672 reads
12:02:24.078 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone04_2.fastq"
12:21:26.411 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_1.fastq"
12:43:01.333 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone06_2.fastq"
13:04:28.751 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_1.fastq"
13:26:52.253 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone08_2.fastq"
13:48:26.997 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_1.fastq"
14:13:13.845 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 98) Processing "/data2/User_Remote/All_data/HiAS/User_HiAS/0_Fq_Illumina/Fq/zone/Zone10_2.fastq"
14:26:06.971 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 108) Processed 1074550677 reads
14:36:16.885 194G / 216G INFO K-mer Splitting (kmer_data.cpp : 113) Total 1109721914 reads processed
14:36:19.090 163G / 216G INFO General (kmer_index_builder.hpp : 314) Starting k-mer counting.
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
=== Stack Trace ===
=== Stack Trace ===
14:36:19.108 163G / 216G ERROR General (mmapped_reader.hpp : 52) mmap(2) failed. Reason: Cannot allocate memory. Error code: 12
=== Stack Trace ===
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x1d339) [0x5579af3f0339]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x346f0) [0x5579af4076f0]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x493b3) [0x5579af41c3b3]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x63843) [0x5579af436843]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x654b0) [0x5579af4384b0]
/home/User/miniconda3/envs/spades/bin/../lib/libgomp.so.1(+0x19ec4) [0x7f33aeef1ec4]
/usr/lib64/libpthread.so.0(+0x81ca) [0x7f33aead31ca]
/usr/lib64/libc.so.6(clone+0x43) [0x7f33adfb78d3]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x1d339) [0x5579af3f0339]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x346f0) [0x5579af4076f0]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x493b3) [0x5579af41c3b3]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x63843) [0x5579af436843]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x654b0) [0x5579af4384b0]
/home/User/miniconda3/envs/spades/bin/../lib/libgomp.so.1(+0x19ec4) [0x7f33aeef1ec4]
/usr/lib64/libpthread.so.0(+0x81ca) [0x7f33aead31ca]
/usr/lib64/libc.so.6(clone+0x43) [0x7f33adfb78d3]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x1d339) [0x5579af3f0339]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x346f0) [0x5579af4076f0]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x493b3) [0x5579af41c3b3]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x63843) [0x5579af436843]
/home/User/miniconda3/envs/spades/bin/spades-hammer(+0x654b0) [0x5579af4384b0]
/home/User/miniconda3/envs/spades/bin/../lib/libgomp.so.1(+0x19ec4) [0x7f33aeef1ec4]
/usr/lib64/libpthread.so.0(+0x81ca) [0x7f33aead31ca]
/usr/lib64/libc.so.6(clone+0x43) [0x7f33adfb78d3]
== Error == system call for: "['/home/User/miniconda3/envs/spades/bin/spades-hammer', '/data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/corrected/configs/config.info']" finished abnormally, OS return value: 12 None
In case you have troubles running SPAdes, you can report an issue on our GitHub repository github.com/ablab/spades
Please provide us with params.txt and spades.log files from the output directory.
[Errno 39] Directory not empty: '/data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/tmp/hammer_ajj3_epz/kmer_splitter_JV9jJU'
Traceback (most recent call last):
  File "/home/User/miniconda3/envs/spades/bin/spades.py", line 652, in main
    jobs = executor.execute(commands)
  File "/home/User/miniconda3/envs/spades/share/spades/spades_pipeline/executors/executor_local.py", line 37, in execute
    command.run(self.log)
    ~~~~~~~~~~~^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/share/spades/spades_pipeline/commands_parser.py", line 66, in run
    support.sys_call(self.to_list(), log)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/share/spades/spades_pipeline/support.py", line 311, in sys_call
    sys_error(cmd, log, proc.returncode)
    ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/share/spades/spades_pipeline/support.py", line 92, in sys_error
    error(err_msg, log, exit_code=exit_code)
    ~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/share/spades/spades_pipeline/support.py", line 60, in error
    shutil.rmtree(current_tmp_dir)
    ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/lib/python3.13/shutil.py", line 763, in rmtree
    _rmtree_safe_fd(stack, onexc)
    ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/lib/python3.13/shutil.py", line 707, in _rmtree_safe_fd
    onexc(func, path, err)
    ~~~~~^^^^^^^^^^^^^^^^^
  File "/home/User/miniconda3/envs/spades/lib/python3.13/shutil.py", line 658, in _rmtree_safe_fd
    os.rmdir(name, dir_fd=dirfd)
    ~~~~~~~~^^^^^^^^^^^^^^^^^^^^
OSError: [Errno 39] Directory not empty: '/data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/tmp/hammer_ajj3_epz/kmer_splitter_JV9jJU'
== Error == exception caught: <class 'OSError'>
In case you have troubles running SPAdes, you can report an issue on our GitHub repository github.com/ablab/spades Please provide us with params.txt and spades.log files from the output directory.
SPAdes log can be found here: /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/spades.log
Thank you for using metaSPAdes! If you use it in your research, please cite:
Nurk, S., Meleshko, D., Korobeynikov, A. and Pevzner, P.A., 2017. metaSPAdes: a new versatile metagenomic assembler. Genome research, 27(5), pp.824-834. doi.org/10.1101/gr.213959.116
params.txt
(Contents identical to the command line, system information, and dataset parameters at the top of spades.log above.)
SPAdes version
4.2.0
Operating System
Red Hat Enterprise Linux 8.10
Python Version
3.13.7
Method of SPAdes installation
conda
No errors reported in spades.log
- [x] Yes
[Errno 39] Directory not empty: '/data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged/tmp/hammer_ajj3_epz/kmer_splitter_JV9jJU'
This is an I/O error on your end. Essentially, after SPAdes tried to remove the temporary folder, it still appeared as non-empty. Usually this indicates an issue with a network filesystem or similar. I would suggest contacting your system administrator.
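If the /data2 volume is indeed network-mounted, one possible mitigation (a sketch, not an official recommendation) is to keep SPAdes' temporary files on node-local disk via --tmp-dir:

# The /scratch path is a placeholder for local scratch space. --restart-from ec
# re-runs from the read error correction stage; plain --continue is not used
# here because it ignores newly supplied options.
mkdir -p /scratch/$USER/spades_tmp
spades.py \
    --restart-from ec \
    --tmp-dir /scratch/$USER/spades_tmp \
    -o /data2/User_Remote/All_data/HiAS/For_Gh/spades_short_read_assembly_merged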
SPAdes also complains about memory, even though, as far as I can tell, I allocated more than enough.
You might think that you "allocated more than enough", though in reality it might not be so. In this particular case your OS failed to fulfill SPAdes' request for a memory allocation. There is no way to circumvent this issue other than to terminate the assembly. As far as I can see, you have more than 1 billion reads. This is... quite a lot. What is the expected genome size?
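For what it's worth, a few generic Linux checks when mmap(2) fails with ENOMEM while RAM still looks free (standard diagnostics, not SPAdes-specific advice):

sysctl vm.overcommit_memory vm.overcommit_ratio  # strict overcommit (mode 2) can reject large mappings
sysctl vm.max_map_count                          # per-process cap on the number of distinct mappings
ulimit -v                                        # per-process virtual address space limit (kB)
free -g                                          # physical RAM and swap actually available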
Thanks for your reply
Those 7 libraries were sequenced from biofilm samples in a wastewater plant system. The idea was to make one large co-assembly and then bin it. Some tools, such as MaxBin2, can give you contig abundance if you supply the assembly together with the reads from each zone separately in each run. One can then easily track how the abundance of a specific contig/bin changes across the zones; a rough sketch of such a workflow follows.
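For illustration, per-zone coverage profiles against a single co-assembly could be produced along these lines (bwa and samtools are stand-in tool choices here, not prescribed by the thread; file names, the zone list, and thread counts are placeholders):

# Map each zone's reads to the co-assembly and summarize per-contig depth.
ASSEMBLY=coassembly_contigs.fasta
bwa index "$ASSEMBLY"
for zone in Zone01 Zone02 Zone03 Zone04 Zone06 Zone08 Zone10; do
    bwa mem -t 16 "$ASSEMBLY" "${zone}_1.fastq" "${zone}_2.fastq" \
        | samtools sort -@ 8 -o "${zone}.bam" -
    samtools coverage "${zone}.bam" > "${zone}.coverage.tsv"  # mean depth per contig, usable as an abundance profile
done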
Since we are working with the metagenome, estimating the genome size is not straightforward; however, I estimate it to be around 3 Mb to 6 Mb.
Regarding memory: can I increase the swap size on my system? Can SPAdes use swap?
Thanks again
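On the swap question: swap is managed by the kernel rather than by SPAdes, so a process's anonymous memory can in principle spill to swap. A minimal sketch of adding a swap file (root required; size and path are illustrative):

# Caveat: the mmap-heavy k-mer stages tend to thrash badly once they spill
# to swap, so this may not rescue a run like the one above.
fallocate -l 256G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
swapon --show   # verify the new swap area is active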
Since we are working with the metagenome, estimating the genome size is not straightforward; however, I estimate it to be around 3 Mb to 6 Mb.
You mean 3 to 6 Gb, since this is a metagenome? It's unlikely you want to assemble just a single genome. Why then do you use the normal isolate mode (which expects a single, evenly covered genome) and not the metagenomic mode?
I am using the --meta flag, not the isolate flag. 3 to 6 Mb refers to the ideal situation (if we are really able to recover a complete bacterial genome from our metagenome assembly).
3 to 6 Mb refers to the ideal situation (if we are really able to recover a complete bacterial genome from our metagenome assembly)
But that is just a single genome, not the whole metagenome. The whole metagenome will likely be very large:
1:04:40.922 881M / 1033M INFO K-mer Counting (kmer_data.cpp : 306) Estimated 41102073561 distinct kmers
So, you have approximately 40 billion distinct k-mers, and read error correction requires no less than 24 bytes per distinct k-mer (41.1 billion k-mers x 24 bytes is roughly 1 TB for the k-mer data alone). I would suggest doing heavy quality-based trimming prior to assembly.
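For illustration only, such trimming could look like this with fastp (my example tool and thresholds, not prescribed by SPAdes; one library shown):

# Stricter quality cutoffs and a higher minimum length prune error-containing
# reads, and with them many erroneous distinct k-mers.
fastp \
    -i Zone01_1.fastq -I Zone01_2.fastq \
    -o Zone01_trimmed_1.fastq.gz -O Zone01_trimmed_2.fastq.gz \
    --cut_right --cut_right_mean_quality 25 \
    --qualified_quality_phred 20 \
    --length_required 100 \
    --thread 16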
Right, the metagenome assembly file for each sample alone is approximately 2.5 GB; however, the co-assembly failed, so I am unsure of the resulting size. Any specific trimming you would recommend? The data are already trimmed for low-quality bases and adapter contamination.