svmu
LastZ fail (big genome) & SVMU segmentation fault
Hello, I'm encountering an issue that several people have raised here, but I can't find how they fixed it. I am not able to run lastz because the genomes are too big, so I am running svmu with an empty file in its place. It starts running, but after a while fails with "Segmentation fault (core dumped) svmu". Any idea what I can do? Thanks for your help, Claire
Are you using a delta file from plain nucmer or from nucmer -maxmatch?
Hi Mahul, thanks for your answer. These are the lines of code I used (so no -maxmatch; should I add it?)
#run nucmer
nucmer -t $NB_CPU \
    "$INPUT_FOLDER"/"$REFERENCE" "$INPUT_FOLDER"/"$QUERY" \
    -p "$OUTPUT_FOLDER"/sam2ref
#run lastz
lastz "$INPUT_FOLDER"/"$REFERENCE"[multiple] "$INPUT_FOLDER"/"$QUERY"[multiple] --chain \
    --format=general:name1,strand1,start1,end1,name2,strand2,start2,end2 > "$OUTPUT_FOLDER"/sam_lastz.txt
#this step failed with the error "FAILURE: in load_fasta_sequence for 02_genomes/normal_chrs.fasta, sequence length 2,146,164,252+10,920,800 exceeds maximum (2,147,483,637)" -> I gave up and used an empty file for the next step
#run svmu
svmu "$OUTPUT_FOLDER"/sam2ref.delta "$INPUT_FOLDER"/"$REFERENCE" "$INPUT_FOLDER"/"$QUERY" l "$OUTPUT_FOLDER"/sam_lastz.txt "$OUTPUT_FOLDER"/sam2ref_svmu
thanks for your help! Claire
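A hedged sketch, not from the thread itself: the lastz failure above is its ~2^31 bp cap on total input length, so one common workaround is to split the multi-FASTA into one file per sequence and run lastz per chromosome, concatenating the outputs afterwards. The file and directory names here are illustrative, not from Claire's setup.

```shell
# Split a multi-FASTA into one file per sequence so each lastz run
# stays under the ~2^31 bp limit. Assumes normal_chrs.fasta exists.
mkdir -p split
awk '/^>/{f="split/" substr($1,2) ".fa"} f{print > f}' normal_chrs.fasta

# Then (sketch, untested here) run lastz once per chromosome:
# for f in split/*.fa; do
#     lastz "$f" query.fa[multiple] --chain \
#         --format=general:name1,strand1,start1,end1,name2,strand2,start2,end2
# done > sam_lastz.txt
```

Whether this resolves the segfault is a separate question: svmu may be crashing on the delta file itself (hence the -maxmatch question above), not on the empty lastz file.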