Benjamin Buchfink

445 comments by Benjamin Buchfink

It should be fixed in the latest release.

This should now also be fixed in the latest release.

No, clustering on multiple nodes requires some manual work. I have added a short how-to: https://github.com/bbuchfink/diamond/wiki/How-to-cluster-huge-datasets

I can reproduce the problem and will try to provide a fix in the next few days.

This should be fixed in version 2.1.1.

`--ntasks-per-core=2` activates hyperthreading; the SLURM syntax is admittedly a bit confusing here.
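As an illustrative sketch only, a SLURM batch script using this directive might look like the following. The job name, CPU count, database, and file paths are placeholders, and the exact resource lines depend on your cluster's configuration:

```shell
#!/bin/bash
#SBATCH --job-name=diamond        # placeholder job name
#SBATCH --nodes=1                 # run on a single node
#SBATCH --ntasks-per-core=2       # allow two tasks per physical core (hyperthreading)
#SBATCH --cpus-per-task=32        # placeholder; adjust to your node

# Database, query, and output names are placeholders.
diamond blastp -d nr -q query.fasta -o hits.tsv --threads "$SLURM_CPUS_PER_TASK"
```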

It clearly looks like an error occurred here. Have you tried rerunning the sample? You can also try using `--masking 0`. If the error persists, it would be...

OK, thanks for reporting this; I need to look into the memory use there.

You can normally use `--max-target-seqs 1` without worry; the effect on accuracy is small. To leave the algorithm unaffected, you can add the `--no-ranking` option.
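A minimal sketch of how these two options combine on the command line; the database, query, and output names are placeholders:

```shell
# Report only the single best hit per query; --no-ranking prevents the
# low hit limit from affecting the search algorithm itself.
# Database, query, and output names are placeholders.
diamond blastp -d reference -q query.fasta -o matches.tsv \
    --max-target-seqs 1 --no-ranking
```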

Sorry for this unfortunate issue; this behavior was implemented to handle old NCBI headers. At the very least, a warning message should be given in these cases.