Error when basecalling with --save-ctc: "Expected fp16 but received torch.float32"
Hi there,
I was trying to train my own basecaller. Here's what I did:
# index ground truth sequences
minimap2 -x map-ont -d groundtruth/groundtruth.mmi groundtruth/gt.fa
# basecall and save ctc
bonito basecaller [email protected] --save-ctc --reference groundtruth/gt.fa groundtruth/ > ctc_data/calls.sam
But I ran into this error:
> loading model [email protected]
> loading reference
> outputting aligned sam
> calling: 0 reads [00:00, ? reads/s]Exception in thread Thread-7:
Traceback (most recent call last):
  File "/home/miniconda3/lib/python3.9/threading.py", line 973, in _bootstrap_inner
    self.run()
  File "/home/miniconda3/lib/python3.9/site-packages/bonito/multiprocessing.py", line 110, in run
    for item in self.iterator:
  File "/home/miniconda3/lib/python3.9/site-packages/bonito/crf/basecall.py", line 67, in <genexpr>
    (read, compute_scores(model, batch, reverse=reverse)) for read, batch in batches
  File "/home/miniconda3/lib/python3.9/site-packages/bonito/crf/basecall.py", line 35, in compute_scores
    sequence, qstring, moves = beam_search(
  File "/home/miniconda3/lib/python3.9/site-packages/koi/decode.py", line 13, in beam_search
    raise TypeError('Expected fp16 but received %s' % scores.dtype)
TypeError: Expected fp16 but received torch.float32
Hello,
I am trying to train a basecaller using the same command and have run into the exact same error; I was hoping for some insight. Thank you in advance for your time and consideration.
A few weeks ago there was an issue with the koi package... I had to change Line 11 in requirements.txt to be ont-koi==0.1.1 (not 0.1.0) and that fixed it for me.
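To make that fix concrete, this is roughly what the change looks like. The exact line number and pinned version may differ for your bonito release, so treat this as a sketch and check your own requirements.txt rather than copying it blindly:

# requirements.txt -- update the koi pin (line number may vary by bonito version)
ont-koi==0.1.1

# then reinstall so the updated pin takes effect in your environment
pip install ont-koi==0.1.1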