end2end-asr-pytorch
INCOMPLETE TRAINING OF CUSTOM DATASET
Hello, I got the following error while training on Google Colab, and I am still unable to completely train my dataset:
VALID SET 0 LOSS:2.7531 CER:103.64%: 41% 2312/5635 [02:02<07:07, 7.77it/s]
Traceback (most recent call last):
File "/content/drive/My Drive/end2end-asr-pytorch-master/train.py", line 117, in
Hi @jafarOlamide, sorry for the late reply. I am going to check the error.
Dear @jafarOlamide,
The error is a bit odd since it stopped in the middle of the evaluation. Have you found any workaround for this issue? I didn't get the same error when I ran with other datasets.
I've not been able to find a way around the issue. I ran this in Google Colab. The error might be coming from it, and I've been able to figure it out.
!python /content/drive/"My Drive"/end2end-asr-pytorch-master/train.py --train-manifest-list /content/drive/"My Drive"/end2end-asr-pytorch-master/data/manifests/tr_yoruba_train_manifest.csv --valid-manifest-list /content/drive/"My Drive"/end2end-asr-pytorch-master/data/manifests/my_yoruba_valid_manifest_testing.csv --test-manifest-list /content/drive/"My Drive"/end2end-asr-pytorch-master/data/manifests/ts_yoruba_test_manifest_trial.csv --cuda --batch-size 12 --labels-path /content/drive/"My Drive"/end2end-asr-pytorch-master/data/labels/yoruba_character_labels.json --lr 1e-4 --name aishell_drop0.1_cnn_batch12_4_vgg_layer4 --save-folder save/ --save-every 5 --feat_extractor vgg_cnn --dropout 0.1 --num-layers 4 --num-heads 8 --dim-model 512 --dim-key 64 --dim-value 64 --dim-input 161 --dim-inner 2048 --dim-emb 512 --shuffle --min-lr 1e-6 --k-lr 1
Okay. You can reopen this issue if you still have the problem with the code.
Sorry, I wasn't able to figure it out; that was a typographical error. The above is what I ran on Colab. Could you please check if there are any errors?
I have reopened this issue as requested, @jafarOlamide.
@jafarOlamide: So, did you change the collate_fn function?
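For reference, here is a minimal sketch of what a padding collate_fn for variable-length spectrogram/transcript pairs might look like. This is not the repository's actual implementation; the sample layout (spectrogram, transcript) and tensor shapes are assumptions for illustration only.

```python
import torch

def pad_collate(batch):
    """Pad (spectrogram, transcript) pairs to the longest item in the batch.

    Assumes each sample is a tuple of:
        spectrogram: FloatTensor of shape (n_frames, n_features)
        transcript:  LongTensor of label indices, shape (n_labels,)
    """
    # Sort by input length (longest first) so downstream packing, if any, stays valid.
    batch = sorted(batch, key=lambda s: s[0].size(0), reverse=True)

    input_lengths = torch.tensor([s[0].size(0) for s in batch], dtype=torch.long)
    target_lengths = torch.tensor([s[1].size(0) for s in batch], dtype=torch.long)

    max_frames = int(input_lengths.max())
    max_labels = int(target_lengths.max())
    n_features = batch[0][0].size(1)

    # Zero-pad inputs and targets to the batch maximum.
    inputs = torch.zeros(len(batch), max_frames, n_features)
    targets = torch.zeros(len(batch), max_labels, dtype=torch.long)

    for i, (spec, txt) in enumerate(batch):
        inputs[i, : spec.size(0)] = spec
        targets[i, : txt.size(0)] = txt

    return inputs, targets, input_lengths, target_lengths
```

Assuming a dataset that yields such tuples, it would be wired in as torch.utils.data.DataLoader(dataset, batch_size=12, collate_fn=pad_collate).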
Not yet, I've been seriously occupied lately.