wenet
How to add new words during fine-tuning?
Hi, a pre-trained model's unit.txt contains 1000 units. When fine-tuning based on this pre-trained model, there are 10 new words that are not in unit.txt. Is it feasible to append these 10 new words to the end of unit.txt and assign them new indices?
Freeze all modules except the output layers (the CTC output and the attention decoder output), add the new words to your unit.txt, modify the output size, and then fine-tune the model.
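The steps above can be sketched in PyTorch. This is a minimal illustration, not WeNet's actual code: the helper name `extend_output_layer` and the toy model are hypothetical, and the 1000 and 10 figures are taken from the question.

```python
import torch
import torch.nn as nn

def extend_output_layer(layer: nn.Linear, new_vocab: int) -> nn.Linear:
    """Grow a projection layer to new_vocab outputs, keeping the
    pre-trained rows and randomly initializing the new ones."""
    old_vocab, hidden = layer.out_features, layer.in_features
    new_layer = nn.Linear(hidden, new_vocab, bias=layer.bias is not None)
    with torch.no_grad():
        new_layer.weight[:old_vocab] = layer.weight
        if layer.bias is not None:
            new_layer.bias[:old_vocab] = layer.bias
    return new_layer

# Toy example: a 1000-unit output head extended by 10 new units.
old_head = nn.Linear(256, 1000)
new_head = extend_output_layer(old_head, 1010)

# Freeze everything, then unfreeze only the extended output head.
model = nn.Sequential(nn.Linear(80, 256), new_head)
for p in model.parameters():
    p.requires_grad = False
for p in new_head.parameters():
    p.requires_grad = True
```

The same copy-and-extend idea applies to each output head (CTC and attention decoder) you resize; the pre-trained rows keep their learned weights, so only the 10 new rows start from scratch.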
Thanks @fclearner, I will try what you suggested later.
@fclearner Doesn't the embedding layer in the decoder need to be changed as well?
Find the modules whose size matches the output size and change them.
Could you provide an example?
Try printing the model's modules and their sizes, or visualize the model with Netron.
This issue has been automatically closed due to inactivity.
This issue was closed because it has been inactive for 7 days since being marked as stale. Please reopen if you'd like to work on this further.