Shoku Yanagawa
Thank you for the quick reply. There is a German GPT-2 repo on the Hugging Face hub: https://huggingface.co/anonymous-german-nlp/german-gpt2 This repo might be your fork.
I know huggingface/tokenizers generates merges.txt and vocab.json: https://github.com/huggingface/tokenizers However, I doubt this is the solution. Anyway, your repo seems active. I will open a pull request when I need to. Best...
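For reference, a minimal sketch of how merges.txt and vocab.json are produced with huggingface/tokenizers; the corpus path, vocabulary size, and special tokens below are placeholders, not values from this discussion.

```python
# Sketch: training a byte-level BPE tokenizer with huggingface/tokenizers.
# "corpus.txt", vocab_size, and the special token are illustrative assumptions.
from tokenizers import ByteLevelBPETokenizer

tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["corpus.txt"],            # plain-text training corpus (placeholder path)
    vocab_size=50257,                # GPT-2-sized vocabulary (assumption)
    min_frequency=2,
    special_tokens=["<|endoftext|>"],
)

# save_model writes vocab.json and merges.txt into the given directory
tokenizer.save_model("output-tokenizer")
```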
Solved by rebuilding 1.10.0.a0 from the source tree. It works: the tch-rs example builds fine. The USE_MKLDNN=OFF flag is also important.
I have the same trouble with peft v0.4.0.dev0.
This problem is related to #433 and is not Japanese-specific. Reason: special_tokens_map.json is empty. Now it's fixed. You can close the issue. Vicuna-13B v1.1 speaks Japanese fluently even though we expect...
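As a quick check of the diagnosis above, one can inspect which special tokens the tokenizer actually loaded; the repo id below is an assumption about the checkpoint in use, so substitute your own.

```python
# Sketch: verify whether special_tokens_map.json was loaded with any content.
# The repo id is an assumption; replace it with the checkpoint actually in use.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("lmsys/vicuna-13b-v1.1", use_fast=False)

# An empty dict here would reproduce the symptom described above.
print(tokenizer.special_tokens_map)
```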
I have the same kind of problem on Python 3.10. I have checked #129 and the related topic "Use relative imports for the sequencer as well".