Liang

2 comments from Liang

Please refer to [#14561](https://github.com/huggingface/transformers/issues/14561). Also, if you would like to set the same value (e.g. `model_max_length`) that you used when training the tokenizer, you can construct `PreTrainedTokenizerFast` as follows: `PreTrainedTokenizerFast(tokenizer_object=tokenizer, ...`
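
For context, a minimal runnable sketch of that pattern (the file name `tokenizer.json`, the max length of 512, and the special-token values are assumptions for illustration, not from the original comment):

```python
from tokenizers import Tokenizer
from transformers import PreTrainedTokenizerFast

# Load a tokenizer previously trained with the `tokenizers` library
# (hypothetical file name).
tokenizer = Tokenizer.from_file("tokenizer.json")

wrapped = PreTrainedTokenizerFast(
    tokenizer_object=tokenizer,
    model_max_length=512,  # carry over the value used when training the tokenizer
    unk_token="[UNK]",
    pad_token="[PAD]",
)
```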

@Narsil You're right, `tokenizers` has no special treatment of special tokens. As described in https://huggingface.co/course/chapter6/8?fw=pt:

> To wrap the tokenizer in a PreTrainedTokenizerFast, we can either pass the tokenizer we...
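
To illustrate the point, a small sketch (assuming the same hypothetical `tokenizer.json` as above, with `[CLS]` and `[SEP]` present in its vocabulary): the `PreTrainedTokenizerFast` wrapper only knows about special tokens that are declared explicitly.

```python
from tokenizers import Tokenizer
from transformers import PreTrainedTokenizerFast

tokenizer = Tokenizer.from_file("tokenizer.json")  # hypothetical file, as above

# Without declaring special tokens, the wrapper has no notion of them:
bare = PreTrainedTokenizerFast(tokenizer_object=tokenizer)
print(bare.cls_token)  # None -- nothing is inferred from the vocabulary

# Declaring them explicitly registers them on the transformers side:
fast = PreTrainedTokenizerFast(
    tokenizer_object=tokenizer,
    cls_token="[CLS]",
    sep_token="[SEP]",
)
print(fast.all_special_tokens)  # includes '[CLS]' and '[SEP]'
```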