Arthur
To put the tokenizer in the folder, run:
```python
tokenizer.save_pretrained("path_to_folder")
```
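For completeness, here is a minimal self-contained sketch of the round trip (the checkpoint name and `path_to_folder` are just example placeholders):
```python
from transformers import AutoTokenizer

# Load any tokenizer (checkpoint name is just an example), save it to a
# local folder, then reload it from that folder
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
tokenizer.save_pretrained("path_to_folder")
tokenizer = AutoTokenizer.from_pretrained("path_to_folder")
```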
That's very strange, I can't reproduce your error at all. If you open any Colab, the script you shared works. A quick fix is probably `pip install --upgrade transformers`.
Hey! Just got back from holidays, this should be my main focus in the coming days!
Sorry! Seems like I had to postpone this! If anyone wants to take over, feel free to do it; otherwise it will be my priority once https://github.com/huggingface/transformers/pull/23909 is merged!
#27734 should help with some of the issues in the meantime
Cool! Thanks for this contribution! Pretty sure this can also be applied to `SwitchTransformers` (as it implements a similar procedure) and ~~`MT5`~~ `LongT5`
cc @Narsil as this might follow the latest update of `return_timestamps`
I suspect the logits processor @Narsil, but it's strange that it didn't come up before
@alextomana, did you try comparing the `generation_config` as mentioned above? About the silence or whatnot, I'm not really sure
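Something like this is what I mean, a minimal sketch assuming the checkpoint name is just an example and your local `model` may differ:
```python
from transformers import GenerationConfig, WhisperForConditionalGeneration

# Compare the generation config shipped with the hub checkpoint against the
# one actually attached to your local model instance
reference = GenerationConfig.from_pretrained("openai/whisper-small")
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")
print(reference)
print(model.generation_config)
```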
cc @Narsil, maybe an edge case that was not handled (and that was previously ignored). Let's be more permissive on the last timestamps + I will check with the provided example...
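For reference, a minimal repro sketch of the timestamp path (assuming the `pipeline` API with `return_timestamps=True`; `sample.wav` and the checkpoint name are placeholders):
```python
from transformers import pipeline

# Minimal sketch of the timestamp decoding path that hits the edge case
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
out = asr("sample.wav", return_timestamps=True)  # "sample.wav" is a placeholder audio file
print(out["chunks"])  # [{"text": ..., "timestamp": (start, end)}, ...]
```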