Arthur

51 results for issues authored by Arthur

This is a draft pull request. # What does this PR do? This PR will progressively add the [Jukebox](https://openai.com/blog/jukebox/) model to the hub. It is linked to [#16870](https://github.com/huggingface/transformers/issues/16870). # Currently...

New model
WIP

# What does this PR do? Fixes #18776 by handling the particular case of absolute scope modifications. ## Who can review?

TensorFlow
Core: Modeling

# What does this PR do? Fixes #19019 by replacing the construction of the `eos_mask` in `SequenceClassification`. Also adds a test to make sure that long sequences are properly...
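The `eos_mask` mentioned above can be sketched as follows. This is a minimal, illustrative version (plain Python rather than the actual torch-tensor implementation in transformers; the eos token id and pooling helper are assumptions):

```python
EOS_TOKEN_ID = 2  # hypothetical eos token id


def build_eos_mask(input_ids):
    """Return a boolean mask marking positions equal to the eos token."""
    return [[tok == EOS_TOKEN_ID for tok in seq] for seq in input_ids]


def last_eos_hidden_state(hidden_states, input_ids):
    """Pick the hidden state at the last eos position of each sequence,
    the usual pooling strategy for a sequence-classification head."""
    pooled = []
    for seq_ids, seq_hidden in zip(input_ids, hidden_states):
        eos_positions = [i for i, tok in enumerate(seq_ids) if tok == EOS_TOKEN_ID]
        pooled.append(seq_hidden[eos_positions[-1]])
    return pooled
```

The bug class this guards against is a mask built with a fixed-size assumption, which silently picks the wrong position on long sequences; masking by token id is length-agnostic.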

# What does this PR do? This PR updates the way we handle generation in TF and FLAX to fix the breaking changes that we had. It also adds support for the timestamps...

# What does this PR do? Fixes #19888, by allowing the user to `normalise` the input audio before computing the mel spectrogram.
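The normalisation step referred to above is commonly zero-mean / unit-variance scaling of the waveform before the mel spectrogram is computed. A minimal sketch of that idea (illustrative only; the exact transformers feature-extractor logic may differ):

```python
import math


def normalize_audio(samples, eps=1e-8):
    """Zero-mean, unit-variance normalization of a raw waveform.

    `eps` avoids division by zero on silent (constant) input.
    """
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return [(s - mean) / math.sqrt(var + eps) for s in samples]
```

Normalising first makes the spectrogram insensitive to the recording's absolute gain, which otherwise shifts every mel bin by a constant offset in log space.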

# What does this PR do? Addresses the issue with OPT where `use_fast = True` does not use the fast GPT2 tokenizer. A follow-up PR should add a warning when...

# Generation config I know it has just been added, so this is normal! But the following are missing (and are pretty intuitive w.r.t. our other objects such as configs,...

# What does this PR do? Refactors both Deberta and DebertaV2 to make them more compatible with the overall transformers library. Should fix a bunch of issues related to torch-scripting...

# What does this PR do? Fixes #21300
To-Dos:
- [x] Conversion script and original weights available [here](https://huggingface.co/ArthurZ/fairseq-nllb-moe)
- [x] Converted checkpoints and configuration file available:
  - [moe-128](https://huggingface.co/ArthurZ/nllb-moe-128) experts -...

# What does this PR do? Should fix the backward compatibility issue with `model.config.forced_decoder_ids = ...` and should help users who want to generate with timestamps. Fixes #21937 and #21878
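The backward-compatibility fix above amounts to a fallback: generation should read `forced_decoder_ids` from the generation config, but still honour a value set on the model config. A hedged sketch of that resolution order (function name and token ids are illustrative, not the actual transformers API):

```python
from types import SimpleNamespace


def resolve_forced_decoder_ids(generation_config, model_config):
    """Prefer the generation config's value; fall back to the model
    config so that `model.config.forced_decoder_ids = ...` keeps working."""
    value = getattr(generation_config, "forced_decoder_ids", None)
    if value is not None:
        return value
    return getattr(model_config, "forced_decoder_ids", None)


# Usage: only the (legacy) model config carries the setting.
gen_cfg = SimpleNamespace(forced_decoder_ids=None)
model_cfg = SimpleNamespace(forced_decoder_ids=[(1, 50362)])  # illustrative ids
```

This keeps the old attribute working while letting the generation config take precedence whenever both are set.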