transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
# What does this PR do? This PR integrates the X-Codec model into `transformers`. The X-Codec model is a neural audio codec that integrates semantic information from self-supervised models...
### Feature request [Pruna](https://github.com/PrunaAI/pruna) is an open-source AI model optimization framework. As discussed with @SunMarc, it would be nice to load Pruna-optimized models through the `transformers.from_pretrained` interface as...
# What does this PR do? Fixes https://github.com/huggingface/transformers/issues/38521. I checked against the fast tokenizers' implementation of `word_to_char` and saw no difference in the time taken, so I think this can be...
# What does this PR do? ⚠️ This PR depends on the following PRs and shouldn't be undrafted until both are merged: https://github.com/huggingface/transformers/pull/38757 https://github.com/huggingface/transformers/pull/38756 Should be regularly rebased from the temporary aggregated branch...
# What does this PR do? Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks...
# What does this PR do? Several things are added in this PR: - Idefics2/3 + smolvlm fast image processors. Cc @andimarafioti :) - Improvements in the base fast image processors...