Bump transformers from 4.45.2 to 4.48.0
Bumps transformers from 4.45.2 to 4.48.0.
Release notes
Sourced from transformers's releases.
v4.48.0: ModernBERT, Aria, TimmWrapper, ColPali, Falcon3, Bamba, VitPose, DinoV2 w/ Registers, Emu3, Cohere v2, TextNet, DiffLlama, PixtralLarge, Moonshine
New models
ModernBERT
The ModernBert model was proposed in Smarter, Better, Faster, Longer: A Modern Bidirectional Encoder for Fast, Memory Efficient, and Long Context Finetuning and Inference by Benjamin Warner, Antoine Chaffin, Benjamin Clavié, Orion Weller, Oskar Hallström, Said Taghadouini, Alexis Gallagher, Raja Biswas, Faisal Ladhak, Tom Aarsen, Nathan Cooper, Griffin Adams, Jeremy Howard and Iacopo Poli.
It is a refresh of the traditional encoder architecture, as used in previous models such as BERT and RoBERTa.
It builds on BERT and implements many modern architectural improvements which have been developed since its original release, such as:
- Rotary Positional Embeddings to support sequences of up to 8192 tokens.
- Unpadding to ensure no compute is wasted on padding tokens, speeding up processing time for batches with mixed-length sequences.
- GeGLU layers replacing the original MLP layers, shown to improve performance.
- Alternating Attention where most attention layers employ a sliding window of 128 tokens, with Global Attention only used every 3 layers.
- Flash Attention to speed up processing.
- A model designed following the recent The Case for Co-Designing Model Architectures with Hardware, ensuring maximum efficiency across inference GPUs.
- Modern training data scales (2 trillion tokens) and mixtures (including code and math data).
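The alternating-attention scheme above can be made concrete with a small sketch. This is an illustrative reading of the pattern described in the notes (a 128-token sliding window on most layers, global attention on every third layer), not ModernBERT's actual implementation; the function name and the choice of centering the window on the query token are assumptions.

```python
def sliding_window_mask(seq_len, layer_idx, window=128, global_every=3):
    """Boolean mask: mask[i][j] is True where query token i may attend to key j.

    Illustrative sketch only: layers whose index is a multiple of
    `global_every` use full (global) attention; all other layers restrict
    each query to a window of `window` tokens centered on it.
    """
    is_global = layer_idx % global_every == 0
    half = window // 2
    mask = []
    for i in range(seq_len):
        row = []
        for j in range(seq_len):
            # Global layers attend everywhere; local layers only nearby.
            row.append(True if is_global else abs(i - j) <= half)
        mask.append(row)
    return mask
```

For example, on a local layer a query at position 0 can attend 64 tokens ahead but not 65, while on a global layer every position sees every other position.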
- Add ModernBERT to Transformers by @warner-benjamin in #35158

Aria
The Aria model was proposed in Aria: An Open Multimodal Native Mixture-of-Experts Model by Li et al. from the Rhymes.AI team.
Aria is an open multimodal-native model with best-in-class performance across a wide range of multimodal, language, and coding tasks. It has a Mixture-of-Experts architecture, with respectively 3.9B and 3.5B activated parameters per visual token and text token.
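The Mixture-of-Experts idea behind those "activated parameters per token" figures can be sketched in a few lines. This is a toy illustration of top-k expert routing in plain Python, not Aria's actual code; the function names and the softmax-weighted combination are assumptions about how a generic MoE layer works.

```python
import math

def route(scores, k=2):
    """Return the indices of the k highest-scoring experts for one token."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

def moe_forward(x, experts, scores, k=2):
    """Evaluate only the top-k experts and combine their outputs, weighted
    by a softmax over their router scores. Because only k of the experts
    run per token, activated parameters stay far below the total count."""
    top = route(scores, k)
    weights = [math.exp(scores[i]) for i in top]
    total = sum(weights)
    return sum(w / total * experts[i](x) for w, i in zip(weights, top))
```

With three toy "experts" and router scores [0.1, 2.0, 0.5], only experts 1 and 2 are evaluated for the token.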
- Add Aria by @aymeric-roucher in #34157

TimmWrapper
We add a TimmWrapper set of classes such that timm models can be loaded as transformers models into the library. Here's a general usage example:

```python
import torch
from urllib.request import urlopen
from PIL import Image
from transformers import AutoConfig, AutoModelForImageClassification, AutoImageProcessor

checkpoint = "timm/resnet50.a1_in1k"
img = Image.open(urlopen(
    'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/beignets-task-guide.png'
))
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
```
... (truncated)
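The release example is truncated before inference. Assuming the classification model returns a vector of per-class logits (as AutoModelForImageClassification models generally do), picking the predicted class can be sketched in plain Python; this continuation is hypothetical and not part of the release notes.

```python
import math

def top_class(logits):
    """Return (index, probability) of the highest-scoring class,
    using a numerically stable softmax over the logits."""
    m = max(logits)  # subtract the max so exp() cannot overflow
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return best, probs[best]
```

For real outputs one would pass the model's logits tensor (converted to a list) and map the index through the model config's id2label.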
Commits
- 6bc0fbc [WIP] Emu3: add model (#33770)
- 59e28c3 Fix flex_attention in training mode (#35605)
- 7cf6230 push a fix for now
- d6f446f when filtering we can't use the convert script as we removed them
- 8ce1e95 [test-all]
- af2d7ca Add Moonshine (#34784)
- 42b8e79 ModernBert: reuse GemmaRotaryEmbedding via modular + Integration tests (#35459)
- e39c9f7 v4.48-release
- 8de7b1b Add flex_attn to diffllama (#35601)
- 1e3ddcb ModernBERT bug fixes (#35404)
- Additional commits viewable in compare view
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the Security Alerts page.