
[Proposal] Add frequency-based RoPE support for Llama 3.1 models

Open · frances720 opened this issue 1 year ago · 3 comments

Proposal

Add support for frequency-based RoPE (Rotary Position Embedding) smoothing in the TransformerLens library to match Llama 3.1’s architecture.

Motivation

Llama 3.1 applies frequency-dependent smoothing to its rotary position embeddings so that the model handles long-range dependencies more effectively. The current version of TransformerLens does not implement this scaling, which prevents faithful analysis of Llama 3.1 models.

Pitch

Implement frequency-based RoPE smoothing so that TransformerLens reproduces Llama 3.1's positional encoding exactly. This would make TransformerLens fully compatible with Llama 3.1 and a more faithful tool for analyzing long-context behavior. A sketch of the scheme is given below.
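
For concreteness, here is a minimal sketch of the smoothing scheme, following Meta's published Llama 3.1 reference implementation. The standalone function name (llama31_scale_rope_freqs) and its exact signature are illustrative, not existing TransformerLens API; the default constants (scale factor 8, low/high frequency factors of 1 and 4, original context length 8192) are the values shipped in the Llama 3.1 rope_scaling config.

```python
import math

import torch


def llama31_scale_rope_freqs(
    inv_freqs: torch.Tensor,
    scale_factor: float = 8.0,
    low_freq_factor: float = 1.0,
    high_freq_factor: float = 4.0,
    old_context_len: int = 8192,
) -> torch.Tensor:
    """Rescale standard RoPE inverse frequencies the way Llama 3.1 does.

    Long-wavelength (low-frequency) components are divided by scale_factor,
    short-wavelength (high-frequency) components are kept as-is, and the
    band in between is linearly interpolated between the two regimes.
    """
    wavelens = 2 * math.pi / inv_freqs
    # Interpolation weight: 0 in the fully scaled (long-wavelength) regime,
    # 1 in the unscaled (short-wavelength) regime.
    smooth = (old_context_len / wavelens - low_freq_factor) / (
        high_freq_factor - low_freq_factor
    )
    smooth = smooth.clamp(0.0, 1.0)
    return (1.0 - smooth) * inv_freqs / scale_factor + smooth * inv_freqs


# Usage: start from the standard RoPE inverse frequencies and rescale them.
# Llama 3.1 uses a rotary base (theta) of 500000; d_head of 128 is typical
# for the 8B model.
d_head, base = 128, 500_000.0
inv_freqs = 1.0 / (base ** (torch.arange(0, d_head, 2).float() / d_head))
inv_freqs = llama31_scale_rope_freqs(inv_freqs)
# The sin/cos rotation tables are then built from the scaled frequencies
# exactly as for standard RoPE.
```

The clamp reproduces the three-way branch in the reference code: wavelengths shorter than old_context_len / high_freq_factor are left untouched, wavelengths longer than old_context_len / low_freq_factor are fully scaled, and the band in between is blended linearly.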

Alternatives

Continue using TransformerLens with standard RoPE. This would keep the library usable, but it would not faithfully reproduce Llama 3.1's modified positional encoding.

Checklist

  • [x] I have checked that there is no similar issue in the repo (required)

frances720 · Sep 9, 2024

I have a PR ready for this, but when I ran git push --set-upstream origin frances/llama31_rope it returned a 403 error.

frances720 · Sep 9, 2024

@frances720 Sorry for the late reply! It looks like you may be trying to push your branch directly to the TransformerLens repo. You need to push the branch to your own fork and open the PR from there. If you need help with this, you can reach me on the Slack channel; let me know if you need an invite!

bryce13950 · Sep 23, 2024

This has been resolved in a recent release.

bryce13950 · Nov 3, 2024