Padding side for knn-transformers
In this link (https://huggingface.co/docs/transformers/llm_tutorial#wrong-padding-side), the docs say that decoder architectures should use left padding. In the code repository, you apply right padding on the input side. Can you explain the reason?
Is right padding OK?
Hi Hossam,
First, I haven't read the document you refer to, but I don't agree that decoders "should" have left padding; they can work either way with the right engineering.
Second, the choice of padding side is orthogonal to our implementation. That is, the approach demonstrated in this repo can work with either padding side.
We copied the standard language modeling example from Hugging Face and applied our approach on top of it.
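To illustrate why either side can work: a minimal, self-contained sketch (illustrative only, not the repo's actual code) showing that a left-padded and a right-padded batch carry the same token information, as long as the attention mask flags the padded positions so the model can ignore them. The `PAD` id, function names, and batch contents here are all hypothetical.

```python
PAD = 0  # hypothetical pad token id

def pad_batch(seqs, side="right"):
    """Pad variable-length id sequences to a common length,
    returning (input_ids, attention_mask) in the Hugging Face style."""
    max_len = max(len(s) for s in seqs)
    ids, mask = [], []
    for s in seqs:
        pad = [PAD] * (max_len - len(s))
        if side == "right":
            ids.append(s + pad)
            mask.append([1] * len(s) + [0] * len(pad))
        else:  # left padding
            ids.append(pad + s)
            mask.append([0] * len(pad) + [1] * len(s))
    return ids, mask

def unpad(ids, mask):
    """Recover the original tokens using the attention mask."""
    return [[t for t, m in zip(row, mrow) if m]
            for row, mrow in zip(ids, mask)]

batch = [[5, 6, 7], [8, 9]]
right_ids, right_mask = pad_batch(batch, side="right")
left_ids, left_mask = pad_batch(batch, side="left")

# Both paddings reconstruct the same sequences when the mask is respected.
assert unpad(right_ids, right_mask) == unpad(left_ids, left_mask) == batch
```

The "right engineering" caveat for generation with right padding is that the last *non-pad* position per row must be indexed (e.g. via the attention mask) rather than the last column, since the final column may be a pad token.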
Best, Uri