llama.cpp
Support request - Google MADLAD400-10B
Hello,
Is it possible to add llama.cpp support for this great translation model from Google? Google has already published GGUFs (q4 and q6):
google/madlad400-10b-mt
https://huggingface.co/google/madlad400-10b-mt/tree/main
There was already a feature request in December 2023 (https://github.com/ggerganov/llama.cpp/issues/4316) that was closed due to inactivity. Various people had expressed interest in this (including me) but I am unable to contribute code.
My bad, I only checked the open issues, since the model wasn't working with the GGUFs I mentioned earlier.
I tested half a dozen MT models and this one was really good, significantly better than the others, at least for the languages I tested (Slovenian and Croatian to English). I doubt there is no demand for translation LLMs, given how many posts on Reddit ask for good MT models. Beyond translation, T5 architecture support would also enable other tasks that T5 models excel at.
I apologize for re-opening the issue, but what's done is done.
Please, can you reconsider providing support for T5 architecture?
This issue was closed because it has been inactive for 14 days since being marked as stale.