
[Question] How can I use the Madlad model?

Open DHLee-94 opened this issue 4 months ago • 2 comments

I would like to try using Madlad models or Hugging Face models.

Most Hugging Face models only provide an encoder.onnx and a decoder.onnx, and I don't know what to do with them. Please help.

DHLee-94 avatar Aug 25 '25 02:08 DHLee-94

Hi @DHLee-94, I'm having a conversation about the same topic here. Official Madlad support will come in the future (as soon as I have enough time to work on the app), so it will probably be much easier for you to wait for the official support.

niedev avatar Aug 31 '25 10:08 niedev

> I would like to try using Madlad models or hugging face models.
>
> Most of the Hugging Face models only have encoder.onnx and decoder.onnx, but I don't know what to do. Please help.

In #153 I implemented a framework that can run Optimum-exported models (encoder.onnx + decoder_merged.onnx + tokenizer.json); maybe it can help you (with some modifications, I guess). ONNX Runtime is too slow, though, so if you want fast inference, selecting a good runtime (with better inference acceleration) is very important.
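For readers unfamiliar with how such an encoder/decoder pair is driven, here is a minimal sketch of the greedy decoding loop. In a real setup, `encode` and `decode` would wrap `onnxruntime.InferenceSession.run()` calls against the exported encoder.onnx and decoder_merged.onnx; the stubs below (and all token IDs) are toy stand-ins so the control flow can be followed end to end, and are not part of the framework in #153.

```python
import numpy as np

BOS, EOS, VOCAB = 0, 1, 8  # toy special tokens and vocabulary size

def encode(input_ids):
    # Stand-in for the encoder session: returns fake hidden states.
    # Real code: encoder_session.run(None, {"input_ids": ...})
    return np.ones((1, len(input_ids), 4), dtype=np.float32)

def decode(decoder_ids, encoder_states):
    # Stand-in for the decoder session: returns logits over the vocabulary.
    # This toy version emits tokens 2, 3, 4 and then EOS.
    logits = np.zeros((1, VOCAB), dtype=np.float32)
    nxt = len(decoder_ids) + 1 if len(decoder_ids) < 4 else EOS
    logits[0, nxt] = 1.0
    return logits

def greedy_translate(input_ids, max_len=16):
    encoder_states = encode(input_ids)        # run the encoder once
    out = [BOS]                               # decoder starts from BOS
    for _ in range(max_len):
        logits = decode(out, encoder_states)  # one autoregressive step
        tok = int(np.argmax(logits[0]))       # greedy pick
        out.append(tok)
        if tok == EOS:
            break
    return out

print(greedy_translate([5, 6, 7]))  # → [0, 2, 3, 4, 1]
```

The key point is that the encoder runs once per sentence while the decoder runs once per generated token, which is why decoder speed (and the runtime's acceleration of it) dominates overall latency.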

nwdxlgzs avatar Sep 02 '25 13:09 nwdxlgzs

+1. The current NLLB model is inaccurate in some cases, and sometimes completely wrong.

For example, the Chinese sentence 我要下班了 ("I'm getting off work") is translated into English as "I'm going to work right now," which is the opposite of the original meaning.

I think it would be better to add an option for another model, or add some parameters to tune the current one.

lesca avatar Dec 11 '25 06:12 lesca

@lesca yeah, NLLB quality is very low on very short sentences. Madlad support (and the Mozilla models) will come with RTranslator 3.0; I'm currently working on it full time, and the release will be out in a few months.

niedev avatar Dec 11 '25 14:12 niedev