Ruihang Lai

136 comments

We merged the Phi support earlier this week, so let's close this issue.

Sorry for the delayed response. @chenqi2013, would you like to try @sherelynyap's suggestion and see if it works?

Thank you @Sing-Li for reporting! That is because the `mlc-chat-config.json` in the prebuilt weight repo was not updated. I just updated the `conv_template` field https://huggingface.co/mlc-ai/gorilla-openfunctions-v1-q4f16_1-MLC/commit/e83c4a2bbb4735c1ccde096dae0df635dd172310 and I think it should...
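For context, `mlc-chat-config.json` in a prebuilt weight repo carries a `conv_template` field that selects the conversation template applied at chat time. A minimal sketch of the relevant fragment is below; the surrounding keys and the template name `gorilla` are assumptions for illustration only, so check the linked commit for the actual value.

```json
{
  "model_type": "llama",
  "conv_template": "gorilla"
}
```

If the field names the wrong template, the model still loads but prompts are formatted incorrectly, which is why only the config in the weight repo needed updating rather than the weights themselves.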

Hey @Sing-Li, sorry for the late reply. Just updated these two repositories. If I remember correctly, there might still be some output formatting issue for the function calling of gorilla...

Thank you @Sing-Li for checking again. This issue https://github.com/mlc-ai/mlc-llm/issues/2121#issuecomment-2049258529 also reports a similar error. We will look into it.

Hi @Sing-Li @ollmer, we have fixed this issue in the latest pip package. Please update the packages and try again, thank you!

Thank you @alphaarea. If you are referring to the output/input relevance issue, it is not related to the "canonical simplification of LE". We will track this and look into it...

Thank you @ricar0. We have recently been pushing the JSONFFIEngine feature in MLC, which we hope can bring multimodality to platforms like Android/iOS. Please stay tuned :-)

@MrRace Just wondering how the issue is going. Do you still see the error after applying what @Kartik14 mentioned? #1993 also introduces something that may be helpful.

Hi @Vinaysukhesh98, thank you for reporting. Could you re-compile the model with `python -m mlc_llm compile ...`? Also see this thread where the same issue happens. I believe model recompilation...
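As a rough sketch of the recompilation step above: the `mlc_llm compile` CLI takes a model's `mlc-chat-config.json`, a target device, and an output library path. The paths, model name, and device below are assumptions for illustration; substitute your own local directories and target.

```
# Hypothetical sketch; paths and model name are placeholders, not from the thread.
python -m mlc_llm compile \
  ./dist/my-model-q4f16_1-MLC/mlc-chat-config.json \
  --device cuda \
  -o ./dist/libs/my-model-q4f16_1-cuda.so
```

After compilation, the resulting library is what the runtime loads alongside the quantized weights, so a stale library is a common source of the kind of mismatch errors reported here.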