
FastChat support for Open_llama_3b_v2 inference - help sought

RDouglasSharp opened this issue 1 year ago • 0 comments

I use FastChat as the framework for both training and dialog-based inference, and FastChat supports Meta/Llama. I was excited to try the 3B OpenLLaMA model, and the FastChat finetuning scripts all work perfectly with open_llama_3b_v2. Oddly, the FastChat inference framework does not work with my finetuned model, or with the original model. Has anyone figured out how to get FastChat's fastchat.serve.cli to support openlm-research models?
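For reference, this is roughly the invocation that fails for me (the model path here is the Hugging Face ID for the v2 3B checkpoint; a local path to the finetuned model behaves the same way):

```shell
# Install FastChat with model-worker support, then launch the interactive CLI.
pip install "fschat[model_worker]"

# Works for Meta/Llama checkpoints, but not for openlm-research models in my setup:
python3 -m fastchat.serve.cli --model-path openlm-research/open_llama_3b_v2
```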

RDouglasSharp • Mar 17 '24