TAO JIANG
Have you solved it? I'm getting the same error.
> I know why... In `actor.py`, the checkpoint is saved as the value of the key "model", whereas the LLaMA model is saved without that wrapping.
>
> So in `llama_model.py`, `def load_checkpoints` should be modified accordingly.

Thank you, after reading your answer I successfully got the code to run on multiple GPUs, but...
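For anyone else hitting this, here is a minimal sketch of the kind of change described above. It assumes `actor.py` saves the checkpoint as `{"model": state_dict}`, that `load_checkpoints` takes a checkpoint path, and that the wrapped LLaMA module is reachable as `self.model`; the exact signature in `llama_model.py` may differ:

```python
import torch

def load_checkpoints(self, ckpt_path: str):
    # Sketch only: checkpoints written by actor.py wrap the state dict
    # under the "model" key, while plain LLaMA checkpoints do not.
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    if isinstance(checkpoint, dict) and "model" in checkpoint:
        # Unwrap the actor.py-style checkpoint before loading.
        checkpoint = checkpoint["model"]
    self.model.load_state_dict(checkpoint)
```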
> She seems to be Japanese, does Japan also use QQ?😊
> You have good eyesight!!
> You have good eyesight!!

Did you run it successfully? I think we can provide a QQ group chat account...
> Known problem sadly :( I just didn't know how to fix it - I'm assuming you're calling `get_chat_template` more than once in the notebook, correct?

No, I just got...
I also want to know whether it can support ONNX models.