yxxx

5 comments by yxxx

Add the `stop` parameter, it works for me (the stop strings here are the Llama 3 Instruct terminator tokens):

```python
data = {
    'inputs': prompt,
    'parameters': {
        'max_new_tokens': 1024,
        # Llama 3 Instruct terminator tokens used as stop strings
        'stop': ["<|eot_id|>", "<|end_of_text|>"],
    },
}
```
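For context, a minimal sketch of sending this payload to a local text-generation-inference deployment (the endpoint URL is an assumption; adjust it to wherever your server is listening):

```python
import requests

# Assumed local TGI endpoint; replace with your deployment's address.
TGI_URL = "http://localhost:8080/generate"

prompt = "<your formatted Llama 3 prompt>"

data = {
    "inputs": prompt,
    "parameters": {
        "max_new_tokens": 1024,
        # Llama 3 Instruct terminator tokens used as stop strings
        "stop": ["<|eot_id|>", "<|end_of_text|>"],
    },
}

resp = requests.post(TGI_URL, json=data, timeout=120)
resp.raise_for_status()
print(resp.json()["generated_text"])
```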

> Yes it is. And hf-chat sends that stop token currently.

Why does my local deployment of llama3-70b-instruct perform worse than Hugging Chat when answering the same questions? Hugging Chat...

> > Yes it is. And hf-chat sends that stop token currently.
>
> Why does my local deployment of llama3-70b-instruct perform worse than Hugging Chat when answering the same...
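One thing worth checking when comparing a local deployment against Hugging Chat is the prompt format. A minimal sketch, assuming the `transformers` tokenizer for the same checkpoint, of rendering the conversation with the model's chat template before sending it to the server:

```python
from transformers import AutoTokenizer

# Assumed model id; use the same checkpoint your local deployment serves.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-70B-Instruct")

messages = [
    {"role": "user", "content": "Why is the sky blue?"},
]

# Render the conversation with the model's chat template so the local
# prompt matches what chat frontends send.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```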

I haven't had time to add a writeup yet; I've added some writeups submitted by others to the homepage, which you can refer to.

Check the stack trace to see what error it reports.