Matthew Xiong
It looks like this behavior depends on which model you are using; switching to a chat model such as Llama-2-7b-chat-hf should solve this issue.
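For example, if you happen to be loading the model with the Hugging Face transformers library, the change is just swapping the model id to the chat-tuned variant (a minimal sketch, assuming the transformers stack; adjust to however you load the model):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Chat-tuned variant instead of the base Llama-2-7b-hf model
model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The chat model expects the Llama 2 chat prompt format,
# which the tokenizer's chat template applies for you.
messages = [{"role": "user", "content": "Hello!"}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```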