LangChain-ChatGLM-Webui
After loading the BELLE or Vicuna model, asking a question raises an error?
chatglm-6B-int8 loads fine and answers questions normally, but after loading the BELLE-7b or Vicuna-7b model, asking a question shows ERROR on the page, and the backend logs the following:
TypeError: The current model class (LlamaModel) is not compatible with .generate(), as it doesn't have a language model head. Please use one of the following classes instead: {'LlamaForCausalLM'}
A breakpoint traces the error to this line in the get_knowledge_based_answer function of the KnowledgeBasedChatLLM class: result = knowledge_chain({"query": query})
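The error itself points at the cause: LlamaModel is the bare transformer backbone with no language-model head, so it cannot run .generate(); the weights need to be loaded through LlamaForCausalLM (or AutoModelForCausalLM) instead. A minimal sketch, assuming transformers and torch are installed, that reproduces the mismatch with a tiny randomly initialised config (all sizes here are arbitrary, chosen only so the example runs without downloading any checkpoint):

```python
import torch
from transformers import LlamaConfig, LlamaForCausalLM, LlamaModel

# Tiny, arbitrary config so the example runs without pretrained weights
config = LlamaConfig(vocab_size=64, hidden_size=16, intermediate_size=32,
                     num_hidden_layers=1, num_attention_heads=2)

input_ids = torch.tensor([[1, 2, 3]])

# The bare backbone has no LM head, so .generate() is rejected
# (TypeError in older transformers versions, AttributeError in newer ones).
bare_error = None
try:
    LlamaModel(config).generate(input_ids, max_new_tokens=2)
except (TypeError, AttributeError) as e:
    bare_error = e
print("bare LlamaModel:", type(bare_error).__name__)

# LlamaForCausalLM adds the language-model head, so .generate() works.
lm = LlamaForCausalLM(config)
out = lm.generate(input_ids, max_new_tokens=2, do_sample=False)
print("LlamaForCausalLM output shape:", tuple(out.shape))
```

So the likely fix is in the model-loading code rather than in get_knowledge_based_answer: for the BELLE/Vicuna branch, load the checkpoint with AutoModelForCausalLM.from_pretrained(...) (which resolves to LlamaForCausalLM) instead of a class that resolves to LlamaModel.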