
Langchain-Chatchat (formerly Langchain-ChatGLM): a local knowledge base question-answering application built on Langchain and LLMs such as ChatGLM.

Results: 929 Langchain-Chatchat issues

After the knowledge base is loaded, a single V100 32G still needs more than 20 s to answer domain-specific questions; without typewriter-style, token-by-token streaming output, the user experience is fairly painful...
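ChatGLM-6B already exposes a streaming interface, so a typewriter-style output can be built on top of it. A minimal sketch, assuming the THUDM/chatglm-6b checkpoint and a CUDA device; stream_chat() yields the partial answer at each step, so the caller only has to print the newly generated characters:

from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda().eval()

printed = 0
for response, history in model.stream_chat(tokenizer, "What is a vector database?", history=[]):
    print(response[printed:], end="", flush=True)  # emit only the new characters ("typewriter" effect)
    printed = len(response)
print()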

Is it possible, by designing the system_template, to make the model answer "I don't know" when none of the retrieved documents are really relevant?
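A hedged sketch of such a template (the variable and template names are illustrative, not the project's actual configuration keys): the instruction explicitly tells the model to refuse when the retrieved context does not contain the answer.

PROMPT_TEMPLATE = """Answer the question strictly based on the context below.

Context:
{context}

Question: {question}

If the context does not contain the answer, reply exactly with "I don't know" and do not make anything up."""

# toy values just to make the sketch runnable
related_docs_text = "FAISS is a library for similarity search over dense vectors."
query = "How do I configure a Kubernetes ingress?"
prompt = PROMPT_TEMPLATE.format(context=related_docs_text, question=query)
print(prompt)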

you must pass the application as an import string to enable "reload" or "workers". What is the reason for this?
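This message comes from uvicorn: with reload=True or workers > 1 it must be able to re-import the application in child processes, so the app has to be passed as an import string rather than as an object. A minimal sketch, assuming the FastAPI instance is a module-level variable named app in api.py (host and port are illustrative):

import uvicorn

# fails with the message above, because an app *object* cannot be re-imported by the reloader:
# uvicorn.run(app, host="0.0.0.0", port=7861, reload=True)

# works: pass "module:attribute" so the reload/worker subprocesses can import the app themselves
uvicorn.run("api:app", host="0.0.0.0", port=7861, reload=True)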

Traceback (most recent call last):
  File "/home/heyiheng/work/langchain/langchain-ChatGLM/webui.py", line 155, in <module>
    chatbot = gr.Chatbot([[None, init_message], [None, model_status.value]],
  File "/home/young/anaconda3/envs/sakura/lib/python3.10/site-packages/gradio/components.py", line 3862, in __init__
    IOComponent.__init__(
  File "/home/young/anaconda3/envs/sakura/lib/python3.10/site-packages/gradio/components.py", line 185, in __init__
    else...

ValidationError: 1 validation error for HuggingFaceEmbeddings
model_kwargs
  extra fields not permitted (type=value_error.extra)
Loading checkpoint shards: 100%|██████████| 8/8 [00:12
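The pydantic error means the installed langchain release's HuggingFaceEmbeddings class has no model_kwargs field, so the extra keyword argument is rejected. A hedged sketch of the two usual ways out, assuming the text2vec embedding model the project defaults to (verify the field name against your installed langchain version):

from langchain.embeddings import HuggingFaceEmbeddings

# on a recent langchain, model_kwargs is a declared field and this is accepted
embeddings = HuggingFaceEmbeddings(
    model_name="GanymedeNil/text2vec-large-chinese",
    model_kwargs={"device": "cuda"},
)

# on an older langchain, drop the keyword and move the underlying
# SentenceTransformer (exposed as .client) to the GPU afterwards
embeddings = HuggingFaceEmbeddings(model_name="GanymedeNil/text2vec-large-chinese")
embeddings.client = embeddings.client.to("cuda")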

Hello author! I see that the currently supported models are chatglm-6b-int4-qe, chatglm-6b-int4, chatglm-6b and chatyuan. Are models fine-tuned from ChatGLM (P-Tuning or LoRA) also supported? When loading my locally fine-tuned checkpoint I get the following warning and the model fails to load: Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at (my path)/ChatGLM-6B/ptuning/output/adgen-chatglm-6b-pt-2048-2e-2/checkpoint-3000 and are newly initialized: ['transformer.layers.5.mlp.dense_4h_to_h.bias', 'transformer.layers.13.mlp.dense_4h_to_h.bias', 'transformer.layers.21.attention.dense.weight', 'transformer.layers.3.mlp.dense_4h_to_h.bias', 'transformer.layers.17.mlp.dense_h_to_4h.weight', 'transformer.layers.19.attention.dense.bias', 'transformer.layers.6.mlp.dense_h_to_4h.weight', 'transformer.layers.18.input_layernorm.bias', 'transformer.layers.22.input_layernorm.weight',...
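The warning appears because a P-Tuning v2 checkpoint only contains the prefix encoder, not the full model, so it cannot be loaded with from_pretrained() alone. A sketch of the loading recipe described in the ChatGLM-6B P-Tuning README (the path and pre_seq_len are illustrative and must match the training run):

import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

CHECKPOINT_PATH = "ptuning/output/adgen-chatglm-6b-pt-2048-2e-2/checkpoint-3000"

# load the original chatglm-6b weights, telling the config that a prefix encoder exists
config = AutoConfig.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True, pre_seq_len=128)
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", config=config, trust_remote_code=True)

# overlay only the prefix-encoder weights from the fine-tuned checkpoint
prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin"))
new_prefix_state_dict = {
    k[len("transformer.prefix_encoder."):]: v
    for k, v in prefix_state_dict.items()
    if k.startswith("transformer.prefix_encoder.")
}
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

model = model.half().cuda()
model.transformer.prefix_encoder.float()
model = model.eval()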

It would be good to improve the formatting of code snippets in the answers; the current output format is not friendly. For example: Q: Please give me a Kubernetes manifest for creating a pod. A: OK, here is a basic Kubernetes Pod manifest that you can modify as needed: apiVersion: v1kind: Podmetadata: name: my-podspec: containers: - name: my-container image: my-image:latest ports: - containerPort: 8080 - name: my-other-container image: my-image:latest ports: -...
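One possible mitigation, sketched with a purely hypothetical helper that is not part of the repo: gr.Chatbot renders messages as Markdown, so bare newlines in a generated manifest collapse into one line; wrapping code-looking answers in a fenced block before display keeps the line breaks.

def format_code_answer(answer: str) -> str:
    if "```" in answer:  # the model already produced a fenced block, leave it untouched
        return answer
    looks_like_yaml = any(key in answer for key in ("apiVersion:", "kind:", "metadata:"))
    if looks_like_yaml:
        return "```yaml\n" + answer + "\n```"  # fence the manifest so Markdown preserves the newlines
    return answer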

The current webui.py and cli_demo.py do not actually pass the VECTOR_SEARCH_TOP_K and LLM_HISTORY_LEN parameters through, so I modified them so that these two parameters are passed into the LocalDocQA class.
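A hedged sketch of what passing the two values through could look like, assuming LocalDocQA.init_cfg accepts llm_history_len and top_k keyword arguments (check the signature in chains/local_doc_qa.py of your checkout; the config names come from configs/model_config.py):

from chains.local_doc_qa import LocalDocQA
from configs.model_config import (
    EMBEDDING_MODEL,
    LLM_MODEL,
    LLM_HISTORY_LEN,
    VECTOR_SEARCH_TOP_K,
)

local_doc_qa = LocalDocQA()
local_doc_qa.init_cfg(
    llm_model=LLM_MODEL,
    embedding_model=EMBEDDING_MODEL,
    llm_history_len=LLM_HISTORY_LEN,   # forwarded explicitly instead of relying on the default
    top_k=VECTOR_SEARCH_TOP_K,         # forwarded explicitly instead of relying on the default
)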

On Mac, running python3 ./webui.py raises TypeError: The type of ChatGLM.callback_manager differs from the new default value; if you wish to change the type of this field, please use a type...