ChatGLM-6B
[Feature] Could you consider adding a demo that calls ChatGLM via LangChain?
Is your feature request related to a problem? Please describe.
No response
Solutions
To make it easier to build ChatGLM applications that index local data/files, could you consider adding a demo that calls the ChatGLM model through LangChain?
Additional context
No response
I gave this a try and hit an error:

```
Traceback (most recent call last):
  File "/data/wuhaixu/mygpt/ChatGLM-6B/tests/test_langchain.py", line 7, in <module>
    llm_chain = LLMChain(prompt=prompt, llm=HuggingFaceHub(repo_id="THUDM/chatglm-6b", model_kwargs={"temperature":0, "max_length":64}))
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for HuggingFaceHub
__root__
  Task not specified in the repository. Please add it to the model card using pipeline_tag (https://huggingface.co/docs#how-is-a-models-type-of-inference-api-and-widget-determined) (type=value_error)
```
I asked ChatGPT, and it said:

> It sounds like you may be encountering an issue with a machine learning model repository where a task is not specified for a particular model. To solve this problem, you can add a pipeline_tag to the model card that specifies the task or use case for which the model was designed.
>
> A pipeline_tag is a metadata field that provides information about the intended use case for a particular model. For example, if the model was trained for sentiment analysis, you could add a pipeline_tag for "sentiment analysis." This will make it easier for users to understand the model's purpose and suitability for their own use cases.
>
> To add a pipeline_tag, you will need to access the model card for the repository and look for the metadata fields. Depending on the repository platform, this could be called something like "tags" or "metadata." Add the appropriate pipeline_tag to indicate the task for which the model was designed.
>
> If you do not have permission to edit the model card or if the repository does not support pipeline_tags, you can contact the repository owner or submit an issue to request that the task be added to the model card.
So it seems the repository owner may need to add the task to the model card.
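For reference, the fix ChatGPT describes amounts to declaring a `pipeline_tag` in the YAML front matter of the model repo's `README.md` on the Hugging Face Hub. A hypothetical sketch (the tag values are assumptions, not the actual THUDM/chatglm-6b model card):

```yaml
# YAML front matter at the top of the model repo's README.md
---
pipeline_tag: text-generation
tags:
  - chatglm
---
```

The `HuggingFaceHub` wrapper reads this tag to decide which Inference API task to call, which is why it raises the validation error when the tag is missing.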
@NLPpupil I've got it working by creating a CustomLLM class; I'm planning to publish a repo later.
@imClumsyPanda

> @NLPpupil I've got it working by creating a CustomLLM class; I'm planning to publish a repo later.

Awesome! Could you share how you did it? Waiting online, it's kind of urgent.
@NLPpupil You can take a look at https://langchain.readthedocs.io/en/latest/modules/llms/examples/custom_llm.html and https://gist.github.com/DamascusGit/40bf1e9fd63974a95d071e3508e3f7c3
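For anyone looking for the shape of such a wrapper: below is a minimal sketch of the two members a LangChain custom LLM needs (`_llm_type` and `_call`), written as a plain class so the sketch stays self-contained; in a real project you would subclass `langchain.llms.base.LLM` as the linked example shows. The `ChatGLMLLM` name and the injected `model`/`tokenizer` attributes are my own assumptions, not code from this thread.

```python
from typing import Any, List, Optional, Tuple


class ChatGLMLLM:
    """Sketch of a LangChain-style custom LLM wrapper for ChatGLM-6B.

    In a real project, subclass ``langchain.llms.base.LLM`` instead; the
    subclass only needs to provide ``_llm_type`` and ``_call``.
    """

    def __init__(self, model: Any, tokenizer: Any) -> None:
        self.model = model          # a loaded THUDM/chatglm-6b model object
        self.tokenizer = tokenizer  # its matching tokenizer
        self.history: List[Tuple[str, str]] = []

    @property
    def _llm_type(self) -> str:
        return "chatglm-6b"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # ChatGLM-6B's remote-code model object exposes a .chat() helper
        # that returns (response, updated_history).
        response, self.history = self.model.chat(
            self.tokenizer, prompt, history=self.history
        )
        return response
```

With a real model you would first load it, e.g. via `AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)` and the matching `AutoTokenizer`, then pass both into the wrapper.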
@imClumsyPanda
Thanks! I think I've got it now.
https://langchain.readthedocs.io/en/latest/modules/llms/examples/custom_llm.html
Looking forward to your repo.
@zhongtao93 I've now implemented it directly with LangChain. I'll upload it tonight; I also built an API version using FastAPI, but I'll upload the direct implementation first.
@zhongtao93 Here it is: https://github.com/imClumsyPanda/langchain-ChatGLM
Awesome work!