langchain
DOC: How to create a ChatModel from Hugging Face using a local LLM
Checklist
- [X] I added a very descriptive title to this issue.
- [X] I included a link to the documentation page I am referring to (if applicable).
Issue with current documentation:
langchain_community.llms.huggingface_hub.HuggingFaceHub was deprecated, and langchain_community.chat_models.huggingface.ChatHuggingFace only works with HuggingFaceTextGenInference, HuggingFaceEndpoint, and HuggingFaceHub LLMs.
So how do I create a ChatHuggingFace instance from a local Hugging Face model now?
Source:
https://api.python.langchain.com/en/latest/chat_models/langchain_community.chat_models.huggingface.ChatHuggingFace.html#langchain_community.chat_models.huggingface.ChatHuggingFace
https://api.python.langchain.com/en/latest/llms/langchain_community.llms.huggingface_hub.HuggingFaceHub.html#langchain_community.llms.huggingface_hub.HuggingFaceHub
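For reference, here is a minimal sketch of what I would expect a fully local setup to look like, assuming a recent version of the langchain-huggingface partner package whose ChatHuggingFace accepts a HuggingFacePipeline (the docstring above does not list it, so this is an assumption, and the model id is just an example):

```python
from langchain_huggingface import ChatHuggingFace, HuggingFacePipeline

# Load the model locally through a transformers pipeline (no API calls).
llm = HuggingFacePipeline.from_model_id(
    model_id="HuggingFaceH4/zephyr-7b-beta",  # example model, swap in your own
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 256},
)

# Assumption: ChatHuggingFace can wrap a local HuggingFacePipeline and will
# apply the model's chat template to incoming messages.
chat = ChatHuggingFace(llm=llm)
print(chat.invoke("What is LangChain?").content)
```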
Idea or request for content:
The documentation says HuggingFaceHub was deprecated and that I should use HuggingFaceEndpoint instead, but those are two different integration methods.
I am also confused by this. It seems LangChain can't use a Hugging Face chat model locally. Did you find a way to do it?
There's a library called langchain-huggingface at https://github.com/langchain-ai/langchain/tree/master/libs/partners/huggingface, but I can't find any docs about it.
Is HuggingFaceHub deprecated because it got moved to that separate library? That's what I would expect.
But the chat model in that library still only supports the API-based LLMs according to its docstring:

> Works with HuggingFaceTextGenInference, HuggingFaceEndpoint, and HuggingFaceHub LLMs.
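So, as far as the docstring goes, only the hosted route seems supported. A minimal sketch of that documented path, assuming langchain-huggingface is installed and you have a Hugging Face API token (the repo_id and token below are placeholders):

```python
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# Hosted inference via the Hugging Face Inference API (not local).
llm = HuggingFaceEndpoint(
    repo_id="HuggingFaceH4/zephyr-7b-beta",   # example model
    max_new_tokens=256,
    huggingfacehub_api_token="hf_...",        # your token here
)

chat = ChatHuggingFace(llm=llm)
print(chat.invoke("What is LangChain?").content)
```

That works, but it still calls out to the API, which is exactly what this issue is asking to avoid.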