Support llama.cpp
It would be nice to support bindings from llama.cpp for question generation and embeddings. This is already supported in the main Python library.
This would actually be a great feature to have in place! Do you by any chance plan to add this anytime soon?
cc @nfcampos @hwchase17
+1 for the feature!
+1 for the feature!
+1 for the feature!
+1 yes please!
+1 for the feature!
-1
+1 for this!
would love to use a local model with LangChain.js!
I think this may be the support we were asking for: https://js.langchain.com/docs/modules/model_io/models/llms/integrations/llama_cpp
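For anyone landing here, a minimal sketch of what the linked integration looks like, assuming the `LlamaCpp` class and `modelPath` option shown in those docs (the import path has changed between versions, and the integration also expects `node-llama-cpp` to be installed alongside it):

```typescript
// Minimal sketch, not the definitive setup: import path and options follow
// the linked docs at the time; newer releases moved the class to
// @langchain/community/llms/llama_cpp.
import { LlamaCpp } from "langchain/llms/llama_cpp";

// Path to a local GGUF model file; adjust to wherever your model lives.
const llamaPath = "./models/llama-2-7b-chat.gguf";

const model = new LlamaCpp({ modelPath: llamaPath });

// Run a single completion against the local model
// (older versions use model.call() instead of model.invoke()).
const response = await model.invoke("Where do llamas come from?");
console.log(response);
```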
Good job! It seems to be developed by catai 👍
There seem to be some issues with using it with LangChain agents: https://github.com/hwchase17/langchainjs/discussions/2486
Hi, @hmd-ai
I'm helping the langchainjs team manage their backlog and am marking this issue as stale. The issue you raised requests support for bindings from llama.cpp for question generation and embeddings, with several users expressing interest in the feature. Additionally, bklynate shared a potential solution, along with some highlighted issues with using it in LangChain agents.
Could you please confirm if this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!