
Integration with BLOOM, FLAN, GPT-NEO, GPT-J etc.

Open jasonmhead opened this issue 2 years ago • 4 comments

Where would be the best place to look for how to integrate LangChain with BLOOM, FLAN, GPT-NEO, GPT-J, etc., outside of a pre-existing cloud service, so it could be used locally or with an API set up by a developer?

And if that isn't available yet, what should be considered when developing and contributing integration code?

jasonmhead avatar Feb 18 '23 13:02 jasonmhead

You can use the Hugging Face LLM wrapper to load your custom LLM there.

This is a good place to start https://langchain.readthedocs.io/en/latest/modules/llms/integrations.html
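As a rough illustration (not the only way to do it), loading one of these models locally through LangChain's Hugging Face pipeline wrapper might look something like the sketch below; the model id, task, and generation settings are just placeholders, and you need `transformers` and `torch` installed.

```python
# Minimal sketch: run a Hub model (e.g. BLOOM) locally via LangChain's Hugging Face pipeline wrapper.
# Assumes `pip install langchain transformers torch`; model id and kwargs are illustrative only.
from langchain.llms import HuggingFacePipeline

llm = HuggingFacePipeline.from_model_id(
    model_id="bigscience/bloom-1b7",   # any causal LM from the Hub, e.g. GPT-Neo or GPT-J
    task="text-generation",
    model_kwargs={"temperature": 0.7, "max_length": 64},
)

# The wrapper behaves like any other LangChain LLM, so it can drop into chains and agents.
print(llm("Explain what LangChain does in one sentence:"))
```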

nqbao avatar Feb 18 '23 23:02 nqbao

Would someone incur any storage or compute costs?

What if someone wanted to host their own model? How would they go about developing that?


jasonmhead avatar Feb 19 '23 03:02 jasonmhead

@jasonmhead one way would be to upload your model to the Hugging Face Hub
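Then you can point LangChain at it through the Hub wrapper. Something roughly like this, assuming the `HUGGINGFACEHUB_API_TOKEN` environment variable is set and the repo id (here a public FLAN model as a stand-in for your own upload) and parameters are just examples:

```python
# Minimal sketch: call a model hosted on the Hugging Face Hub via its Inference API.
# Assumes `pip install langchain huggingface_hub` and HUGGINGFACEHUB_API_TOKEN set in the environment.
from langchain.llms import HuggingFaceHub

llm = HuggingFaceHub(
    repo_id="google/flan-t5-xl",  # replace with the repo id of your uploaded model
    model_kwargs={"temperature": 0.5, "max_length": 64},
)

print(llm("Translate to German: Hello, how are you?"))
```

Note that the hosted Inference API route means the compute runs on Hugging Face's side, so the usual Hub usage limits or paid tiers apply rather than your own hardware.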

batmanscode avatar Feb 19 '23 23:02 batmanscode

@jasonmhead you can use Manifest to run your own model locally https://langchain.readthedocs.io/en/latest/modules/llms/integrations/manifest.html
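Something along these lines, assuming you already have a Manifest server running in front of your local model; the backend name, connection URL, and generation parameters below are only examples:

```python
# Minimal sketch: route LangChain calls to a locally hosted model through Manifest.
# Assumes `pip install langchain manifest-ml` and a Manifest server already running at the URL below.
from manifest import Manifest
from langchain.llms.manifest import ManifestWrapper

manifest = Manifest(
    client_name="huggingface",                  # backend serving the local model
    client_connection="http://127.0.0.1:5000",  # address of your Manifest server
)

llm = ManifestWrapper(
    client=manifest,
    llm_kwargs={"temperature": 0.001, "max_tokens": 256},
)

print(llm("What is the capital of France?"))
```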

vzeman avatar Feb 25 '23 13:02 vzeman

Hi, @jasonmhead! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue is about integrating LangChain with models like BLOOM, FLAN, GPT-NEO, and GPT-J. You were seeking guidance on how to develop and contribute integration code if it is not already available. In the comments, users suggested using the Hugging Face LLM wrapper to load a custom LLM and uploading the model to the Hugging Face Hub. Another user mentioned using Manifest to run the model locally.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project!

dosubot[bot] avatar Sep 19 '23 16:09 dosubot[bot]
