
Request for documentation on integrating Langchain with GPT4Free

Open PhillipRt opened this issue 1 year ago • 5 comments

Hello GPT4Free maintainers,

I've been exploring the GPT4Free project and would like to use it in conjunction with Langchain (python.langchain.com). I'm particularly interested in using Langchain to run queries against GPT4All over a single document-based knowledge source, as described on blog.ouseful.info. However, I haven't found any explicit documentation or examples on how to integrate Langchain with GPT4Free.

I would appreciate it if you could provide some guidance or documentation on how to achieve this integration. In particular, I'm looking for information on:

  1. How to make API requests to GPT4Free using Langchain.
  2. How to process the generated text from GPT4Free using Langchain (e.g., translation, summarization, etc.).

Any assistance or pointers in the right direction would be greatly appreciated. Thank you for your time and your work on this project!

PhillipRt avatar May 08 '23 21:05 PhillipRt

Same Question 👍🏽

toukoum avatar May 09 '23 14:05 toukoum

Amazing

white-elephant-li avatar May 12 '23 07:05 white-elephant-li

waiting for this!

bhaskoro-muthohar avatar May 14 '23 08:05 bhaskoro-muthohar

Hello World

from typing import Optional, List, Mapping, Any
from langchain.llms.base import LLM
import gpt4free
from gpt4free import Provider

class EducationalLLM(LLM):
    
    @property
    def _llm_type(self) -> str:
        return "custom"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        return gpt4free.Completion.create(Provider.You, prompt=prompt)
        

llm = EducationalLLM()
 
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}? Just tell one and only the name",
)

from langchain.chains import LLMChain
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("colorful socks"))

lindermanqms1984 avatar May 15 '23 12:05 lindermanqms1984

Not respecting the stop words will cause the LLM to hallucinate tool usage and responses. Here is an example of how to truncate the response from gpt4free when the `create` method has no stop-word support:

class EducationalLLM(LLM):
    
    @property
    def _llm_type(self) -> str:
        return "custom"
    
    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        out = gpt4free.Completion.create(Provider.You, prompt=prompt)
        if stop:
            stop_indexes = (out.find(s) for s in stop if s in out)
            min_stop = min(stop_indexes, default=-1)
            if min_stop > -1:
                out = out[:min_stop]
        return out
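
The truncation logic above can be pulled into a small standalone helper and checked without any provider. A minimal sketch (the helper name `truncate_at_stop` is mine, not part of gpt4free or Langchain):

```python
from typing import List, Optional


def truncate_at_stop(out: str, stop: Optional[List[str]] = None) -> str:
    """Cut `out` at the earliest occurrence of any stop sequence."""
    if stop:
        # Indexes of each stop string that actually appears in the output.
        stop_indexes = (out.find(s) for s in stop if s in out)
        min_stop = min(stop_indexes, default=-1)
        if min_stop > -1:
            out = out[:min_stop]
    return out


print(truncate_at_stop("Thought: done\nObservation: x", ["\nObservation:"]))  # Thought: done
print(truncate_at_stop("no stops here", ["\nObservation:"]))                  # no stops here
```

This mirrors what agent executors expect: everything the model emits after the first stop sequence is discarded so the agent loop can inject the real observation itself.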

Faleij avatar May 18 '23 15:05 Faleij

can we use agents with this?

biegaj avatar Aug 02 '23 16:08 biegaj

Yes 👍 you can use everything from Langchain!

toukoum avatar Aug 02 '23 19:08 toukoum

Updated code-

from typing import Optional, List, Mapping, Any
from langchain.llms.base import LLM
# import gpt4free
# from gpt4free import Provider
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
import g4f


class EducationalLLM(LLM):

    @property
    def _llm_type(self) -> str:
        return "custom"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        out = g4f.ChatCompletion.create(
            model=g4f.models.gpt_4,
            messages=[{"role": "user", "content": prompt}],
        )
        if stop:
            stop_indexes = (out.find(s) for s in stop if s in out)
            min_stop = min(stop_indexes, default=-1)
            if min_stop > -1:
                out = out[:min_stop]
        return out


llm = EducationalLLM()

prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}? Just tell one and only the name",
)

chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("colorful socks"))

Dipeshpal avatar Oct 13 '23 09:10 Dipeshpal

I created a project that exposes g4f through an OpenAI-compatible API; it may help: https://github.com/tgscan-dev/g4f-api
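
If the proxy speaks the OpenAI wire format, any client only needs the right URL and JSON body. A minimal sketch of the request shape an OpenAI-compatible `/v1/chat/completions` endpoint expects (the base URL and helper name are hypothetical; point them at wherever g4f-api is actually listening):

```python
import json


def build_chat_request(base_url: str, model: str, prompt: str) -> tuple:
    """Return (url, payload) for an OpenAI-style /v1/chat/completions call."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload


url, payload = build_chat_request("http://localhost:8000", "gpt-3.5-turbo", "Hello")
print(url)                # http://localhost:8000/v1/chat/completions
print(json.dumps(payload))
```

Any tool that lets you override the OpenAI base URL should then be able to talk to the proxy with this same request shape.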

tgscan-dev avatar Oct 21 '23 04:10 tgscan-dev

I created a project that exposes g4f through an OpenAI-compatible API; it may help: https://github.com/tgscan-dev/g4f-api

That is awesome! Does that mean we can now integrate gpt4free into AutoGPT, GPT-Pilot, agents, etc., by pointing them at the local IP address in place of the OpenAI endpoint? Or will there be issues where only simple text-based responses are available, making it unusable for AutoGPT? @xtekky, I read your comments about AutoGPT saying something similar.

keithorange avatar Oct 21 '23 23:10 keithorange

@xtekky https://github.com/xtekky/gpt4free/issues/258#issuecomment-1528761387

keithorange avatar Oct 22 '23 00:10 keithorange

Yes 👍 you can use everything from langchain !

Hey @toukoum, I was trying to use Langchain agents with g4f, but it's not working.

e.g., OpenAI tools are not working with g4f results.

cool-dev-guy avatar Jan 25 '24 02:01 cool-dev-guy

How can I use `bind_tools` from Langchain with g4f? Please help.

azmi2104 avatar Jun 15 '24 12:06 azmi2104
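
A hedged note on `bind_tools`: it belongs to Langchain's chat-model interface, so it won't work with the text-completion `LLM` wrappers shown earlier in this thread; tool calling needs a backend that accepts an OpenAI-style `tools` array in the request. A sketch of the entry shape that array expects (the `get_weather` tool is purely illustrative):

```python
import json


def tool_schema(name: str, description: str, parameters: dict) -> dict:
    """One OpenAI-style tool entry, as expected in a chat completion's `tools` list."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,  # JSON Schema for the tool's arguments
        },
    }


tools = [
    tool_schema(
        "get_weather",
        "Look up the current weather for a city.",
        {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    )
]
print(json.dumps(tools, indent=2))
```

If the g4f provider (or a proxy in front of it) does not honor this field, the model will never emit tool calls, which matches the behavior reported above.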

Updated:

from typing import Optional, List, Mapping, Any
from langchain.llms.base import LLM
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
import g4f
from g4f.client import Client


class EducationalLLM(LLM):

    @property
    def _llm_type(self) -> str:
        return "custom"

    def _call(self, prompt: str, stop: Optional[List[str]] = None, model: str = "gpt-4o") -> str:
        client = Client()
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
        )
        out = response.choices[0].message.content

        if stop:
            stop_indexes = (out.find(s) for s in stop if s in out)
            min_stop = min(stop_indexes, default=-1)
            if min_stop > -1:
                out = out[:min_stop]
        return out


llm = EducationalLLM()

prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}? Just tell one and only the name",
)

chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("colorful socks"))

avijitbhuin21 avatar Aug 10 '24 15:08 avijitbhuin21