
[Feature] Use LocalDocs from the python bindings without the GUI

Open npham2003 opened this issue 2 years ago • 5 comments

Issue you'd like to raise.

I'd like to use GPT4All to make a chatbot that answers questions based on PDFs, and would like to know if there's any support for using the LocalDocs plugin without the GUI. I know that getting LocalDocs support for server mode is in progress, but until that's done is there any way to use the plugin locally through the python library?

Suggestion:

No response

npham2003 avatar Jul 03 '23 06:07 npham2003

It's implemented as part of the chat GUI; the Python bindings sit at a lower level.

You'd have to come up with something yourself.

That said, there is at least one other project that does something similar, called privateGPT -- I've never tried it myself, though.
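For the "come up with something yourself" route, the retrieval step can be sketched with nothing but the standard library. This is a minimal, hypothetical example: keyword overlap stands in for a real embedding similarity search, and the final call into the gpt4all python bindings is left as a comment (the model filename there is just an illustration):

```python
# Minimal retrieval-augmented prompt builder, stdlib only.
# Keyword overlap is a crude placeholder for embedding-based retrieval.

def chunk_text(text, size=200):
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(chunk, question):
    """Crude relevance score: number of shared lowercase words."""
    return len(set(chunk.lower().split()) & set(question.lower().split()))

def build_prompt(docs, question, top_k=2):
    """Pick the top_k most relevant chunks and prepend them to the question."""
    chunks = [c for d in docs for c in chunk_text(d)]
    best = sorted(chunks, key=lambda c: score(c, question), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return (f"Answer using the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {question}")

docs = ["GPT4All runs large language models locally.",
        "LocalDocs lets the chat GUI answer from your own files."]
prompt = build_prompt(docs, "What does LocalDocs do?")
print(prompt)

# With the python bindings, the prompt would then go to a local model, e.g.:
# from gpt4all import GPT4All
# print(GPT4All("orca-mini-3b.ggmlv3.q4_0.bin").generate(prompt))
```

Extracting text from PDFs and swapping in real embeddings is left to whichever libraries you prefer.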

cosmic-snow avatar Jul 07 '23 02:07 cosmic-snow

PrivateGPT works well -- I'd suggest it. GPT4All also comes with a built-in server mode, from my understanding. Refer to the documentation here: https://docs.gpt4all.io/gpt4all_chat.html#localdocs-roadmap

Server Mode

GPT4All Chat comes with a built-in server mode allowing you to programmatically interact with any supported local LLM through a very familiar HTTP API. You can find the API documentation here.

Enabling server mode in the chat client will spin up an HTTP server running on localhost port 4891 (the reverse of 1984). You can enable the web server via GPT4All Chat > Settings > Enable web server.

Begin using local LLMs in your AI-powered apps by changing a single line of code: the base path for requests.

And here is the code you can use:

import openai

# Note: this uses the pre-1.0 openai package; openai.Completion was removed in v1.0.

# Point the client at the local GPT4All server instead of OpenAI:
openai.api_base = "http://localhost:4891/v1"
# openai.api_base = "https://api.openai.com/v1"

openai.api_key = "not needed for a local LLM"

# Set up the prompt and other parameters for the API request
prompt = "Who is Michael Jordan?"

# model = "gpt-3.5-turbo"
# model = "mpt-7b-chat"
model = "gpt4all-j-v1.3-groovy"

# Make the API request
response = openai.Completion.create(
    model=model,
    prompt=prompt,
    max_tokens=50,
    temperature=0.28,
    top_p=0.95,
    n=1,
    echo=True,
    stream=False,
)

# Print the generated completion
print(response)
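Since server mode is plain HTTP, the same request can also be built without the openai package at all. Here is a sketch using only the standard library; the actual send is commented out because it requires the chat client to be running with the web server enabled:

```python
import json
import urllib.request

# Build the same completion request as the openai snippet above, by hand.
payload = {
    "model": "gpt4all-j-v1.3-groovy",
    "prompt": "Who is Michael Jordan?",
    "max_tokens": 50,
    "temperature": 0.28,
}
req = urllib.request.Request(
    "http://localhost:4891/v1/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Requires GPT4All Chat running with "Enable web server" checked:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))

print(req.full_url)
```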

DoingFedTime avatar Jul 08 '23 13:07 DoingFedTime

Would like to see Golang bindings for LocalDocs as well.

vodkadrunkinski avatar Mar 29 '24 05:03 vodkadrunkinski

And here is the code you can use:

@DoingFedTime How do you use LocalDocs with that code?

It seems it's not supported via the API: https://github.com/nomic-ai/gpt4all/issues/1200

vodkadrunkinski avatar Mar 29 '24 05:03 vodkadrunkinski