gptel
Support for privateGPT, including RAG functions like respecting context and parsing of sources
Hi, as suggested in our recent discussion #305, here's a PR that adds support for interacting with privateGPT. It's based on the OpenAI backend, with some additions. The focus was to enable the use of two specific keywords, as described in the API reference (a request sketch follows the list):
- `use_context`: directs the LLM to use the context of documents that have been "ingested"
- `include_sources`: directs the LLM to also return information about the sources of the context (so far: file name and page number, if appropriate)
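For illustration, this is roughly how the two flags end up in the request body when built from Emacs Lisp. This is a minimal sketch, not the code in the PR; the model name is a placeholder, and the remaining fields follow the usual OpenAI-style chat format that privateGPT's API mimics:

```elisp
(require 'json)

;; Sketch of a privateGPT chat-completion request body.  The two new
;; keywords ride alongside the usual OpenAI-style fields; both are
;; hardcoded to t here, mirroring the current state of the PR.
(json-encode
 `(:model "private-gpt"                 ;placeholder model name
   :messages [(:role "user"
               :content "What does the ingested document say about X?")]
   :use_context t
   :include_sources t))
```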
I changed the code of `gptel-openai.el` to send the additional keywords to the LLM server, and to correctly parse the answer if sources are provided.
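The parsing side is the less obvious part, so here is a rough sketch of the idea (not the exact code in this PR): when a choice in the response carries a `sources` array, pull out the file name and page for each entry so they can be appended to the inserted text. The helper name and the nested field names (`:sources`, `:document`, `:doc_metadata`, `:file_name`, `:page_label`) are assumptions for illustration:

```elisp
(require 'subr-x)

;; Sketch: collect the sources attached to a privateGPT response.
;; RESPONSE is the JSON response parsed into a plist, the way
;; gptel-openai.el reads responses elsewhere.
(defun my/gptel-privategpt-sources (response)
  "Return a newline-separated list of sources in RESPONSE, or nil."
  (when-let* ((choices (plist-get response :choices))
              (sources (plist-get (aref choices 0) :sources)))
    (mapconcat
     (lambda (src)
       (let ((meta (plist-get (plist-get src :document) :doc_metadata)))
         (format "- %s (page %s)"
                 (plist-get meta :file_name)
                 (plist-get meta :page_label))))
     sources "\n")))
```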
I tested the code with different queries, and so far it works for my use case. However, there is at least one missing feature: currently, the two new keywords `use_context` and `include_sources` are hardcoded to `t`. It would probably be better to make them configurable when creating the backend. Unfortunately, I'm not sure how to do that correctly; one possible direction is sketched below.
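One idea (just a sketch of the direction, not something I've tested): give privateGPT its own backend struct derived from the existing `gptel-openai` struct, with the two flags as slots, and have the request builder consult those slots instead of inserting a literal `t`. The struct and constructor names below are made up for illustration:

```elisp
(require 'cl-lib)
(require 'gptel)   ;for the `gptel-openai' backend struct

;; Hypothetical: a dedicated privateGPT backend with the two flags as
;; configurable slots.  `gptel-privategpt' and `gptel--make-privategpt'
;; are placeholder names.
(cl-defstruct (gptel-privategpt (:include gptel-openai)
                                (:copier nil)
                                (:constructor gptel--make-privategpt))
  (use-context t)
  (include-sources t))

;; The request builder could then read the slots, e.g.
;;   (gptel-privategpt-use-context gptel-backend)
;; and users could set them when defining the backend.
```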