[Feature Request]: Multiple Prompt.py files for different use-cases

Open frederikhendrix opened this issue 8 months ago • 5 comments

Do you need to file a feature request?

  • [ ] I have searched the existing feature requests and this feature request is not already filed.
  • [x] I believe this is a legitimate feature request, not just a question or bug.

Feature Request Description

I am currently planning to use the LightRAG implementation for multiple different use-cases. To guide the AI a bit better for a specific use-case, I change the prompts, the examples, and the rag_response, which ultimately leads to better results.

But when I want to use it for a different use-case, I'm not able to have a second or third prompt.py file that I can plug in at different endpoints.

Maybe it would be an idea to add the name of the prompt.py file you want to use to the query_param variable, and also to the settings, with a default configured there?

And then do something like this:

Directory structure:

my_application/
└── prompts/
    ├── __init__.py
    ├── default.py
    └── alternative.py

prompts/__init__.py:

import os
from typing import Any, Dict, Optional
from . import default, alternative

def get_prompts(prompt_type: Optional[str] = None) -> Dict[str, Any]:
    # Use the provided prompt_type, or fall back to the environment variable if not provided.
    style = (prompt_type if prompt_type is not None else os.getenv("PROMPTS_STYLE", "default")).lower()
    
    if style == "alternative":
        return alternative.PROMPTS
    return default.PROMPTS

GRAPH_FIELD_SEP = default.GRAPH_FIELD_SEP  # You can also decide based on style if needed.

Now you can simply import from the package:

from prompts import GRAPH_FIELD_SEP, get_prompts

# Using default (either through env or as fallback)
PROMPTS = get_prompts()

# Explicitly using the alternative configuration
PROMPTS_alt = get_prompts(prompt_type="alternative")

This way, depending on your use-case, you can opt for a specific prompt.py file with descriptive prompts.

Additional Context

No response

frederikhendrix avatar Apr 11 '25 08:04 frederikhendrix

Multiple prompt templates would also apply during the document indexing stage. Do we have a comprehensive and user-friendly solution that allows users to select different templates during both the indexing and query phases?

danielaskdd avatar Apr 12 '25 00:04 danielaskdd

Proposal: Three Methods for Prompt Template Selection

I recommend implementing the following three approaches to select prompt templates by name:

  1. Environment Variable: Configure via DEFAULT_PROMPT_TEMPLATE.
  2. API Endpoint: Expose an API to allow users to dynamically switch the active template.
  3. Query Parameter: Support per-request template selection by passing a prompt_template parameter.
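As a rough illustration of how these three sources could be combined, the sketch below resolves the template name with the per-request parameter taking precedence over an API-set default, which in turn falls back to the environment variable. The function and variable names are assumptions for illustration, not existing LightRAG options.

import os
from typing import Optional

# Hypothetical process-wide default, changeable through an API endpoint (method 2).
_active_template: Optional[str] = None

def set_active_template(name: str) -> None:
    # Would back the API endpoint that switches the active template.
    global _active_template
    _active_template = name

def resolve_template(prompt_template: Optional[str] = None) -> str:
    # Precedence: query parameter (3) > API-set default (2) > environment variable (1).
    if prompt_template:
        return prompt_template
    if _active_template:
        return _active_template
    return os.getenv("DEFAULT_PROMPT_TEMPLATE", "default")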

@LarFii

danielaskdd avatar Apr 12 '25 08:04 danielaskdd

Multiple Sets of Prompt Templates Requirement

  • Template Types: Entity-relationship extraction templates, query templates, entity-relationship merging templates

  • Template Versions: Each template type has multiple versions. Users can switch between different versions and set the current and default versions. The default version will be used as a fallback in case the specified version does not exist.

  • Prompt Macros: Each version’s template file can contain multiple prompt macros, which support variable substitution.

  • Version Fallback: If a certain prompt macro is missing in a specific version, it will be automatically replaced by the macro from the default version. The default version must include all prompt macros.

  • Variable Substitution Logic: Variables in prompt macros can have default values. If a substitution value is not provided during use, the default value will be used automatically. For macros without default values, substitution variables must be provided when used. If a substitution is provided for a non-existent variable, it will be ignored and a warning will be logged.
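A minimal sketch of the fallback and substitution rules described above, assuming a simple {variable} / {variable:default} placeholder syntax; the names and data structures are illustrative only, not the actual implementation:

import logging
import re

logger = logging.getLogger(__name__)

# Macros stored per version; the "legal" version omits macros and falls back to "default".
PROMPT_VERSIONS = {
    "default": {"rag_response": "Answer in {language:English} using {context}."},
    "legal": {},
}

def get_macro(version: str, macro: str) -> str:
    # Return the macro from the requested version, falling back to the default version.
    return PROMPT_VERSIONS.get(version, {}).get(macro) or PROMPT_VERSIONS["default"][macro]

def render(template: str, **values: str) -> str:
    # Substitute variables, applying per-variable defaults and warning on unknown substitutions.
    pattern = r"{(\w+)(?::([^}]*))?}"
    known = {m.group(1) for m in re.finditer(pattern, template)}
    for unknown in set(values) - known:
        logger.warning("Ignoring substitution for unknown variable: %s", unknown)

    def repl(match: re.Match) -> str:
        name, default = match.group(1), match.group(2)
        if name in values:
            return values[name]
        if default is not None:
            return default
        raise KeyError(f"No value provided for required variable: {name}")

    return re.sub(pattern, repl, template)

# "legal" has no rag_response macro, so the default version's macro is used;
# {language} falls back to "English", while {context} must be supplied.
print(render(get_macro("legal", "rag_response"), context="retrieved chunks"))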

Implementation Plan

We plan to create a prompt_template directory and set up a subdirectory for each template type. Each version of a prompt will be stored as a file within the corresponding subdirectory. The file format will be Jinja2.
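As a sketch only (the directory and file names below are assumptions based on the plan above, not a finalized layout), the on-disk structure and a loader with version fallback could look like this:

prompt_template/
├── entity_extraction/
│   ├── default.jinja2
│   └── biomedical.jinja2
├── query/
│   ├── default.jinja2
│   └── biomedical.jinja2
└── entity_merge/
    └── default.jinja2

from pathlib import Path
from jinja2 import Environment, FileSystemLoader, TemplateNotFound

TEMPLATE_ROOT = Path("prompt_template")
env = Environment(loader=FileSystemLoader(TEMPLATE_ROOT), autoescape=False)

def load_template(template_type: str, version: str = "default"):
    # Load <template_type>/<version>.jinja2; fall back to the default version if missing.
    try:
        return env.get_template(f"{template_type}/{version}.jinja2")
    except TemplateNotFound:
        return env.get_template(f"{template_type}/default.jinja2")

# Per-variable defaults can live inside the template itself,
# e.g. {{ language | default("English") }} in Jinja2 syntax.
prompt = load_template("query", version="biomedical").render(context="...")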

danielaskdd avatar May 22 '25 01:05 danielaskdd

Hi @danielaskdd, thanks for pointing me to this feature change, which is a really welcome/wanted one. One question: will API / WebUI support be included? Managing and switching between multiple sets of templates could be tedious for end-users, which is why I started with the same thoughts but ended up with a single template in #1610.

jerrywang121 avatar May 23 '25 08:05 jerrywang121

Certainly! The API should, at a minimum, support the following functionalities:

  1. Temporarily switch the query prompt version using query parameters.
  2. Provide a separate API to list template types and their versions, and to set the default version for each template type.
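For illustration, here is a rough sketch of what those two endpoints could look like on a FastAPI-style server; the routes, registry structure, and response shapes are made up for this example and are not an existing LightRAG API:

from fastapi import APIRouter, HTTPException

router = APIRouter(prefix="/prompt_templates")

# In-memory registry: template type -> available versions and the current default.
REGISTRY = {
    "query": {"versions": ["default", "alternative"], "default": "default"},
    "entity_extraction": {"versions": ["default"], "default": "default"},
}

@router.get("/")
def list_templates():
    # List template types, their available versions, and the current default of each.
    return REGISTRY

@router.put("/{template_type}/default")
def set_default(template_type: str, version: str):
    # Set the default version for a template type; 404 on unknown type or version.
    entry = REGISTRY.get(template_type)
    if entry is None or version not in entry["versions"]:
        raise HTTPException(status_code=404, detail="Unknown template type or version")
    entry["default"] = version
    return {"template_type": template_type, "default": version}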

Do you have any suggestions or additional requirements? @jerrywang121

danielaskdd avatar May 23 '25 10:05 danielaskdd

Talking about using templates, I actually have some thoughts on the current keyword query & extraction logic.

Due to differences in business domain knowledge, the LLM might not have enough background or a correct understanding of some specific "terms" or "business processes". Providing entity types for the LLM to extract during indexing, and extracting HL and LL keywords to derive entity names and keywords for the query, might not work well.

If we could provide a list of "entity names" (whose definitions can be provided, or retrieved from a DB where they were summarized/provided beforehand) for the LLM to extract their relationships from the context when processing a document, and directly provide a list of "entity names" for it to use when querying, would that be more effective?

Could this be achieved by using different templates, or would some of the implementation logic have to be adjusted?
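For illustration, the kind of template this idea implies might look like the sketch below, where a curated list of domain entity names and definitions is injected as a template variable; this is purely hypothetical and not how the current extraction prompt works:

from jinja2 import Template

EXTRACTION_TEMPLATE = Template(
    "Given the following known business entities:\n"
    "{% for e in known_entities %}- {{ e.name }}: {{ e.definition }}\n{% endfor %}"
    "Extract relationships among these entities from the text below.\n\n{{ input_text }}"
)

prompt = EXTRACTION_TEMPLATE.render(
    known_entities=[{"name": "SKU", "definition": "Stock keeping unit used in order lines"}],
    input_text="...",
)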

kenspirit avatar Jun 29 '25 01:06 kenspirit

I suggest providing a complete document processing pipeline for each file/batch.

  • File Loader: standard, MinerU, Docling, Tika, etc. (MinerU supports multimodal RAG)
  • Graph Extraction Prompt Template: Standard and user-defined prompts (with the ability to inject necessary knowledge).
  • LLM Model:
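A minimal sketch of what such a per-file/per-batch pipeline configuration could look like; the class, field names, and example values are invented for illustration:

from dataclasses import dataclass
from typing import Optional

@dataclass
class PipelineConfig:
    # Hypothetical per-file/per-batch processing pipeline settings.
    file_loader: str = "standard"              # e.g. "standard", "mineru", "docling", "tika"
    extraction_template: str = "default"       # graph extraction prompt template version
    extraction_context: Optional[str] = None   # extra domain knowledge injected into the prompt
    llm_model: str = "gpt-4o-mini"             # model used for this file/batch

configs = {
    "contracts/*.pdf": PipelineConfig(file_loader="mineru", extraction_template="legal"),
    "tickets/*.json": PipelineConfig(extraction_template="support", llm_model="gpt-4o"),
}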

@LarFii

danielaskdd avatar Jun 29 '25 02:06 danielaskdd