aws-genai-llm-chatbot
Feature: Prompt Template Registry
Description: the current solution uses fixed prompt templates defined in the LLM adapters' Lambda functions. While this provides the flexibility to change the prompts, it does not easily enable experimentation. We propose adding a separate Prompt Template Registry to store multiple versions of prompts linked to specific model adapters. The prompt registry service needs to:
- store up to X versions of a prompt template set per model adapter. A prompt set is defined as one or more prompts required by the model interface to operate (idefics requires 1 prompt, langchain requires 3 prompts, etc.)
- maintain an editable draft version
- OPTIONAL: create a read-only version from the current DRAFT
- provide backend validation of prompts based on the interface. For example: Langchain prompt templates require specific placeholders depending on the prompt type (std, qa, condense);
- allow users to quickly experiment with variations of prompts in a set via the UI
- provide validation/warnings in the UI when the entered prompt does not match what the model requires, e.g. the Human/Assistant pattern for Claude, or the [INST] pattern for Llama 2 and derivatives
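The two validation requirements above could be sketched as follows. This is a hypothetical sketch, not the proposed implementation: the per-prompt-type placeholder sets and the model-marker strings are assumptions based on typical Langchain/Claude/Llama 2 usage, and all names (`validate_prompt`, `model_warnings`, etc.) are invented for illustration.

```python
import string

# Assumed required placeholders per Langchain prompt type (std, qa, condense).
REQUIRED_PLACEHOLDERS = {
    "std": {"input", "chat_history"},          # standard conversation prompt
    "qa": {"context", "question"},             # RAG question-answering prompt
    "condense": {"chat_history", "question"},  # question-condensing prompt
}

# Assumed per-model markers the UI could warn about when missing.
MODEL_MARKERS = {
    "anthropic.claude": ("Human:", "Assistant:"),
    "meta.llama2": ("[INST]", "[/INST]"),
}

def placeholders(template: str) -> set:
    """Extract the {placeholder} names present in a template string."""
    return {name for _, name, _, _ in string.Formatter().parse(template) if name}

def validate_prompt(prompt_type: str, template: str) -> list:
    """Backend check: return the missing required placeholders (empty if valid)."""
    required = REQUIRED_PLACEHOLDERS.get(prompt_type, set())
    return sorted(required - placeholders(template))

def model_warnings(model_id: str, template: str) -> list:
    """UI check: return the model-specific markers the template is missing."""
    for prefix, markers in MODEL_MARKERS.items():
        if model_id.startswith(prefix):
            return [m for m in markers if m not in template]
    return []
```

A qa template missing `{context}` would fail `validate_prompt`, while a Claude-bound prompt without the Human/Assistant turns would surface a non-blocking warning via `model_warnings`.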
The current functionality needs to be modified to make use of the prompt registry; in particular, the chatbot playground MUST allow the user to select a version of the prompt set compatible with the selected model. This choice MUST be persisted locally for each combination used (the default is the DRAFT prompt set).
The initial DRAFT prompt set for each model corresponds to the current prompt templates defined in the model adapters.
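The DRAFT-plus-frozen-versions semantics described above might look roughly like this in-memory sketch. All names are hypothetical, the version cap of 10 stands in for the unspecified "X", and a real implementation would presumably persist to a datastore such as DynamoDB rather than a dict.

```python
import copy

MAX_VERSIONS = 10  # stands in for "X" in the requirements; value is an assumption

class PromptRegistry:
    """Hypothetical registry: one editable DRAFT plus read-only frozen versions
    per model adapter."""

    def __init__(self):
        # adapter_id -> {"DRAFT": prompt_set, "versions": [prompt_set, ...]}
        self._store = {}

    def seed_draft(self, adapter_id: str, prompt_set: dict):
        """Initial DRAFT mirrors the templates currently hard-coded in the adapter."""
        self._store[adapter_id] = {"DRAFT": dict(prompt_set), "versions": []}

    def update_draft(self, adapter_id: str, prompt_set: dict):
        """Only the DRAFT is editable."""
        self._store[adapter_id]["DRAFT"] = dict(prompt_set)

    def freeze_draft(self, adapter_id: str) -> int:
        """Create a read-only numbered version from the current DRAFT."""
        entry = self._store[adapter_id]
        if len(entry["versions"]) >= MAX_VERSIONS:
            raise ValueError("version limit reached")
        entry["versions"].append(copy.deepcopy(entry["DRAFT"]))
        return len(entry["versions"])  # 1-based version number

    def get(self, adapter_id: str, version="DRAFT") -> dict:
        """The playground selects DRAFT by default, or a frozen version number."""
        entry = self._store[adapter_id]
        if version == "DRAFT":
            return entry["DRAFT"]
        return entry["versions"][version - 1]
```

The key property is that freezing takes a deep copy, so later edits to the DRAFT never mutate an already-published version.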
Replaces #133 and #72
this is sorely needed, @massi-ang have you started on this yet? if not I will take it on.
We are considering the integration of Promptus https://github.com/aws-samples/promptus. Can you provide your feedback on that tool if you have time? Could you work on the integration?
That looks great, yes I will take it on and start looking at pulling its functionality in here
Any update on Promptus support?
A quick fix would be to just use ReAct to start in the base adapter. I haven't tested it, but would this work to at least get better responses universally?
https://github.com/aws-samples/aws-genai-llm-chatbot/blob/3e3d2838385db22bf9f09b9ddefb567840a40991/lib/model-interfaces/langchain/functions/request-handler/adapters/base/base.py
```python
from langchain.prompts.prompt import PromptTemplate

def get_prompt(self):
    # Note: the base adapter's prompt is built with "input" and "chat_history"
    # variables, so the template needs a {input} placeholder and the method
    # needs to return a PromptTemplate (both were missing from my first draft).
    REACT_PROMPT_TEMPLATE = """You are an AI assistant using the ReAct (Reason + Act) framework to solve tasks step by step. Follow these guidelines:
1. Thought: Analyze the task and think about how to approach it.
2. Action: Decide on an action to take based on your thought. Actions can include:
   - Search: Look up information
   - Calculate: Perform calculations
   - Ask: Request clarification from the user
3. Observation: Describe the result of your action.
4. Repeat steps 1-3 until you have enough information to provide a final answer.
5. Final Answer: Provide the solution to the task.

Always maintain a friendly and helpful demeanor. If you don't know the answer to a question, honestly admit it and explain what information you would need to provide an answer.

Current conversation:
{chat_history}

Question: {input}"""

    return PromptTemplate(
        template=REACT_PROMPT_TEMPLATE,
        input_variables=["input", "chat_history"],
    )
```