
add system prompt design to GPT model

Open xdite opened this issue 1 year ago • 5 comments

It seems to be missing a key feature: the "system prompt". The system prompt impacts GPT models significantly. Could we at least set it in a config or somewhere else?

xdite avatar Feb 08 '24 00:02 xdite

Good point. We need to think harder about chat model support. We may put the instructions in the system prompt, or something else.

Out of curiosity, what would you like to put in the system prompt?

okhat avatar Feb 11 '24 05:02 okhat

Second. Would be great to modify it per module.

AriMKatz avatar Feb 11 '24 07:02 AriMKatz

> Out of curiosity, what would you like to put in the system prompt?

I want to use the same pipeline across projects, with varying background context. The pipelines capture the cognitive processes well, and those don't vary across projects. But the background context changes from project to project. (Maybe there is a different way to do this other than the system prompt? I'm new to DSPy.)

marcgreen avatar Feb 13 '24 15:02 marcgreen

> Good point. We need to think harder about chat model support. We may put the instructions in the system prompt, or something else.
>
> Out of curiosity, what would you like to put in the system prompt?

My ideal scenario is roughly this:

Pack what I want GPT to do into a Prompt Class structure, similar to object-oriented programming, with a very programmatic way of writing prompts.

Then put the data and context in the user prompt.

system prompt = method, user prompt = user input data

The GPT API works significantly better that way. It is much stronger than putting everything in the user prompt.
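A minimal sketch of that "system prompt = method, user prompt = data" split, using the OpenAI-style chat message format. No API call is made here; `build_messages` is a hypothetical helper for illustration, not a DSPy API.

```python
def build_messages(method_prompt: str, user_data: str) -> list[dict]:
    """Pack the reusable instructions (the "method") into the system role
    and the per-request data into the user role."""
    return [
        {"role": "system", "content": method_prompt},
        {"role": "user", "content": user_data},
    ]

# The same "method" can then be reused across inputs, while only the
# user message varies per request.
messages = build_messages(
    method_prompt="You are a summarizer. Return exactly one sentence.",
    user_data="DSPy compiles declarative LM pipelines into prompts.",
)
```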

xdite avatar Feb 16 '24 22:02 xdite

It may not just be the specific LM, but also the native language. I work at S&N (Germany).

jmaushake avatar Feb 22 '24 11:02 jmaushake

+1. I have experimented with guiding behavior in the system prompt as well as in the user prompt. For certain (admittedly raunchy) behaviors, putting them in the system prompt works, whereas shoving them in the user prompt results in "I'm sorry, I cannot assist..." etc.

DSPy is unusable for building complex GPT-based chatting products at the moment, imo. I will explore other models or other tools in the meantime.

The ideal behavior for me would be that for models which have a chat completion API, DSPy at the very least builds a chat history of messages with roles, as the API expects, instead of putting everything in a single user message. Perhaps it could have a special "chat history" input parameter, along with either letting us specify the system prompts, or intelligently building the system prompts.
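To illustrate the requested behavior, here is a hedged sketch of turning a system prompt plus a conversation history into the role-structured message list a chat completion API expects. The names `format_chat` and `history` are hypothetical illustrations, not DSPy APIs.

```python
def format_chat(
    system_prompt: str,
    history: list[tuple[str, str]],
    new_input: str,
) -> list[dict]:
    """Build an OpenAI-style message list: one system message, then
    alternating user/assistant turns from history, then the new user input."""
    messages = [{"role": "system", "content": system_prompt}]
    for user_turn, assistant_turn in history:
        messages.append({"role": "user", "content": user_turn})
        messages.append({"role": "assistant", "content": assistant_turn})
    messages.append({"role": "user", "content": new_input})
    return messages
```

This keeps the instructions in the system role across turns, rather than flattening the whole conversation into a single user message.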

adam-simple avatar Mar 10 '24 19:03 adam-simple

Implemented. #618

xdite avatar Mar 21 '24 08:03 xdite