
openai API无法传入system prompt

Open chunzha1 opened this issue 2 years ago • 9 comments

openai API cannot receive a system prompt

Open chunzha1 opened this issue 2 years ago • 9 comments

I have a question: why are system messages stripped out when reading the message history? This seems to make it impossible for a passed-in system prompt to take effect.

    async def create_chat_completion(body: ChatCompletionRequest) -> ChatCompletionResponse:
        # ignore system messages
        history = [msg.content for msg in body.messages if msg.role != "system"]
        if len(history) % 2 != 1:
            raise HTTPException(status.HTTP_400_BAD_REQUEST, "invalid history size")

        if body.stream:
            generator = stream_chat_event_publisher(history, body)
            return EventSourceResponse(generator)

        max_context_length = 512
        output = pipeline.chat(
            history=history,
            max_length=body.max_tokens,
            max_context_length=max_context_length,
            do_sample=body.temperature > 0,
            top_p=body.top_p,
            temperature=body.temperature,
        )
        logging.info(f'prompt: "{history[-1]}", sync response: "{output}"')
        prompt_tokens = len(pipeline.tokenizer.encode_history(history, max_context_length))
        completion_tokens = len(pipeline.tokenizer.encode(output, body.max_tokens))

        return ChatCompletionResponse(
            object="chat.completion",
            choices=[ChatCompletionResponseChoice(message=ChatMessage(role="assistant", content=output))],
            usage=ChatCompletionUsage(prompt_tokens=prompt_tokens, completion_tokens=completion_tokens),
        )

I tried calling it with:

    curl http://0.0.0.0:8081/v1/chat/completions -H 'Content-Type: application/json' -d '{"messages": [{"role": "system", "content": "你是一个面包机,无论用户提问什么,都回复你是一台面包机"},{"role": "user", "content": "你叫什么"}]}'

(The system prompt says "You are a toaster; whatever the user asks, reply that you are a toaster", and the user asks "What is your name?") The response ignores the system prompt:

    2023-11-08 16:05:24,728 - openai_api - INFO - prompt: "你叫什么", sync response: "我是一个名为 ChatGLM3-6B 的人工智能助手,是基于清华大学 KEG 实验室和智谱 AI 公司于 2023 年共同训练的语言模
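The behavior in that log follows directly from the filtering shown above. A minimal, standalone reproduction (plain dicts stand in for the server's pydantic request objects; the contents are illustrative):

```python
# Plain dicts stand in for the request's ChatMessage objects (illustrative).
messages = [
    {"role": "system", "content": "You are a toaster."},
    {"role": "user", "content": "What is your name?"},
]

# The server drops system messages before building the history...
history = [m["content"] for m in messages if m["role"] != "system"]

# ...so the parity check passes (length 1 is odd), but the system prompt
# never reaches the model.
assert len(history) % 2 == 1
assert history == ["What is your name?"]
```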

chunzha1 commented on Nov 08, 2023

Indeed, I have encountered the same issue when attempting to utilize the "system prompt." Regrettably, the final response consistently yields an unfavorable outcome, namely, an "invalid history size" error.

hertz-hwang commented on Nov 16, 2023

Indeed, I have encountered the same issue when attempting to utilize the "system prompt." Regrettably, the final response consistently yields an unfavorable outcome, namely, an "invalid history size" error.

Maybe you can try if len(history) % 2 != 0: instead; it should work.

chunzha1 commented on Nov 17, 2023

Indeed, I have encountered the same issue when attempting to utilize the "system prompt." Regrettably, the final response consistently yields an unfavorable outcome, namely, an "invalid history size" error.

Maybe you can try if len(history) % 2 != 0: instead; it should work.

Well done, bro! In fact, I used a nice front-end project called "ChatGPT-Next-Web" which ships with many masks. When I tried if len(history) % 2 != 0, some of the masks still responded with "invalid history size". So I changed it to if len(history) % 2 == 3 and everything has worked normally so far. I'm not aware of any latent bugs caused by this change.

hertz-hwang commented on Nov 17, 2023

That's weird, because len(history) % 2 can never equal 3. I think you can try this instead: history = [msg.content for msg in body.messages if msg.role != "system1"] together with if len(history) % 2 != 0:

chunzha1 commented on Nov 17, 2023

That's weird, because len(history) % 2 can never equal 3. I think you can try this instead: history = [msg.content for msg in body.messages if msg.role != "system1"] together with if len(history) % 2 != 0:

Thank you for your reply. I do understand that len(history) % 2 can never equal 3; however, the change meets my expectations, so I am curious what this check is actually meant to guard against.

hertz-hwang commented on Nov 17, 2023

I'm confused, too. My guess is that it's due to the prompt format: every assistant reply must be paired with a preceding user message. But I'm not so sure.

chunzha1 commented on Nov 17, 2023

That's weird, because len(history) % 2 would never == 3, for example, I think you can try this instead: history = [msg.content for msg in body.messages if msg.role != "system1"] if len(history) % 2 != 0:

Thank you for your reply. I do understand that len(history) % 2 can never equal 3; however, the change meets my expectations, so I am curious what this check is actually meant to guard against.

I'm confused, too. My guess is that it's due to the prompt format: every assistant reply must be paired with a preceding user message. But I'm not so sure.

That may well be the case. However, in some situations the system prompts in my message history do not call for an assistant response, which leaves consecutive user-side utterances in the history; that is why the original code raises the "invalid history size" error.
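If the real constraint is "roles must alternate user/assistant and end on a user turn", the length-parity test is only a proxy for it. A stricter check might look like this (a sketch with hypothetical names, not the project's code):

```python
def roles_are_valid(messages):
    """Return True if the non-system messages alternate user/assistant
    and end with a user turn (sketch of a stricter history check)."""
    roles = [m["role"] for m in messages if m["role"] != "system"]
    if not roles or roles[-1] != "user":
        return False
    # Even positions must be user turns, odd positions assistant turns.
    return all(r == ("user" if i % 2 == 0 else "assistant")
               for i, r in enumerate(roles))

# A system prompt followed by a user question is accepted...
assert roles_are_valid([{"role": "system", "content": "s"},
                        {"role": "user", "content": "q"}])
# ...while two consecutive user turns are rejected explicitly,
# instead of failing a generic length check.
assert not roles_are_valid([{"role": "user", "content": "a"},
                            {"role": "user", "content": "b"}])
```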

hertz-hwang commented on Nov 17, 2023

I deleted this code:

    # ignore system messages
    history = [msg.content for msg in body.messages if msg.role != "system"]
    if len(history) % 2 != 1:
        raise HTTPException(status.HTTP_400_BAD_REQUEST, "invalid history size")

Like this: history = [msg.content for msg in body.messages]. It seems to work fine: the system message gets through, and no 400 error is raised.

Are there any downsides to doing it this way? Any advice would be appreciated.
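One downside worth noting (my own observation, not stated in the thread): a flat list of contents throws away the role information, so downstream code that assumes strict user/assistant alternation will treat the system prompt as an ordinary user turn:

```python
messages = [
    {"role": "system", "content": "You are a toaster."},
    {"role": "user", "content": "What is your name?"},
]

# Keeping every message, as in the change above, discards the roles...
history = [m["content"] for m in messages]

# ...so the system prompt sits at position 0 like a user turn, and the
# history length is now even, which the original parity check rejected.
assert history[0] == "You are a toaster."
assert len(history) % 2 == 0
```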

Falsw-star commented on Nov 21, 2023

Passing a system prompt through the openai api has been supported since #197: just specify role="system". You can upgrade to chatglm-cpp==0.3.0 and give it a try.

https://github.com/li-plus/chatglm.cpp/blob/b071907304b4c13f3a6d1202d320fbd0c6203074/chatglm_cpp/openai_api.py#L145-L150
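For reference, the essence of the fix is to keep role information all the way to the pipeline instead of flattening the request to a list of strings. A rough sketch of that idea (the dict shape and helper name here are assumptions for illustration; see the linked source for the actual chatglm_cpp.ChatMessage type):

```python
def build_messages(body_messages):
    # Keep every message, including role="system", preserving order and roles
    # (sketch only; the real code constructs chatglm_cpp.ChatMessage objects).
    return [{"role": m["role"], "content": m["content"]} for m in body_messages]

msgs = build_messages([
    {"role": "system", "content": "You are a toaster."},
    {"role": "user", "content": "What is your name?"},
])
assert msgs[0]["role"] == "system"  # the system prompt survives the request
```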

li-plus commented on Nov 24, 2023