
Cannot add a user message after another user message

Open hswlab opened this issue 1 year ago • 6 comments

I'm not sure what I'm doing wrong, but I'm getting the following error after updating from 0.8.1 to 0.9.1: "Cannot add a user message after another user message (Parameter 'message')".

Previously ChatAsync took just a string parameter, but this was replaced by ChatHistory.Message, so I updated my code to use the new type as follows.
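Roughly, the change looks like this (a simplified sketch of my call site, not the exact code; prompt, inferenceParams and token stand in for my own variables):

    // Old call (0.8.1): ChatAsync accepted the prompt text as a plain string
    // await foreach (var text in _chatSession.ChatAsync(prompt, inferenceParams, token)) { ... }

    // New call (0.9.1): the prompt is wrapped in a ChatHistory.Message with an AuthorRole
    await foreach (var text in _chatSession.ChatAsync(
        message: new ChatHistory.Message(AuthorRole.User, prompt),
        inferenceParams: inferenceParams,
        cancellationToken: token))
    {
        // consume the streamed tokens
    }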

I usually call this code twice in a row when a new session starts. The first time I call it with a prompt that describes how the AI should behave; the second time I pass the regular question coming from the user input. After the first call I get no response from the model, just like before the update, but when I call this code a second time with my regular user question I get this error. Do you have any advice?

By the way, I'm using ignored anti-prompts to hide the user names in the response; maybe that is the problem now, because the AI is not giving a response after the first call, so it looks like I'm sending two messages without awaiting a response?

hswlab avatar Jan 14 '24 17:01 hswlab

I'm not very familiar with the higher-level ChatSession stuff, but shouldn't the first message be an AuthorRole.System message instead of a user message?

@philippjbauer worked on an overhaul of ChatSession and associated classes, so he might have a better answer.

martindevans avatar Jan 15 '24 01:01 martindevans

I just tried replacing the User role with System in the first call, but now the first call breaks with the error message: "Message must be a user message (Parameter 'message')".

Basically I have an implementation similar to the example ChatSessionStripRoleName.cs. In that example, the ChatAsync method seems to use just the user role.
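From what I remember, the chat loop in that example looks roughly like this (paraphrased from memory, not a verbatim copy of the sample):

    // every turn is sent as a User message; the anti-prompt stops generation at the next user turn
    while (true)
    {
        await foreach (var text in session.ChatAsync(
            new ChatHistory.Message(AuthorRole.User, userInput),
            new InferenceParams() { Temperature = 0.6f, AntiPrompts = new[] { "User:" } }))
        {
            Console.Write(text);
        }
        userInput = Console.ReadLine() ?? string.Empty;
    }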

The LLamaSharp web project seems to run without any problems; maybe I can learn something from it next weekend. I suspect the issue is my initial text prompt. Maybe it needs to be worded a little differently for the new parameter. We'll see.

hswlab avatar Jan 15 '24 19:01 hswlab

@hswlab can you post your code here?

The chat session requires an optional System message and then alternating User and Assistant messages. I suspect there is an issue in your calling code, perhaps?
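Something like the following is the intended shape (a minimal sketch with placeholder variables such as systemPrompt, firstQuestion, inferenceParams and cancellationToken; not code from your project):

    // optional: exactly one System message, added to the history up front
    session.History.AddMessage(AuthorRole.System, systemPrompt);

    // afterwards the turns must alternate: User -> Assistant -> User -> ...
    // each ChatAsync call sends one User message; the generated reply is recorded
    // as the Assistant message, so the next call can be a User message again
    await foreach (var text in session.ChatAsync(
        new ChatHistory.Message(AuthorRole.User, firstQuestion),
        inferenceParams,
        cancellationToken))
    {
        // consume the streamed reply
    }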

philippjbauer avatar Feb 04 '24 19:02 philippjbauer

@philippjbauer Unfortunately I haven't found the time to look into it yet, but I can post an excerpt from my code here. This method is called only once at the start of a conversation. In it I tell the chatbot how it should behave for the rest of the conversation.

 private static async Task CreateNewTopic()
 {
     try
     {

	...

         // reset session
         if (_chatSession?.Executor?.Context != null)
         {
             try
             {
                 _chatSession.Executor.Context.Dispose();
                 _chatSession = null;
             }
             finally { }
         }

	...

         if (!string.IsNullOrEmpty(modelPath))
         {

	     ...
             // split possible "name firstname" into "name:", "firstname:"
             string prompt = string.Empty;
             string userName = SettingsManager.GetDbSetting("UserName") ?? string.Empty;
             List<string> antiPrompts = userName.Split(new char[] { ',', ' ' }).Select(x => $"{x.Trim()}:").ToList();
             List<string> ignoreAntiPrompts = antiPrompts;
             ILLamaExecutor executor = null;

             switch (modelName)
             {
		 ...
                 case "vicuna-7b":
                 case "llama-2-7B":
                 case "openchat_3.5":
                     {

                         prompt =
                            $"A chat between a curious user named {userName} and an artificial intelligence assistant named Kati. " +
                            $"Kati gives helpful, detailed, and polite answers to the user's questions, addresses the user informally and simulates emotions by using emojis. " +
                            $"Code inside Kati's answers is always wrapped in ``` for readability. " +
                            $"Kati prefers to answer in {LanguageManager.Translate("German")}, but she can also answer in another language if she is asked. ";
                         prompt = $"{userName}: {{prompt}} Kati:\r\n";


                         antiPrompts =
                             antiPrompts.Concat(new List<string> { "USER:" }).ToList();


                         ignoreAntiPrompts =
                             ignoreAntiPrompts.Concat(new List<string> { "Kati:", "ASSISTANT:" }).ToList();


                         // Load a model
                         ModelParams parameters = new ModelParams(modelPath)
                         {
                             ContextSize = 1024,
                             Seed = 1337,
                             GpuLayerCount = 5
                         };


                         // Session Executor
                         using LLamaWeights model = LLamaWeights.LoadFromFile(parameters);
                         LLamaContext context = model.CreateContext(parameters);
                         executor = new InteractiveExecutor(context);

                         break;
                     }
                 ...
             }


             _antiPrompts = antiPrompts.ToArray();
             _chatSession = new ChatSession(executor).WithOutputTransform(new LLamaTransforms.KeywordTextOutputStreamTransform(ignoreAntiPrompts, redundancyLength: 8));
             _chatCancellationTokenSource = new CancellationTokenSource(TimeSpan.FromMinutes(int.Parse(SettingsManager.GetDbSetting("RequestTimeout") ?? "0")));

             await foreach (var text in _chatSession.ChatAsync(
                 message: new ChatHistory.Message(AuthorRole.User, prompt),
                 inferenceParams: new InferenceParams() { Temperature = 0.6f, AntiPrompts = _antiPrompts, MaxTokens = -1 },
                 cancellationToken: _chatCancellationTokenSource.Token
                 ))
             {
                 break;
             }


             ...
         }

         MakeNewConversation = false;

     }
     catch (Exception)
     {
         throw;
     }
 }

This method is always called when the user submits a message. At the start the session is not yet initialized, so the method above is called first.

        internal static async Task<ChatResponse?> DoChatAsync(string message, OnUpdateCallback callback)
        {
            ChatResponse? chatResponse = null;

            try
            {

                // New Topic
                if (MakeNewConversation == true || _chatSession == null)
                {
                    await CreateNewTopic();
                }


                // Do Chat
                if (_chatSession != null)
                {

                    // Response Stream
                    string resultMessage = string.Empty;
                    _chatCancellationTokenSource = new CancellationTokenSource(TimeSpan.FromMinutes(int.Parse(SettingsManager.GetDbSetting("RequestTimeout") ?? "0")));
                    await foreach (var text in _chatSession.ChatAsync(
                        message: new ChatHistory.Message(AuthorRole.User, message),
                        inferenceParams: new InferenceParams() { Temperature = 0.6f, AntiPrompts = _antiPrompts, MaxTokens = -1 },
                        cancellationToken: _chatCancellationTokenSource.Token
                        ))
                    {

                       ...
                    }

		...

                }
            }
            catch (Exception)
            {
                throw;
            }

            return chatResponse;

        }

After CreateNewTopic() is called, I can't call ChatAsync a second time without getting an error. I suspect the problem is in the wording of my prompt in the CreateNewTopic method. Originally there were no problems with it; this has probably changed with the last update.

hswlab avatar Feb 06 '24 18:02 hswlab

I think I have found the problem.

I replaced the first ChatAsync call (the one with my initial prompt) with _chatSession.History.AddMessage(AuthorRole.System, prompt); and now there is no error anymore.
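For reference, the relevant part of CreateNewTopic now looks roughly like this (a trimmed sketch of my method, not the full code):

    _antiPrompts = antiPrompts.ToArray();
    _chatSession = new ChatSession(executor)
        .WithOutputTransform(new LLamaTransforms.KeywordTextOutputStreamTransform(ignoreAntiPrompts, redundancyLength: 8));

    // the behaviour prompt is added to the history as a System message instead of
    // being sent through a first ChatAsync call, so the first real ChatAsync call
    // (the user's question) is the first User message and the roles alternate correctly
    _chatSession.History.AddMessage(AuthorRole.System, prompt);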

hswlab avatar Feb 06 '24 20:02 hswlab

@hswlab can you post your code here?

The chat session requires an optional System message and then alternating User and Assistant messages. I suspect there is an issue in your calling code, perhaps?

You're right, the normal flow should follow the format you mentioned. However, there are cases where third-party requests do not adhere to it. For example, I've previously encountered KernelMemory producing two consecutive User messages (though I don't recall the exact details anymore). With OpenAI's service this wasn't an issue, but switching to a local chat session resulted in errors because of these constraints.

Ideally, the chat session should primarily serve as a wrapper for the inference format. Imposing overly strict restrictions might not be beneficial.

sangyuxiaowu avatar Jun 27 '24 08:06 sangyuxiaowu