
Server response drops underscores

Open Tasarinan opened this issue 1 year ago • 3 comments

Hello, when I use llama.cpp as an inference server, I found that the results have the following problem.

For example, if the expected result is a_b_c, the result from llama.cpp is a_bc. Does anyone know the reason?

Tasarinan avatar Feb 05 '24 08:02 Tasarinan

What information can I provide?

For example, the output should be TestSctpConnection.xx_SCTPConnection_WithDataTransferAndStreamFailure, but the actual output is TestSctpConnection.xx_SCTPConnectionWithDataTransferAndStreamFailure.

Tasarinan avatar Feb 07 '24 14:02 Tasarinan

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] avatar Mar 18 '24 01:03 github-actions[bot]

Well, it seems the server has a bug in the markdown formatting of generated source code: underscored identifiers are misrendered as italics, and pointers (asterisks) as bold.

I traced the issue back to the following JavaScript:

// poor mans markdown replacement
const Markdownish = (params) =>

[Screenshot 2024-03-19 at 23:10:33]

The highlighted function is called gtk_label_set_text()
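As a sketch of why this mangles identifiers (the regex below is a hypothetical reconstruction of such a "poor man's markdown" rule, not the exact code in index.html): a naive emphasis regex treats any `_..._` pair as italics, so the underscores inside an identifier are consumed by the `<em>` markup.

```javascript
// Hypothetical reconstruction of a naive markdown emphasis rule.
// Any _..._ pair is rewritten as italics, so the underscores inside
// identifiers such as a_b_c are swallowed by the <em> tags.
const naiveMarkdownish = (text) =>
  text.replace(/_([^_]+)_/g, '<em>$1</em>');

console.log(naiveMarkdownish('a_b_c'));
// → 'a<em>b</em>c' — rendered as "abc" with an italic b,
// which matches the mangled output reported above
```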

vaddieg avatar Mar 19 '24 22:03 vaddieg

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] avatar May 04 '24 01:05 github-actions[bot]

This server-UI bug has really bad consequences for users. The llama.cpp project is great, but you can't really use it if the output is displayed incorrectly. Meanwhile there is the "New UI", but the things displayed there are still wrong. As long as, e.g., an a_b_c becomes an abc, you can't use code output without reworking it all the time.

It seems the server has a bug in its markdown formatting, or there is no properly working markdown parser. It was mentioned that the solution would be a "real streaming markdown parser": https://github.com/ggerganov/llama.cpp/issues/5788#issuecomment-1971540421

There have already been various issues on this topic! Can't we somehow bring the issues and people together and tackle the problem?

https://github.com/ggerganov/llama.cpp/issues/5335
https://github.com/ggerganov/llama.cpp/issues/5788
https://github.com/ggerganov/llama.cpp/issues/7023
https://github.com/ggerganov/llama.cpp/issues/3723
(https://github.com/ggerganov/llama.cpp/blob/master/examples/server/public/index.html#L861)

Otherwise, is there at least an interim solution until this bug is fixed? I have tried changing things on this line, without success: https://github.com/ggerganov/llama.cpp/blob/1613ef8d8eb2479ba55c4d598e08c8f3f18a0fed/examples/server/public/index.html#L888
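As one possible interim workaround sketch (saferMarkdownish is a hypothetical name, not an existing function in index.html, and this assumes the UI's conversion step can be swapped out): stash inline code spans first so their underscores survive, and only treat underscores that delimit a whole word as emphasis.

```javascript
// Hypothetical interim replacement for a naive markdown conversion:
// protect `inline code` spans, then apply emphasis only to
// underscores at word boundaries.
const saferMarkdownish = (text) => {
  const codeSpans = [];
  // 1. Stash inline code spans behind placeholders.
  let out = text.replace(/`([^`]*)`/g, (m, code) => {
    codeSpans.push(code);
    return `\x00${codeSpans.length - 1}\x00`;
  });
  // 2. Emphasis only when the underscores wrap a whole word.
  out = out.replace(/(^|\s)_([^_\s][^_]*)_(?=\s|$)/g, '$1<em>$2</em>');
  // 3. Restore the stashed code spans.
  return out.replace(/\x00(\d+)\x00/g, (m, i) => `<code>${codeSpans[i]}</code>`);
};

console.log(saferMarkdownish('call a_b_c and _emphasis_ and `x_y`'));
// → 'call a_b_c and <em>emphasis</em> and <code>x_y</code>'
```

This keeps identifiers like a_b_c intact while still rendering deliberate _emphasis_; a real streaming markdown parser, as proposed in the linked comment, would still be the proper fix.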

It's crazy that such a great and highly developed tool simply displays the wrong thing. I'm not very familiar with coding and GitHub; is there someone who can fix this bug?

Optiuse avatar Jun 11 '24 11:06 Optiuse