llama.cpp
Server enhancements - grammar segfault and helper titles.
I noticed the server example segfaults on a null context if the grammar can't be parsed. This is fixed.
I also added titles with helpful mouseover explanations in the UI, to help people understand n_predict, top_k, etc.
Aaah smacked down by the dreaded trailing whitespace. Reviewbot must be a python coder.
Oh is that why they were written that way? Yes that would be great for readability.
On Tue, Jan 23, 2024, 07:52 Georgi Gerganov wrote:
Georgi Gerganov approved this pull request.
Nice!
Should we change the deps.sh script to output raw string literals?
Review: https://github.com/ggerganov/llama.cpp/pull/5080#pullrequestreview-1838054091
After rebasing and fixing the EditorConfig Checker errors, we can merge.
Thx. :+1: https://github.com/ggerganov/llama.cpp/pull/5221
I think this PR is quite outdated - probably not relevant anymore?
Good question. I'll take another look, but I think the segfault fix for an unparseable grammar is still important, unless another PR has already fixed it.
Is this bug still present? Just chasing up older PRs to make sure this one isn't obsolete.
It looks like @ggerganov fixed this directly so no need anymore. https://github.com/ggerganov/llama.cpp/blob/e2b065071c5fc8ac5697d12ca343551faee465cc/common/sampling.cpp#L17