llama.cpp
fix coloring of last `n_batch` of prompt, and refactor line input
https://github.com/ggerganov/llama.cpp/blob/721311070e31464ac12bef9a4444093eb3eaebf7/main.cpp#L980-L983
This can fail to colorize the last `params.n_batch` part of the prompt correctly, because `embd` was just loaded with those tokens but has not been printed yet.
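For context, here is a minimal, self-contained sketch of the flow being described, not the actual main.cpp code: the prompt is consumed from `embd_inp` into `embd` in chunks of up to `params.n_batch` tokens and only printed afterwards, so a color reset gated on "all input consumed" fires before the last batch is printed. Variable names (`embd`, `embd_inp`, `input_consumed`) and the ANSI color handling are assumptions modeled on the main.cpp of that era.

```cpp
// Simplified illustration of the coloring bug; not the real main.cpp code.
#include <cstdio>
#include <vector>

#define ANSI_COLOR_YELLOW "\x1b[33m"
#define ANSI_COLOR_RESET  "\x1b[0m"

int main() {
    std::vector<int> embd_inp = {1, 2, 3, 4, 5, 6, 7, 8}; // tokenized prompt (dummy tokens)
    std::vector<int> embd;                                // current batch, not yet printed
    size_t input_consumed = 0;
    const size_t n_batch   = 4;                           // stand-in for params.n_batch
    const bool use_color   = true;

    printf(ANSI_COLOR_YELLOW); // colorize the prompt

    while (input_consumed < embd_inp.size()) {
        // load up to n_batch prompt tokens into embd
        while (input_consumed < embd_inp.size() && embd.size() < n_batch) {
            embd.push_back(embd_inp[input_consumed]);
            ++input_consumed;
        }

        // BUG: on the last batch this condition is already true, so the color
        // is reset *before* the tokens still sitting in embd are printed below.
        if (use_color && input_consumed == embd_inp.size()) {
            printf(ANSI_COLOR_RESET);
        }

        // print (and, in the real code, evaluate) the batch
        for (int tok : embd) {
            printf("%d ", tok);
        }
        embd.clear();
    }
    printf("\n");
    return 0;
}
```

With this structure, the last `n_batch` tokens of the prompt are printed after the reset and thus appear uncolored, which is the behavior the PR fixes.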
I can confirm the bug; the PR fixes it and is certainly cleaner. Can you resolve the conflicts and maybe take along the one-liner in #283, which is also related to colors? Thanks.
@sw, merge if you approve it.