Claes Fransson

52 comments by Claes Fransson

Hi, I solved this by increasing `max_input_length` in `config.toml` like this (for `model.chat.local`): ``` [completion] # Maximum length of the input prompt, in UTF-8 characters. The default value is...
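A minimal sketch of what such a `config.toml` tweak might look like, assuming a Tabby-style layout; the value below is illustrative, not a documented default:

```toml
# Sketch only: section and value are illustrative assumptions,
# mirroring the truncated comment above.
[completion]
# Maximum length of the input prompt, in UTF-8 characters.
max_input_length = 4096
```

After editing the file, the server typically needs a restart for the new limit to take effect.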

Actually, I still get a lot of repeated thinking output, unfortunately. I've experimented with different values of `max_input_length` and `max_decoding_tokens` under `[model.chat.local]`, but haven't found any setting which works...