khoj
Role, prompt
Good afternoon :) Could you add the ability to set a custom prompt and role? Also, please add a way to adjust the temperature; I'd like to set it in the 0.1-0.3 range. One more question: in the settings the model is listed with a 128,000-token context, but when I send a long prompt (about 5,000 tokens) to the chat, an error appears:
414 Request-URI Too Large
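To illustrate what I mean, here is a minimal sketch assuming an OpenAI-compatible chat completions endpoint; the base URL, API key, and model name below are placeholders, not Khoj's actual configuration:

```python
# Sketch of the controls I'm asking for, assuming an OpenAI-compatible
# chat completions API. base_url, api_key, and the model name are
# placeholders, not Khoj's real settings.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="placeholder")

response = client.chat.completions.create(
    model="my-local-model",      # placeholder model name
    temperature=0.2,             # the 0.1-0.3 range I'd like to use
    messages=[
        # The custom prompt/role I'd like to be able to define myself.
        {"role": "system", "content": "You are a senior Python developer."},
        {"role": "user", "content": "Refactor this function to be iterative."},
    ],
)
print(response.choices[0].message.content)
```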
Hi @den47999! We've set a lower token limit for this model. Thanks for pointing this out, I'll update it.
You'd like a custom prompt in the cloud instance?
I'd like the token limit to be 128,000, since that helps me with writing code, and I'd also like temperature selection and custom prompt roles.
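For what it's worth, the prompts that fail are only a few thousand tokens. Here is a rough way I check their size, a sketch assuming the tiktoken library and the cl100k_base encoding, which may not match the tokenizer of the model configured in Khoj:

```python
# Rough token count for a long prompt, assuming tiktoken and the
# cl100k_base encoding as an approximation of the model's tokenizer.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")
with open("long_prompt.txt", encoding="utf-8") as f:
    prompt = f.read()

n_tokens = len(encoding.encode(prompt))
print(f"{n_tokens} tokens")  # around 5,000 here, far below 128,000
```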