[BUG] Answer not complete
Thanks for this extension! This will be really helpful. I noticed that the answers are often incomplete and stop in the middle of a sentence. This happens regardless of which model is used. Any ideas?
Confirmed. I'm seeing the same behavior.
Right now, the number of new tokens generated by ExtendSelection is hard-coded to 70. I plan to make that configurable in the settings.
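For context, here is a minimal sketch of how a configurable limit could be passed through to the backend, assuming text-generation-webui's OpenAI-compatible `/v1/completions` endpoint on its default port. The function name, port, and defaults are illustrative only, not the extension's actual code:

```python
import requests

def extend_selection(prompt: str, max_new_tokens: int = 70) -> str:
    """Request a continuation, using a user-configurable token limit
    instead of the previously hard-coded 70."""
    response = requests.post(
        "http://127.0.0.1:5000/v1/completions",
        json={
            "prompt": prompt,
            "max_tokens": max_new_tokens,  # read from settings rather than fixed
            "temperature": 0.7,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["text"]
```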
Please see the newest release for the ability to set the max token count: https://github.com/balisujohn/localwriter/releases/tag/v0.0.5
Thank you for the quick update. Could you explain the added options and give some advice on what a user should enter/select for common use cases?
I added some explanations in the "Settings" section of the README.
Also, if you are using ollama and Extend Selection always starts a new sentence instead of continuing your current one, you can use text-generation-webui instead and this problem will not occur. I am looking into how to make ollama work correctly, but it seems to produce different behavior than text-generation-webui for the same API request.
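In case it helps anyone digging into this: my unverified guess is that ollama applies the model's prompt template by default, which tends to make replies start a fresh sentence. A raw-mode request like the sketch below bypasses the template so the model continues the text verbatim. The model name, URL, and the guess itself are assumptions, not a confirmed fix:

```python
import requests

def ollama_continue(text: str, max_new_tokens: int) -> str:
    """Ask ollama to continue `text` directly, skipping prompt templating."""
    response = requests.post(
        "http://127.0.0.1:11434/api/generate",
        json={
            "model": "llama3",                      # placeholder model name
            "prompt": text,
            "raw": True,                            # send the text as-is, no chat template
            "stream": False,
            "options": {"num_predict": max_new_tokens},
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]
```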