Martin Bielik
Looks nice! You could do a similar thing, but without the &filetype annotation, by using a selection boundary:

````vim
let g:vim_ai_chat = {
\  "options": {
\    "selection_boundary": "```",
\  },
\}
````
In this case I can imagine making it a little bit smarter: if the boundary is ```, it is considered a standard markdown boundary and the filetype is added...
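A rough sketch of that idea in Python (the helper name `render_boundary` and the `FENCE` constant are just for illustration, not the plugin's actual code):

```python
FENCE = "`" * 3  # built up programmatically to avoid a literal fence in this snippet

def render_boundary(boundary: str, filetype: str) -> str:
    """If the configured boundary is a standard markdown fence,
    append the buffer's filetype so the opening fence becomes
    e.g. a python-annotated fence; any other boundary is left as-is."""
    if boundary == FENCE and filetype:
        return boundary + filetype
    return boundary
```

The closing fence would of course stay bare, only the opening one gets the annotation.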
Right, there is currently no built-in support for other AI providers. However, it is possible to configure the plugin with any OpenAI-compatible API. There are currently many projects providing an OpenAI-compatible...
Thanks for the detailed explanation! I think "effective max" is a good idea. It is, however, quite challenging to count tokens in bare Python without using any external deps/libs....
Maybe for counting tokens it would be just fine to use a simple approximation like 1 token ≈ ¾ of a word, as described here: https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them Also it might make sense to...
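For illustration, that heuristic is a one-liner in plain Python with no external deps (just a sketch, the function name is made up):

```python
def estimate_tokens(text: str) -> int:
    """Rough token count using the OpenAI rule of thumb that
    1 token is about 3/4 of a word, i.e. tokens ~= words / 0.75."""
    words = len(text.split())
    return max(1, round(words / 0.75))
```

It will be off for code-heavy or non-English text, but as an "effective max" safety margin that is probably acceptable.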
Hi, ignoring any JSONDecodeError does not feel right. Could you find out what exactly LM Studio sends that makes the command fail? Maybe we could handle it specifically.
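For illustration, a specific handler could look something like this, assuming the failures come from non-JSON SSE lines such as keep-alives or the `[DONE]` sentinel (just a guess until we know what LM Studio actually sends):

```python
import json

def parse_stream_line(line: str):
    """Parse one line of an OpenAI-compatible SSE stream.
    Returns the decoded payload dict, or None for lines that are
    legitimately non-JSON (blank keep-alives, the [DONE] sentinel).
    Anything else is surfaced instead of being silently swallowed."""
    line = line.strip()
    if not line or not line.startswith("data:"):
        return None  # keep-alive / comment line
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None  # end-of-stream sentinel
    try:
        return json.loads(payload)
    except json.JSONDecodeError as err:
        # unexpected garbage: fail loudly so the real cause shows up
        raise ValueError(f"unexpected non-JSON chunk: {payload!r}") from err
```

That way the known benign cases are skipped explicitly and a real protocol problem still raises.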
Hi, sure, that would be a nice feature. I need to find some time to review and test it.
Hi, I am sorry, but I don't understand what you are trying to accomplish. Could you be more specific?
I see, in other words you are trying to edit a buffer based on the context from the chat. What could help you is to ask in the chat to "Apply...
I believe so. That is just to sketch out the idea; the implementation could of course be more complicated.