vim-ai
Enable use of virtual text rather than buffer for inserting streaming completions
Hi martin, thanks again for this plugin. It is a dream to work with.
One issue I face: when streaming a completion into a code file whose syntax I have auto-linters set up for, the new text streams into the buffer quickly, but the linters then try to process each intermediate version of the file after every inserted token, clobbering my Vim until the backlog clears. The quickfix window screams through a pile of syntax errors because tokens are emitted mid-word and so on.
The other GPT Vim plugin I know of, but don't like as much, solves this by using a virtual text indicator: issue, resolution.
Unfortunately, this is way more sophisticated vimscript than I am able to understand at a glance. I would love to learn how to implement this, but it will probably be a while.
Hi, thanks for reporting this! I think what could help in your case is to turn off diagnostics before the completion and turn them back on once it is complete. I am not aware of any generic way to toggle diagnostics in Vim/Neovim; it is probably specific to the plugin you use for linting/syntax. So you could try writing a custom function/command that does the above, along the lines of the sketch below.
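A minimal sketch, assuming the ALE linter and vim-ai's `:AI` command (the `AIQuiet` wrapper name is made up; other linters or LSP setups would need their own enable/disable commands):

```vim
" Pause ALE while the completion streams into the buffer,
" then restore it once the request has finished (or failed).
function! s:AIQuiet(prompt) abort
  " Stop ALE from re-linting this buffer after every inserted token.
  ALEDisableBuffer
  try
    execute 'AI ' . a:prompt
  finally
    " Re-enable linting so the final text is checked exactly once.
    ALEEnableBuffer
  endtry
endfunction

command! -nargs=1 AIQuiet call s:AIQuiet(<q-args>)
```

Then `:AIQuiet your prompt` should behave like `:AI your prompt`, just with ALE silenced for the duration.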
For what it's worth, I am using Vim with the YouCompleteMe and ALE plugins configured for different kinds of languages, and I don't notice any diagnostic clutter while the AI is generating a response.
As for using virtual text instead of the buffer, that sounds like a nice feature. I am not very familiar with virtual text, but any contributions are welcome!
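For anyone who wants to explore that, here is a rough sketch of the idea (Neovim only; the names are made up and it glosses over splitting the stream into lines): accumulate streamed chunks as virtual lines via extmarks, then write them to the buffer in a single edit at the end, so linters never see partial text.

```vim
" Rough sketch: show streamed output as virtual lines instead of buffer edits.
let s:ns = nvim_create_namespace('ai_preview')
let s:preview_lines = []

function! s:ShowChunk(text) abort
  " Render accumulated chunks as virtual lines below the cursor;
  " the buffer is untouched, so no lint/TextChanged events fire.
  call add(s:preview_lines, [[a:text, 'Comment']])
  call nvim_buf_clear_namespace(0, s:ns, 0, -1)
  call nvim_buf_set_extmark(0, s:ns, line('.') - 1, 0,
        \ {'virt_lines': s:preview_lines})
endfunction

function! s:AcceptPreview() abort
  " Replace the preview with one real edit once streaming is done,
  " so the linters run a single time on the final text.
  call nvim_buf_clear_namespace(0, s:ns, 0, -1)
  call append(line('.'), map(copy(s:preview_lines), 'v:val[0][0]'))
  let s:preview_lines = []
endfunction
```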