Martin Bielik
Thanks @dfadev, I applied your patch and it works well. Now it is possible to use o1 models. This is an example of how to do it with a role: ```ini...
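For reference, a minimal sketch of what such a role entry could look like in the roles config file. The role name, prompt, and option values below are my assumptions, not the exact snippet from the comment:

```ini
; hypothetical role for an o1 model in the vim-ai roles config
; (role name, prompt and option values are illustrative assumptions)
[o1-chat]
prompt = You are a concise assistant.
options.model = o1-mini
options.stream = 0
options.temperature = 1
```

A role like this would then be picked per invocation, e.g. something like `:AI /o1-chat how do I ...`.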
Hi, thanks for the prompt review! I couldn't recall the issue with the last visual selection, so I tested all the use cases I could think of and hope this is...
In the commit above, there was an intended breaking change: the plugin does not include the current line in the prompt (unless it is visually selected). Is this still an issue then?
The motivation is that hitting `:AI` without a selection should just quickly answer questions, without the user thinking about which line the cursor is on. If they want to include the line, they can...
I see, now I remember this weird issue. What do you think of using `histget(':', -1)` to determine whether it is a visual selection, a range, or none? (visual selection should start...
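To illustrate the idea, here is a rough sketch (not the plugin's actual code; the helper name and the range pattern are assumptions): inspect the last entry in the `:` history and look at how the `:AI` invocation was prefixed.

```vim
" Sketch: classify the most recent command-line history entry.
function! s:DetectInvocation() abort
  let l:last_cmd = histget(':', -1)
  if l:last_cmd =~# "^'<,'>"
    " visual selection: Vim inserts the '<,'> range automatically
    return 'visual'
  elseif l:last_cmd =~# '^\s*[0-9.$%]'
    " explicit range such as :5,10AI, :5AI or :%AI
    " (a real check would also cover marks and other range forms)
    return 'range'
  endif
  " plain :AI without a selection or range
  return 'none'
endfunction
```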
The funny thing is that I can reproduce all the buggy scenarios above with the version before the commit, e.g. https://github.com/madox2/vim-ai/commit/ea83fdc59b64843f570c94d34d97c3131663d3e8 With `histget` I identified one problem: it would probably...
In this commit https://github.com/madox2/vim-ai/commit/3122b848ee4b32986a5dda739c3ea73b4e696304 I have re-enabled single-line ranges. I still wonder why `g:vim_ai_is_selection_pending` was introduced, because it doesn't seem to fix the issues above.
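As a side note on telling these cases apart without the history trick, here is a hedged sketch (the command and function names are made up for illustration, not the plugin's real `:AI` definition) using the `<range>` replacement in a user command, which reports how many range items were given (0, 1 or 2):

```vim
" Illustrative command, not vim-ai's actual implementation.
command! -range -nargs=* AIDemo call s:HandleRange(<range>, <line1>, <line2>, <q-args>)

function! s:HandleRange(range_count, first, last, prompt) abort
  if a:range_count == 0
    echo 'no range given'                       " plain :AIDemo
  elseif a:range_count == 1
    echo 'single line: ' . a:first              " e.g. :5AIDemo
  else
    echo 'range ' . a:first . '-' . a:last      " e.g. :'<,'>AIDemo or :5,10AIDemo
  endif
endfunction
```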
The way I currently think of MCP is that implementing it is quite provider-specific. It might also require more alignment, as it would allow the LLM to be autonomous to...
Great job! The new `openai-mcp` provider sounds great. One of the reasons for moving MCP logic into an external provider is that I don't want to introduce a 3rd-party Python dependency...
I am quite hesitant to introduce an external dependency, even though it's not hard. Many Vim installations come with a pre-built Python and would have no option to set it up. I...