Martin Bielik

Results 51 comments of Martin Bielik

There are tools like [litellm](https://github.com/BerriAI/litellm) that expose an OpenAI-compatible proxy for most LLM providers. But as you said, you need to host it yourself or run it locally

This plugin can currently be configured with OpenAI-compatible HTTP endpoints. Does Copilot Chat expose an HTTP API?
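As a minimal sketch of what such a configuration could look like, assuming a litellm proxy running locally on port 4000 (the port, model routing, and option values below are assumptions, not confirmed defaults):

```vim
" Point vim-ai at a local OpenAI-compatible proxy (e.g. litellm).
" endpoint_url and enable_auth are vim-ai options; the URL is illustrative.
let g:vim_ai_chat = {
\  'options': {
\    'endpoint_url': 'http://localhost:4000/v1/chat/completions',
\    'enable_auth': 0,
\  },
\}
```

Any backend that speaks the OpenAI chat completions protocol could be swapped in the same way.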

Hi, thanks for reporting this! I think what could help in your case is to turn off diagnostics before completion and turn them back on once it is complete. I...
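A rough sketch of that idea for Neovim's built-in diagnostics (the command names are hypothetical, and `vim.diagnostic.enable(false)` assumes Neovim 0.10+; older versions used `vim.diagnostic.disable()`):

```vim
" Illustrative commands to pause diagnostics around an AI completion.
command! AIDiagOff lua vim.diagnostic.enable(false)
command! AIDiagOn  lua vim.diagnostic.enable(true)
```

You would run `:AIDiagOff` before triggering the completion and `:AIDiagOn` after it finishes.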

What is the motivation behind this feature? Enabling full markdown conflicts with the current aichat file highlighting: ![image](https://github.com/madox2/vim-ai/assets/15847857/fcdac29c-134d-4afe-a1f0-a22419d05291)

Yes, the screenshot is taken with `code_syntax_enabled = 1`. The problem is that markdown is applied to the whole file, not just to the fenced replies.

I am not sure if syntax highlighting of the user input is needed. If you, for instance, use ChatGPT on the web, you cannot format your prompt. And it wouldn't...

Still having the same issue (see the screenshot in the comments above). I think you tried to fix it with >>>

Yes, I think this solution is much better and I would like to replace the current highlighting with it. But first I need to get it working. I...

I have just prototyped non-streaming support on the branch: [support-non-streaming](https://github.com/madox2/vim-ai/tree/support-non-streaming), but I didn't have time to test it properly. Let me know if it works for you