model.nvim
Neovim plugin for interacting with LLMs and building editor-integrated prompts.
Hello, thank you for a great plugin! Sometimes it is convenient to prepare the "assistant" response before sending the whole prompt to an LLM. E.g. I can type manually: ```...
Hi! I wanted to ask for support for [TGI](https://github.com/huggingface/text-generation-inference) as a provider. I can probably work on this later this week.
Just some basic linting. At some point, adding (something like) [Pre-Commit](https://pre-commit.com/) might be nice. I'm in favor of a line length of 79 (or less) because I'm one of the silly people...
Hi there! I like the abstraction of this plugin. Is there any way to make it work with GitHub Copilot?
- [ ] give a full setup example with options using lazy.nvim
- [ ] better document how to author prompts (input, context)
- [ ] guide to adding providers...
We should notify the user if a `request_completion` finishes successfully but returns no text.
This PR adds a Groq Cloud API provider to the defaults. Groq's API is already blazingly fast, so we probably don't need streaming (at least for text modalities). The chat example...
Apologies in advance if I'm wrong, but I don't think there's any support for prompt templates (I didn't find anything in the README, at least)? If not, it would be really nice...