llm
Access large language models from the command-line
Some LLM providers, such as OpenRouter, can use their built-in web_search function by adding extra JSON parameters to the input: { "model": "openai/gpt-4.1", "plugins": [ { "id": "web", "max_results": 20, //...
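A minimal sketch of the full request body this implies, assuming OpenRouter's chat-completions schema; only the `model` and `plugins` fields come from the snippet above, the `messages` field is an assumption for illustration:

```python
import json

# Sketch of the extra JSON body that enables OpenRouter's built-in web plugin.
# "model" and "plugins" are taken from the snippet above; "messages" is an
# assumed placeholder, not part of the original request.
payload = {
    "model": "openai/gpt-4.1",
    "plugins": [
        {"id": "web", "max_results": 20},
    ],
    "messages": [{"role": "user", "content": "Summarize today's headlines"}],
}

# This is what would be posted as the request body.
body = json.dumps(payload)
```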
Please add support for local models via lmstudio.ai, or my favorite, https://github.com/Mozilla-Ocho/llamafile. This is good for my own code.
I was trying to query responses for a specific uuid that I know should appear there and got an error ``` $ llm logs --query "CFF90D7C-A2EC-4437-AEBF-5BA515430241" Traceback (most recent call...
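One plausible cause (an assumption; the truncated traceback does not confirm it) is that the query is passed to SQLite full-text search, where hyphens in a bare term have special meaning. A sketch of quoting such a term so FTS treats it as a single phrase:

```python
# Hypothetical escaping helper: wrap a hyphenated term (like a UUID) in double
# quotes so an FTS engine reads it as one phrase instead of operator syntax.
# This is a sketch of the idea, not code from llm itself.
def escape_fts_query(term: str) -> str:
    return '"' + term.replace('"', '""') + '"'
```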
It would be helpful if invoking `llm chat` started the software directly in multi-line mode instead of requiring `!multi` after launch. Could we add a parameter like `--multi` (for example,...
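A hypothetical sketch of how such a flag could be wired up, using argparse rather than llm's actual CLI stack; the `--multi` name comes from the request above, everything else is an assumption:

```python
import argparse

# Sketch of the proposed flag: start the chat session in multi-line mode
# without needing to type !multi after launch. Not llm's actual code.
parser = argparse.ArgumentParser(prog="llm-chat-sketch")
parser.add_argument("--multi", action="store_true",
                    help="start the chat in multi-line input mode")

def start_mode(argv):
    """Return which input mode the chat session would start in."""
    args = parser.parse_args(argv)
    return "multi-line" if args.multi else "single-line"
```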
This PR adds new functionality to the CLI to perform batch inference on a file of prompts and write the results to a file. The approach is simple, using a...
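A minimal sketch of that approach, one prompt per line in and one response per line out, with `run_model` as a hypothetical stand-in for the actual model call:

```python
# Batch inference sketch: read prompts from a file (one per line), run each
# through the model, and write responses to an output file in the same order.
def run_model(prompt: str) -> str:
    # Placeholder for the real model invocation.
    return f"echo: {prompt}"

def batch_infer(in_path: str, out_path: str) -> int:
    """Run every non-blank line of in_path through the model; return the count."""
    count = 0
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            prompt = line.strip()
            if not prompt:
                continue  # skip blank lines
            dst.write(run_model(prompt) + "\n")
            count += 1
    return count
```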
When the template name is also the name of a directory in the current directory, the llm command fails: `$ llm --model my_model --template also_a_directory_name -- "Hello"` ``` Traceback (most...
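A sketch of a lookup that sidesteps the collision by resolving the name strictly to a template file, so a same-named directory in the working directory cannot shadow it; the templates-directory layout and `.yaml` extension are assumptions, not llm's actual resolution logic:

```python
from pathlib import Path

# Resolve a template name to a .yaml file inside a known templates directory,
# ignoring any directory that happens to share the template's name.
def resolve_template(templates_dir: str, name: str) -> Path:
    candidate = Path(templates_dir) / f"{name}.yaml"
    if not candidate.is_file():
        raise FileNotFoundError(f"No template named {name!r}")
    return candidate
```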
Let me know if there's anything else I should be doing to be a good maintainer of these tools!
added MCP plugin
After doing a `brew upgrade` on my Mac, whenever I open a new terminal I see the following warning: `bash: complete: nosort: invalid option name` I'm using the default bash:...
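The `nosort` completion option was added in bash 4.4, while macOS ships bash 3.2 by default, so a completion script generated for a newer bash can trigger this warning. A sketch of the version guard such a script could apply, expressed in Python for illustration (the function and branch are assumptions about how a generated script might decide):

```python
# Decide whether a generated bash completion script may safely use
# "complete -o nosort": the option only exists in bash >= 4.4.
def completion_opts(major: int, minor: int) -> str:
    if (major, minor) >= (4, 4):
        return "-o nosort"
    return ""
```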