llm
Access large language models from the command-line
I am going to add and implement this plugin. Plugin repo: https://github.com/h0rv/llm-watsonx
This is Python 3.12.0 on Windows 11 in a venv with `pip install llm llm-gpt4all`. `llm -m Meta-Llama-3-8B-Instruct "What's a double dual?"` works, but `llm chat -m Meta-Llama-3-8B-Instruct` results in:...
This is an implementation of `prefill` in the API. It's not yet implemented in the CLI -- submitting this for review first. We've done the OpenAI implementation of it. OpenAI...
These commits add the `remove` functionality to `llm keys`, add a test for that option, and update the relevant docs section.
Claude 3 and other models (like Reka) support prefill, where you can construct a chat but set the first tokens of the model's reply. I use that in `datasette-query-assistant` here:...
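To make the prefill pattern concrete, here is an illustrative Python sketch (not the llm library's actual API): "prefill" means the conversation sent to the model ends with a partial *assistant* turn, and the model continues generating from those tokens instead of starting fresh. The `with_prefill` helper is hypothetical.

```python
def with_prefill(messages, prefill):
    """Append a partial assistant message so the model must continue it.

    Hypothetical helper for illustration. `messages` is a list of
    {"role": ..., "content": ...} dicts ending with a user turn;
    `prefill` is the forced start of the model's reply.
    """
    if messages and messages[-1]["role"] == "assistant":
        raise ValueError("conversation already ends with an assistant turn")
    return messages + [{"role": "assistant", "content": prefill}]


convo = [{"role": "user", "content": "Write a SQL query that counts rows in t."}]
# Forcing the reply to begin inside a fenced SQL block, in the spirit of
# the datasette-query-assistant prompting style mentioned above:
prefilled = with_prefill(convo, "```sql\n")
```

The point of seeding the reply with an opening fence is that the model's first generated tokens are already "inside" the SQL block, which makes the output far easier to parse.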
Frequently the LLM starts typing an answer and I want to give it more feedback because it's not quite typing the right thing. I don't know how to stop the...
While working with a system prompt for LaTeX generation, I found that `$` characters in a template system prompt that don't refer to a template variable will cause an error...
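Assuming the templates are rendered with Python's `string.Template` (where `$name` marks a placeholder), a stray `$` triggers exactly this kind of error at render time, and `$$` is the stdlib's escape for a literal dollar sign:

```python
from string import Template

# A bare "$" that isn't followed by a valid placeholder name raises
# ValueError when the template is substituted:
try:
    Template(r"Inline math: $ \alpha + \beta $").substitute()
except ValueError as e:
    print("render failed:", e)

# Doubling the dollar sign escapes it, so LaTeX survives substitution:
t = Template(r"Solve $$x^2 = $n$$ for x")
print(t.substitute(n=4))  # -> Solve $x^2 = 4$ for x
```

If llm's templates do use this mechanism, writing `$$` wherever LaTeX needs a literal `$` would be the workaround.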
For Claude models a common pattern is to provide examples as a bunch of fake user/assistant messages:
```bash
curl https://api.anthropic.com/v1/messages \
  --header "x-api-key: $ANTHROPIC_API_KEY" \
  --header "anthropic-version: 2023-06-01" \
  --header...
```
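The shape of that fake-transcript trick can be sketched in Python; `few_shot_messages` is a hypothetical helper (not part of llm) that builds the `messages` array such a request would post:

```python
def few_shot_messages(examples, question):
    """Render (input, output) example pairs as a fake user/assistant
    back-and-forth, followed by the real question as the final user turn.

    Hypothetical helper for illustration only.
    """
    messages = []
    for user_text, assistant_text in examples:
        messages.append({"role": "user", "content": user_text})
        messages.append({"role": "assistant", "content": assistant_text})
    messages.append({"role": "user", "content": question})
    return messages


examples = [
    ("Extract the city: 'Shipped from Portland'", "Portland"),
    ("Extract the city: 'Arrives in Oslo Tuesday'", "Oslo"),
]
msgs = few_shot_messages(examples, "Extract the city: 'Returned to Lagos'")
# msgs alternates user/assistant turns and ends with the real user question.
```

The fabricated assistant turns act as in-context examples: the model sees what "its own" prior answers looked like and imitates that format for the final question.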
Inspired by this comment in https://simonwillison.net/2024/Apr/22/llama-3/#local-llama-3-70b-instruct-with-llamafile > One warning about this approach: if you use LLM like this then every prompt you run through llamafile will be stored under the...
Allow setting all options via environment variables using `LLM` as the default prefix. Note that for groups (subcommands) the prefix includes the group name: `LLM_{group:upper}`. Example:
```shell
$ LLM_CHAT_MODEL_ID=mistral-7b-instruct-v0 llm chat...
```
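The naming convention mirrors Click's `auto_envvar_prefix` feature (llm is built on Click): prefix, then the subcommand path, then the option name, uppercased with dashes mapped to underscores. A sketch of the name computation, under that assumption:

```python
def auto_envvar(prefix, command_path, option):
    """Compute the environment variable name for an option under a
    Click-style auto_envvar_prefix convention.

    Sketch of the naming rule only; llm's actual wiring may differ.
    """
    parts = [prefix, *command_path, option]
    return "_".join(p.upper().replace("-", "_") for p in parts)


# The proposed example above, `llm chat --model-id ...`:
print(auto_envvar("LLM", ["chat"], "model-id"))  # -> LLM_CHAT_MODEL_ID
```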