
Access large language models from the command-line

100 llm issues, sorted by recently updated

I need to clear the chat history before getting a response in my Python code. Is there an easy way to do this?
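One minimal sketch, assuming the documented Python API pattern (`llm.get_model()` and `model.conversation()`; the model name is just an example): a conversation object holds the history, so starting a fresh one means the next prompt is sent with no prior messages.

```python
import llm

model = llm.get_model("gpt-4o-mini")  # example model name; needs a configured key

# A conversation object accumulates history across prompts.
conversation = model.conversation()
print(conversation.prompt("Remember the number 42.").text())

# To "clear" the history, start a fresh conversation object;
# the next prompt carries none of the earlier messages.
conversation = model.conversation()
print(conversation.prompt("What number did I ask you to remember?").text())
```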

Is there a way to reinstall or delete a model? One of them got corrupted on download and I can't figure out how to remove it or reinstall it.

Hi, I've been tinkering lately with using a meta prompt to generate better prompts. The idea is roughly that you give `python for fibonacci sequence` to an llm that takes...
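A rough sketch of that two-stage idea using the Python API (the meta prompt wording and the model name are placeholders, not anything from this issue):

```python
import llm

model = llm.get_model("gpt-4o-mini")  # placeholder model name

META_PROMPT = (
    "Rewrite the following terse request as a detailed, well-structured "
    "prompt for a coding assistant. Return only the improved prompt.\n\n"
)

rough_request = "python for fibonacci sequence"

# Stage 1: ask the model to expand the terse request into a better prompt.
improved_prompt = model.prompt(META_PROMPT + rough_request).text()

# Stage 2: run the improved prompt as the real request.
answer = model.prompt(improved_prompt).text()
print(answer)
```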

To avoid unintentionally sending requests that are too large to the new 128k-context models and incurring unnecessary costs, it's crucial to put some safeguards in place. The parameter `--max-input-tokens` should...
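The flag doesn't exist yet; as a sketch of the kind of pre-flight guard being asked for, here is a token-count check using `tiktoken` (an assumption for illustration; the real tokenizer would depend on the model) that refuses to send a prompt over a chosen budget:

```python
import tiktoken

MAX_INPUT_TOKENS = 8_000  # hypothetical budget, i.e. the value --max-input-tokens would set


def check_input_size(prompt: str, model: str = "gpt-4") -> int:
    """Return the prompt's token count, raising if it exceeds the budget."""
    encoding = tiktoken.encoding_for_model(model)
    n_tokens = len(encoding.encode(prompt))
    if n_tokens > MAX_INPUT_TOKENS:
        raise ValueError(
            f"Prompt is {n_tokens} tokens, over the {MAX_INPUT_TOKENS} limit"
        )
    return n_tokens
```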

It would be nice if we had the ability to rerun the previous call to the LLM (with different parameters). This could be implemented in chat as well.

Hi! I've loved every second of using `llm` so I decided to code a langchain agent plugin for it. It's still in its infancy and changing a lot but giving...

I'm not sure this is in scope for this library, but do you plan to make it possible to upload files to the api as part of a chat completion...

`llm logs rm ` will remove the conversation with the given ID from the database. This removal should completely remove the data from the database, and not just add a...
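For anyone wanting to do this by hand today, a sketch against the SQLite log database. The `responses`/`conversations` table and column names here are assumptions about the current schema, and the path varies by platform (`llm logs path` prints the actual location), so verify both before running anything destructive.

```python
import sqlite3
from pathlib import Path

# Assumed location of llm's log database; adjust for your platform.
LOGS_DB = Path.home() / ".config" / "io.datasette.llm" / "logs.db"


def delete_conversation(conversation_id: str) -> None:
    """Hard-delete a conversation and its responses (table names assumed)."""
    with sqlite3.connect(LOGS_DB) as db:
        db.execute(
            "DELETE FROM responses WHERE conversation_id = ?", (conversation_id,)
        )
        db.execute("DELETE FROM conversations WHERE id = ?", (conversation_id,))
```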

https://twitter.com/simonw/status/1694089359514104094 A useful trick is sometimes to feed a model a prior conversation that includes things that the model didn't actually say - things like "Sure, I'd be happy to...
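The trick in message form, as a sketch using the OpenAI-style chat format (the model name is a placeholder; the point is the fabricated assistant turn):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

messages = [
    {"role": "user", "content": "Write a haiku about SQLite."},
    # Fabricated: the model never actually said this, but sending it as
    # prior context nudges the model toward this style of compliance.
    {"role": "assistant", "content": "Sure, I'd be happy to. Here it is:"},
    {"role": "user", "content": "Continue from where you left off."},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=messages,
)
print(response.choices[0].message.content)
```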


Currently, `llm logs list` does not merge the messages that share a conversation ID. It would be better if they were, in fact, merged. Current behavior (truncated): `# 2023-11-29T03:48:48 conversation: ...`
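A sketch of the kind of merge being asked for, grouping logged rows by conversation. The schema names used here (`responses`, `conversation_id`, `prompt`, `response`, `datetime_utc`) and the database path are assumptions, so check the real schema first.

```python
import sqlite3
from collections import defaultdict
from pathlib import Path

LOGS_DB = Path.home() / ".config" / "io.datasette.llm" / "logs.db"  # adjust per platform


def merged_conversations():
    """Group logged prompt/response rows by conversation_id (column names assumed)."""
    db = sqlite3.connect(LOGS_DB)
    db.row_factory = sqlite3.Row
    rows = db.execute(
        "SELECT conversation_id, datetime_utc, prompt, response "
        "FROM responses ORDER BY conversation_id, datetime_utc"
    )
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["conversation_id"]].append((row["prompt"], row["response"]))
    return grouped


for conversation_id, exchanges in merged_conversations().items():
    print(f"conversation: {conversation_id} ({len(exchanges)} exchanges)")
```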