[Feature Request] - Add cli option to show installed models
Hello! I'm new to the project and it wasn't quite clear to me how to see which models I have installed locally after installing them via gpt4all. After some research I see they are listed in ~/.cache/gpt4all.
It would be nice to be able to run `llm models --installed`, and/or to have an asterisk, [I], or some other indicator in the standard `llm models` listing that denotes installed models. Something like the rough sketch below is what I have in mind.
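For what it's worth, here's a very rough sketch of the check I mean. It just treats any file sitting under ~/.cache/gpt4all as an installed model, which may or may not match how the gpt4all plugin actually tracks downloads:

```python
# Rough sketch only -- assumes installed gpt4all models are simply the
# model files found in ~/.cache/gpt4all (my guess, not from the llm docs).
from pathlib import Path

GPT4ALL_CACHE = Path.home() / ".cache" / "gpt4all"

def installed_gpt4all_models() -> set[str]:
    """Return the stem names of model files found in the gpt4all cache."""
    if not GPT4ALL_CACHE.is_dir():
        return set()
    return {p.stem for p in GPT4ALL_CACHE.iterdir() if p.is_file()}

if __name__ == "__main__":
    # Print each installed model with the kind of [I] marker suggested above
    for name in sorted(installed_gpt4all_models()):
        print(f"[I] {name}")
```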
Maybe this functionality is better left to the plugin? I'm not sure how plugins interface with the main application, or whether they are able to implement their own subcommands.
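If plugins can register CLI commands (I think I saw a `register_commands` hook mentioned in the plugin docs, but I may be misreading), I imagine the plugin-side version would look roughly like this. Untested, and the command name `installed-models` is just a placeholder:

```python
# Hypothetical plugin sketch -- assumes the register_commands hook passes in
# the click command group so a plugin can attach its own subcommand.
from pathlib import Path

import click
import llm


@llm.hookimpl
def register_commands(cli):
    @cli.command(name="installed-models")
    def installed_models():
        "List model files found in the gpt4all cache directory"
        cache = Path.home() / ".cache" / "gpt4all"
        if not cache.is_dir():
            click.echo("No gpt4all cache directory found")
            return
        for path in sorted(cache.iterdir()):
            if path.is_file():
                click.echo(path.stem)
```

Happy to take a stab at a PR in whichever direction (core flag vs. plugin subcommand) makes more sense to the maintainers.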