Locally hosted language model (like llama.cpp)
I don't like that you have to pay OpenAI and send them your data to use the new AI smart picker tools.
It would be great if we could instead use locally hosted models, such as llama.cpp.
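For what it's worth, llama.cpp already ships an HTTP server with an OpenAI-compatible chat completions endpoint, so in principle the smart picker could be pointed at a local base URL instead of api.openai.com. A rough sketch of what that looks like (the model path and port here are placeholders, not a tested setup):

```shell
# Start llama.cpp's built-in server on a local model file
# (model path and port are examples).
./llama-server -m ./models/my-model.gguf --port 8080

# The server exposes an OpenAI-compatible endpoint, so an existing
# OpenAI client could target it by changing the base URL:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Since the request/response shape matches the OpenAI API, supporting this might only require making the API base URL configurable.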