NanoLLM
Server/Client architecture
Hello @dusty-nv !
I think the NanoLLM project could benefit from offering a server/client architecture, a little like what ollama does with `ollama serve` and the ollama client API.
This way, the client for NanoLLM could easily be run outside of the Docker container, or even on another device. I know you already worked on a web GUI, but that's not exactly the same use case IMHO.
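To make the idea concrete, here is a minimal, self-contained sketch of the kind of split I mean. Everything here is hypothetical (the `/api/generate` route and the JSON fields are just illustrative, loosely modeled on ollama's HTTP API; NanoLLM does not ship this today): the server process would run inside the container next to the model, while the client is just a plain HTTP POST that can run anywhere with network access to it.

```python
# Hypothetical sketch of a NanoLLM serve/client split (endpoint names and
# payload shapes are illustrative, not an existing NanoLLM API).
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen


class GenerateHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # In a real server this handler would call the loaded NanoLLM model;
        # here we just echo the prompt back to show the wire format.
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        reply = json.dumps({"response": f"echo: {body['prompt']}"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(reply)))
        self.end_headers()
        self.wfile.write(reply)

    def log_message(self, *args):
        # Keep the demo quiet.
        pass


# Server side: would live inside the container, next to the model weights.
server = HTTPServer(("127.0.0.1", 0), GenerateHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: a plain HTTP POST, runnable outside the container or on
# another device (swap 127.0.0.1 for the server's address).
req = Request(
    f"http://127.0.0.1:{server.server_port}/api/generate",
    data=json.dumps({"prompt": "hello"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    result = json.loads(resp.read())["response"]

print(result)
server.shutdown()
```

The point is just that once the model sits behind any HTTP endpoint, the client needs nothing but the standard library (or `curl`), so it no longer has to share the container, the GPU, or even the machine with the model.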