
Running Local LLM using FastAPI and Ollama

sbenhoff007 opened this issue 1 year ago • 0 comments

FastAPI provides a high-performance API framework for exposing LLM capabilities as a service. Ollama offers an efficient way to download and run LLM models. By combining the strengths of FastAPI, Ollama, and Docker, users can deploy a local LLM on their own infrastructure seamlessly.
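As a rough illustration of the pattern, here is a minimal sketch of a FastAPI service that forwards prompts to a locally running Ollama server over its HTTP API. The `/generate` route, the `llama3` model name, and the environment variable names are assumptions for the example, not something specified in this issue; it assumes Ollama is listening on its default port (11434) and that the model has already been pulled with `ollama pull llama3`.

```python
# Minimal sketch: FastAPI front end for a local Ollama server.
# Assumes Ollama is running at its default address and that the
# chosen model has already been pulled (e.g. `ollama pull llama3`).
import os

import httpx
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

OLLAMA_URL = os.getenv("OLLAMA_URL", "http://localhost:11434")  # default Ollama port
MODEL_NAME = os.getenv("MODEL_NAME", "llama3")  # assumed model name for this example

app = FastAPI(title="Local LLM API")


class PromptRequest(BaseModel):
    prompt: str


@app.post("/generate")
async def generate(req: PromptRequest) -> dict:
    """Forward the prompt to Ollama's /api/generate endpoint and return the text."""
    payload = {"model": MODEL_NAME, "prompt": req.prompt, "stream": False}
    async with httpx.AsyncClient(timeout=120.0) as client:
        try:
            resp = await client.post(f"{OLLAMA_URL}/api/generate", json=payload)
            resp.raise_for_status()
        except httpx.HTTPError as exc:
            raise HTTPException(status_code=502, detail=f"Ollama request failed: {exc}")
    return {"response": resp.json().get("response", "")}
```

With Ollama running (`ollama serve`), the service could be started with `uvicorn main:app --reload` (assuming the file is saved as `main.py`) and exercised with, for example, `curl -X POST localhost:8000/generate -H 'Content-Type: application/json' -d '{"prompt": "Hello"}'`. For a Dockerized deployment, the same app would simply be packaged into a container that points `OLLAMA_URL` at the Ollama container.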

sbenhoff007 · Oct 21 '24