openv0
Local LLM server for openv0 vs. OpenAI
Is it possible to use a free local server with an LLM model instead of OpenAI? https://ollama.ai
Sure, any server that exposes an OpenAI-compatible API will work. You can use Ollama for local testing.
Can someone write a simple guide on how to use it with Ollama? We can skip the installation, just the juicy part.
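Not a full guide, but here is a minimal sketch of the juicy part: Ollama exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, so an OpenAI-style client can be pointed there instead of at `api.openai.com`. The model name (`llama2`) is an assumption, as is the exact place openv0 reads its API base URL from; check your openv0 config for where that is set.

```python
# Minimal sketch: send an OpenAI-format chat request to a local Ollama server.
# Assumes Ollama is running on its default port (11434) and that a model such
# as "llama2" has been pulled; both are assumptions, adjust to your setup.
import json
import urllib.request

OLLAMA_BASE = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request aimed at Ollama."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key, but OpenAI clients expect one to be set.
            "Authorization": "Bearer ollama",
        },
    )

if __name__ == "__main__":
    # Requires a running Ollama server; otherwise this call will fail.
    req = build_chat_request("llama2", "Generate a React button component.")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

If you use the official `openai` client library instead, the same idea applies: set its `base_url` to `http://localhost:11434/v1` and pass any non-empty string as the API key.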