Can this only be used with Together.ai, or can it run locally?
You can use this app locally if you run an LLM on your machine, and you can run LLMs locally with Ollama: https://ollama.com/
This can work with any API: just change the base URL and the API key, and select the corresponding model in the dropdown.
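As a minimal sketch of the "change the base URL and API key" step, here is what an OpenAI-style chat request against a local Ollama server could look like. The base URL, the placeholder API key, and the model name `llama3` are assumptions: Ollama's OpenAI-compatible endpoint conventionally lives at `/v1` on the local server and ignores the key's value, and the model must match one you have pulled locally.

```python
import json
import urllib.request

# Assumed local-server settings (swap these for Together.ai or any other
# OpenAI-compatible provider):
BASE_URL = "http://localhost:11434/v1"  # Ollama's default local endpoint (assumption)
API_KEY = "ollama"                      # placeholder; Ollama ignores the value
MODEL = "llama3"                        # must match a locally pulled model (assumption)

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for the configured server."""
    payload = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

req = build_chat_request("Write a hello-world in Python")
# urllib.request.urlopen(req) would send it once an Ollama server is running.
```

The point is that only the three constants change between providers; the request shape stays the same, which is why swapping the base URL and key is enough when the client speaks the OpenAI-compatible protocol.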
I found that the code invokes the together-api package, which does not appear to be compatible with Ollama's API.
I am forking this repo and working on this, but my device only has 16 GB of RAM and can't run local models very well. Here is the link: ollamacoder