shahab
I used oobabooga's text-generation-webui to run the 1.3b model. I selected Transformers from the loader drop-down and it works fine. I'm sure you can run the bigger models too...
> I think you can use it with local models powered by Ollama

It uses Ollama for some parts, like processing web search results, but not for coding....
Same here.
I also have the same problem, running on Ubuntu 22 on WSL2
> > I also have the same problem, running on Ubuntu 22 on WSL2
>
> @shahab00x , were you able to troubleshoot using [#344 (comment)](https://github.com/OpenDevin/OpenDevin/issues/344#issuecomment-2027840656)?
>
> @Fujimon416 , I...
> > Is there a way to modify `uvicorn opendevin.server.listen:app --port 3000` to access the server over the network rather than the localhost?
>
> Yup! You should be able...
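The truncated reply presumably refers to uvicorn's `--host` option, which controls the interface the server binds to. A minimal sketch (the entry point `opendevin.server.listen:app` and port are taken from the quoted question):

```shell
# Bind to all interfaces instead of the default 127.0.0.1 so other
# machines on the local network can reach the server on port 3000.
uvicorn opendevin.server.listen:app --host 0.0.0.0 --port 3000
```

Note that `0.0.0.0` exposes the server to anything that can reach the machine, so a firewall rule or binding to a specific LAN address may be preferable.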