
Option to use Ollama with local models instead of OpenAI/ChatGPT

jefferyb opened this issue · 2 comments

Hi team, I was wondering whether the OpenDevin project could offer an option to use Ollama with local models instead of OpenAI/ChatGPT.

I was thinking it could provide some benefits, such as:

  • Cost savings: Ollama is open source and runs models locally, so there are no per-request API fees, which could significantly reduce cost compared to OpenAI/ChatGPT.

  • Flexibility & security: Ollama supports many different models (and you can create your own), which could be a better fit for scenarios where data privacy or security is a concern.

  • It would also benefit those who prefer to keep their data on-premises/local.
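
For illustration, here is a minimal sketch of what talking to a local model could look like, assuming Ollama is running on its default port (11434) and a model such as `llama2` has already been pulled with `ollama pull llama2`. The function name `ask_local_model` is just an example for this issue, not part of OpenDevin's actual LLM interface:

```python
# Minimal sketch: querying a local Ollama server instead of the OpenAI API.
# Assumes Ollama is running locally on its default port 11434 and that the
# "llama2" model has already been pulled (`ollama pull llama2`).
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(prompt: str, model: str = "llama2") -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Write a one-line Python hello world."))
```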

Just thought I would put it out there and ask :) Thank you for your time and consideration. -Jeffery

jefferyb · Mar 25 '24 16:03