
Besides Refact.ai-hosted models, support working with any Ollama-managed local models

Open heddaz opened this issue 11 months ago • 5 comments

Hi team, I saw that the Refact.ai self-hosted version can support third-party APIs by adding a URL and key. Could you do something similar to support any Ollama-managed local model for code completion / chat / fine-tuning?

Thanks a lot.

heddaz avatar Feb 07 '25 10:02 heddaz

I would love to test it out :) I'm looking forward to trying DeepSeek R1 with the Refact agent.

ukromalybot avatar Feb 21 '25 18:02 ukromalybot

I set it up hoping to use it with Ollama, but no luck with that :/

zimdin12 avatar May 14 '25 22:05 zimdin12

I got it running, but it doesn't work correctly. I used the model id ollama/qwen3:30b and did not include the /v1 suffix in the API base URL.

zimdin12 avatar May 15 '25 00:05 zimdin12
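For context (this is about Ollama generally, not Refact-specific behavior): an Ollama server exposes two API shapes on its default port 11434, an OpenAI-compatible one under the /v1 path and its own native one under /api, so whether the base URL needs /v1 depends on which protocol the client speaks. A minimal sketch of the two endpoint shapes, using the qwen3:30b model tag from the comment above (note there is no "ollama/" prefix in the tag Ollama itself expects):

```python
import json

# Default Ollama host; adjust if your server runs elsewhere.
OLLAMA_HOST = "http://localhost:11434"

# OpenAI-compatible route: the base URL must include the /v1 suffix,
# so chat requests go to /v1/chat/completions.
openai_compat_url = f"{OLLAMA_HOST}/v1/chat/completions"

# Ollama's native route: no /v1 — chat requests go to /api/chat.
native_url = f"{OLLAMA_HOST}/api/chat"

# Example request body; the model tag is whatever `ollama list` shows,
# without any provider prefix like "ollama/".
payload = {
    "model": "qwen3:30b",
    "messages": [{"role": "user", "content": "Hello"}],
}

print(openai_compat_url)
print(native_url)
print(json.dumps(payload))
```

If a client only lets you set a base URL, point OpenAI-style clients at http://localhost:11434/v1 and native Ollama clients at http://localhost:11434 — mixing the two is a common source of "it runs but doesn't work" symptoms.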

Have you tried another approach: setting up the Ollama provider in the plugin settings? (Screenshots of the settings were attached.)

hazratisulton avatar Jun 06 '25 09:06 hazratisulton

Hey @zimdin12 @HeddaZ 👋 If you’d like to add new models or providers, feel free to open a PR — contributions are always welcome! We’ve included a Contributor Guide to help you get started easily.

avie66 avatar Jun 24 '25 09:06 avie66