
Integration with LM studio for running local models

Open Zirgite opened this issue 1 year ago • 0 comments

Is it possible to integrate with LM Studio? LM Studio can run a local server that exposes an OpenAI-compatible API. I know there is support for Ollama, but since Ollama has no native Windows support, messing with WSL is out of the question. None of the proposed solutions seems to work, so I think this issue should not be closed while no working solution has been proposed.
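For reference, here is a minimal sketch of the approach people usually suggest: since crewAI's default LLM backend speaks the OpenAI API, you can redirect it to LM Studio's local server (which defaults to `http://localhost:1234/v1`) via environment variables. This assumes LM Studio's "Local Server" is running with a model loaded; the API key and model name below are placeholders, since LM Studio does not validate them.

```python
import os

# Point OpenAI-compatible clients (including crewAI's default backend)
# at LM Studio's local server instead of api.openai.com.
# Assumption: LM Studio's server is running on its default port 1234.
os.environ["OPENAI_API_BASE"] = "http://localhost:1234/v1"

# LM Studio ignores the API key, but the client requires a non-empty value.
os.environ["OPENAI_API_KEY"] = "lm-studio"

# Placeholder model name; LM Studio serves whichever model is loaded in the UI.
os.environ["OPENAI_MODEL_NAME"] = "local-model"
```

With these variables set before crewAI is imported, agents should send their completion requests to the local server. Whether this works end to end may depend on the crewAI and LM Studio versions in use.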

Zirgite · Jan 18 '24 09:01