crewAI
Integration with LM Studio for running local models
Is it possible to integrate with LM Studio? LM Studio can run a local server that exposes an OpenAI-compatible API. I know there is support for Ollama, but since Ollama does not have native Windows support, messing with WSL is out of the question. None of the proposed solutions seems to work, so I think this issue should not be closed, as no working solution has been proposed.
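Since LM Studio's local server speaks the OpenAI API, one possible workaround (a sketch, not an official crewAI recipe) is to point the standard OpenAI environment variables at it before constructing any agents. The endpoint below is LM Studio's default (`http://localhost:1234/v1`); the API key is a dummy value because LM Studio ignores it but OpenAI clients require one to be set, and the model name is a placeholder you would replace with whatever model you have loaded:

```python
import os

# Point any OpenAI-compatible client (including the ones crewAI uses
# under the hood) at LM Studio's local server instead of api.openai.com.
os.environ["OPENAI_API_BASE"] = "http://localhost:1234/v1"  # LM Studio default port
os.environ["OPENAI_API_KEY"] = "lm-studio"  # dummy key; LM Studio does not check it
os.environ["OPENAI_MODEL_NAME"] = "local-model"  # placeholder; use your loaded model's name

# Set these BEFORE importing/creating crewAI agents, so the underlying
# client picks up the local base URL when it is instantiated.
```

Whether this works depends on the crewAI version reading these variables at client construction time, so it may need adjusting; the key idea is simply redirecting the OpenAI base URL to the local server.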