mikeaper323

Results: 7 comments of mikeaper323

Try the following settings for LM Studio:

LLM_API_KEY="lm-studio"
LLM_MODEL="openai/mistral"   (leave "openai/" as is; change "mistral" to the local model you use)
LLM_BASE_URL="http://localhost:1234/v1"
LLM_EMBEDDING_MODEL="local"

Try these settings for LM Studio:

LLM_API_KEY="lm-studio"
LLM_BASE_URL="http://localhost:1234/v1"
LLM_MODEL="openai/dolphin-2.5-mixtral-8x7b-GGUF/dolphin-2.5-mixtral-8x7b.Q2_K.gguf"
LLM_EMBEDDING_MODEL="local"
WORKSPACE_DIR="./workspace"

LLM_API_KEY="lm-studio"
LLM_BASE_URL="http://localhost:1234/v1"
LLM_MODEL="openai/dolphin-2.5-mixtral-8x7b-GGUF"
LLM_EMBEDDING_MODEL="local"
WORKSPACE_DIR="./workspace"
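The settings above all point LLM_BASE_URL at LM Studio's OpenAI-compatible local server on port 1234. A minimal sketch to confirm the server is actually reachable before blaming the rest of the config (the helper names `models_url` and `list_models` are mine, not part of the project):

```python
import json
import urllib.request


def models_url(base_url: str) -> str:
    """Build the /models endpoint URL from the LLM_BASE_URL value."""
    return base_url.rstrip("/") + "/models"


def list_models(base_url: str = "http://localhost:1234/v1") -> dict:
    """Query the OpenAI-compatible /models endpoint and return the JSON reply.

    Requires LM Studio's local server to be running with a model loaded.
    """
    with urllib.request.urlopen(models_url(base_url), timeout=5) as resp:
        return json.load(resp)

# Example (only works while LM Studio's server is up):
#   list_models()  -> JSON listing of the loaded model(s)
```

If this call fails to connect, the problem is the server or the port, not the LLM_MODEL string.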

Yes. WSL on Windows, LM Studio on Windows, a Conda PowerShell environment. I followed all the project instructions.

Make sure the API key is LLM_API_KEY="lm-studio", not LLM_API_KEY="lmstudio". And make sure you haven't changed LM Studio's server port from the default. The only other thing I can think...
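Those two mistakes (the unhyphenated key and a non-default port) are easy to catch mechanically. A small sketch of a sanity check over the parsed config values, assuming the defaults used throughout these comments (`check_config` is a hypothetical helper, not part of the project):

```python
def check_config(cfg: dict) -> list:
    """Return a list of likely misconfigurations for an LM Studio setup.

    Checks the common mistakes mentioned above: the API key must be the
    hyphenated "lm-studio", the base URL should use LM Studio's default
    port 1234, and the model name should keep the "openai/" prefix.
    """
    problems = []
    if cfg.get("LLM_API_KEY") != "lm-studio":
        problems.append('LLM_API_KEY should be "lm-studio" (hyphenated)')
    if ":1234" not in cfg.get("LLM_BASE_URL", ""):
        problems.append("LLM_BASE_URL should use LM Studio's default port 1234")
    if not cfg.get("LLM_MODEL", "").startswith("openai/"):
        problems.append('LLM_MODEL should keep the "openai/" prefix')
    return problems
```

An empty return list means the config matches the working examples above; anything else names the field to fix.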

Oh, and maybe run the prompt with administrator privileges, though I don't think that would matter.

I'm sorry you're still having issues getting LM Studio to connect. I'll try to be as specific as possible about what worked for me. This is what I did, you...