Anton Solbjørg
@qianxinxuexi Is your python/Scripts folder in PATH?
@ericrallen This should be solved with LM Studio; I believe it has OpenCL support. This question was asked back when we used Ooba, so it is no longer relevant.
https://github.com/KillianLucas/open-interpreter/issues/393#issuecomment-1827374006
@hvacking Using Python, you can read `interpreter.messages`; I believe it contains what you want.
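As a minimal sketch of what reading the history could look like: this assumes `interpreter.messages` is a list of dicts with `"role"` and `"content"` keys (the exact format may differ between versions), and uses a hypothetical sample list in place of the real attribute.

```python
# Sketch: inspecting Open Interpreter's conversation history.
# Assumption: interpreter.messages is a list of dicts with "role" and
# "content" keys; the real schema may vary between versions.

def last_assistant_reply(messages):
    """Return the content of the most recent assistant message, or None."""
    for message in reversed(messages):
        if message.get("role") == "assistant":
            return message.get("content")
    return None

# Hypothetical history standing in for interpreter.messages:
history = [
    {"role": "user", "content": "List the files in the current folder."},
    {"role": "assistant", "content": "The folder contains main.py and README.md."},
]
print(last_assistant_reply(history))
```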
@ericrallen We can close this one too; this is handled by LM Studio.
When you join LM Studio's Discord server, you get access to the Linux beta download.
I'm not sure; I don't think so. You can use other programs to serve LLMs (llama.cpp, Ollama, etc.) and then connect Interpreter to the correct port: `interpreter --api_base http://localhost:port/v1`
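As a rough sketch with Ollama: this assumes Ollama's default port 11434 and its OpenAI-compatible `/v1` endpoint, and uses `llama2` as a placeholder model name; flag names may differ between Interpreter versions.

```shell
# Sketch: serve a model locally with Ollama, then point Interpreter at it.
# Assumptions: Ollama's default port is 11434; "llama2" is a placeholder
# for whichever model you have pulled.
ollama serve &
ollama pull llama2

# Connect Open Interpreter to the local OpenAI-compatible endpoint.
interpreter --api_base http://localhost:11434/v1 --model ollama/llama2
```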
@ericrallen It seems people are having problems connecting to [textgen-web-ui](https://github.com/oobabooga/text-generation-webui): https://discord.com/channels/1146610656779440188/1146610657224040451/1177381909467254854
@KillianLucas
@xinghao-1210 Can you share the config/command you used to run Interpreter?