open-interpreter
Open Interpreter: access LM Studio remotely
Is your feature request related to a problem? Please describe.
Can I access this LM Studio API remotely over my LAN from my workstation?
Describe the solution you'd like
I have LM Studio installed and running successfully on a virtual machine. I also installed Open Interpreter on the same VM, and it successfully talks to the locally running LM Studio model. My question is: can I access this LM Studio API remotely over my LAN from my workstation? Any insight appreciated!
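For reference, one way to confirm the LM Studio API is reachable from another machine is to query its OpenAI-compatible /v1/models endpoint from the workstation. A minimal sketch, assuming the server is configured to listen on the LAN (not only localhost) and uses LM Studio's default port 1234; the IP address below is a placeholder for the VM's LAN address:

```python
# Reachability check run from the workstation, not from the VM.
import json
import urllib.request

VM_IP = "192.168.1.50"   # placeholder: LAN address of the VM running LM Studio
PORT = 1234              # LM Studio's default server port (assumption)

url = f"http://{VM_IP}:{PORT}/v1/models"
with urllib.request.urlopen(url, timeout=5) as resp:
    models = json.load(resp)

# If this prints the loaded model(s), the OpenAI-compatible API is reachable
# over the LAN, and Open Interpreter can be pointed at the same base URL.
print(json.dumps(models, indent=2))
```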
Describe alternatives you've considered
No response
Additional context
No response
You can use api_base:
interpreter --api_base http://ip_address:port/v1
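The same setting can also be applied programmatically when Open Interpreter is used as a Python library instead of the CLI. A minimal sketch, assuming a recent release where the LLM settings live under interpreter.llm (older versions expose api_base directly on the interpreter object), with a placeholder LAN address and LM Studio's default port 1234:

```python
from interpreter import interpreter

# Point Open Interpreter at the LM Studio server running on the VM.
# Replace the address with the VM's LAN IP and the port shown in LM Studio's server tab.
interpreter.llm.api_base = "http://192.168.1.50:1234/v1"
interpreter.llm.api_key = "lm-studio"            # LM Studio ignores the key, but a value must be set
interpreter.llm.model = "openai/local-model"     # "openai/" prefix sends OpenAI-compatible requests

interpreter.chat("List the files in the current directory and summarize them.")
```

From the CLI, the --api_base flag above does the same thing.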
How do the results compare to gpt-4-v?
How did you install LM Studio on a virtual machine? I'm trying it and I'm getting a JavaScript error that is related to an AVX2 problem.
@cestort Were you able to run LM Studio on the VM? Can you tell me more about it?
No. The problem is that a Type 1 hypervisor is needed, and desktop hypervisors such as VirtualBox or Hyper-V are Type 2. AVX2 support is mandatory, and a Type 2 hypervisor doesn't expose it in the virtualized CPU. A Type 1 hypervisor uses the hardware directly, so it should work.
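As a quick sanity check, you can look at whether the guest CPU actually advertises AVX2. A minimal sketch for a Linux guest (Windows guests would need a tool such as CPU-Z instead, since /proc/cpuinfo does not exist there):

```python
# Check whether the virtualized CPU exposes the AVX2 flag (Linux guest only).
with open("/proc/cpuinfo") as f:
    flags = f.read()

if "avx2" in flags:
    print("AVX2 is exposed to the guest; LM Studio's CPU requirement is met.")
else:
    print("AVX2 is not exposed to the guest; LM Studio will fail on this VM.")
```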