open-interpreter
How to use a different port than 1234 connecting to LM Studio?
Is your feature request related to a problem? Please describe.
Port 1234, the default port for LM Studio, may already be in use by other applications. LM Studio allows the use of any port. However, open-interpreter has no documented option to use another port and silently assumes 1234.
Describe the solution you'd like
A config or command line option to use the port LM Studio uses.
Describe alternatives you've considered
No response
Additional context
No response
@caryknoop Change PORT and MODEL to whatever you want
interpreter -ab http://localhost:PORT/v1 --model openai/MODEL
Further documentation is here: https://docs.openinterpreter.com/usage/terminal/arguments
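For example, if LM Studio's local server were listening on port 5678 (a hypothetical port here; use whatever LM Studio's server tab actually shows), the invocation would look like:
interpreter -ab http://localhost:5678/v1 --model openai/local-model
The name after openai/ is a placeholder; as noted further down in this thread, it is not validated against LM Studio.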
I think an example that shows a custom port as part of the base URL would clear up future confusion about the -ab option.
@caryknoop https://docs.openinterpreter.com/usage/terminal/arguments#api-base-or-ab
You are correct: the documentation is lacking, and following it leads to errors once you step away from the default settings. For now,
interpreter -ab http://localhost:PORT/v1 --model openai/MODEL
should get you up and running. You don't need to specify the actual MODEL; you can literally put anything there. The openai/ prefix is only needed to work around a different bug.
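If you are scripting rather than using the CLI, the same settings can be applied through the Python API. This is a minimal sketch assuming the 0.2-style interface (interpreter.llm.*; older releases exposed interpreter.api_base directly), again with a hypothetical port 5678:

from interpreter import interpreter

# Point the OpenAI-compatible client at LM Studio's local server.
# 5678 is a hypothetical port; match whatever LM Studio is configured to use.
interpreter.llm.api_base = "http://localhost:5678/v1"

# The part after "openai/" is a placeholder; LM Studio serves whichever
# model is currently loaded, so the name here is not checked against it.
interpreter.llm.model = "openai/local-model"

# Some versions require a dummy key for the openai provider.
interpreter.llm.api_key = "fake_key"

interpreter.chat("Summarize the files in the current directory.")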
Hey there, @caryknoop!
Feel encouraged to submit a Pull Request to clear up any confusion you notice in the docs.
Closing this stale issue. Please create a new issue if the problem is not resolved or explained in the documentation. Thanks!