When using LM Studio, the user is prompted to enter their OpenAI API key even though this value isn't needed
Describe the bug
The change to --local mode now requires users to launch OI with interpreter --api_base http://localhost:1234/v1 if they want to use LM Studio. When they run this command, they are prompted to enter their OpenAI API key. Pressing Enter without typing any value allows OI to work with LM Studio successfully. However, the user should not be prompted for an OpenAI API key at all when they are using a local model.
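For reference, the same setup can be expressed through OI's Python API. This is a minimal sketch assuming the 0.2.x interpreter.llm attributes; the "dummy" key value is just a placeholder that mirrors pressing Enter at the prompt:

```python
from interpreter import interpreter

# Point OI at LM Studio's local server instead of the OpenAI API.
interpreter.llm.api_base = "http://localhost:1234/v1"

# Placeholder value; LM Studio ignores it, which is why pressing Enter at the
# CLI prompt also works. The point of this issue is that no key (and no prompt)
# should be required at all when api_base points to a local server.
interpreter.llm.api_key = "dummy"

interpreter.chat("Hello from LM Studio")
```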
Reproduce
- Disable the OPENAI_API_KEY environment variable
- Run interpreter --api_base http://localhost:1234/v1
- Observe OI asking for an OpenAI API key
Expected behavior
Running interpreter --api_base http://localhost:1234/v1 should work without prompting the user for their OpenAI API key.
Screenshots
Open Interpreter version
0.2.2
Python version
3.11.3
Operating System name and version
macOS 14.3
Additional context
No response
I want to take a stab at this
It's all yours @lorenzejay ! Thanks for stepping up 💪
Let us know if you need a hand with anything
So these are my thoughts on this:
If interpreter.llm.api_base is set, we can assume the user won't need an OpenAI API key, so in validate_llm_settings we can skip prompting for it (rough sketch below).
What's the expected behavior when you don't use --api_base, @MikeBirdTech?
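A rough sketch of that idea, assuming a simplified validate_llm_settings; the surrounding structure, the "dummy-key" value, and the prompt wording are placeholders, not the actual source:

```python
import os


def validate_llm_settings(interpreter):
    # ... other existing validation would stay as-is ...

    # Proposed change: if the user supplied a custom api_base (e.g. LM Studio
    # at http://localhost:1234/v1), assume no OpenAI key is needed and skip
    # the prompt entirely.
    if interpreter.llm.api_base:
        if not interpreter.llm.api_key:
            interpreter.llm.api_key = "dummy-key"  # satisfies clients that expect some key
        return

    # Otherwise, keep today's behavior: require an OpenAI API key, prompting
    # only if neither the attribute nor the environment variable is set.
    if not interpreter.llm.api_key and not os.environ.get("OPENAI_API_KEY"):
        interpreter.llm.api_key = input("Enter your OpenAI API key: ")
```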