01
Cannot select a model when choosing LM Studio
Describe the bug I can't select a different local model, e.g. "LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF", when I choose LM Studio on the CLI at start time; it always defaults to "gpt-4".
I tried passing the model via a CLI flag, but the flag does not seem to be respected.
I think an option needs to be added here: https://github.com/OpenInterpreter/01/blob/main/software/source/server/utils/local_mode.py
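For context, one possible fix is to thread the CLI's model choice through to the local-mode setup instead of hardcoding "gpt-4". The sketch below is hypothetical: the flag names, defaults, and `parse_model_args` helper are my assumptions for illustration, not the project's actual CLI code.

```python
import argparse

def parse_model_args(argv=None):
    """Parse a hypothetical --model flag, falling back to "gpt-4"
    only when the user specifies nothing. Sketch only; not the
    real 01 CLI implementation."""
    parser = argparse.ArgumentParser(prog="01")
    parser.add_argument("--local", action="store_true",
                        help="run against a local backend such as LM Studio")
    parser.add_argument("--model", default="gpt-4",
                        help="model identifier to request from the backend")
    return parser.parse_args(argv)

# The model chosen here would then be passed down to local_mode.py
# rather than being overridden by a hardcoded default.
args = parse_model_args(
    ["--local", "--model", "LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF"]
)
print(args.model)
```

With this shape, `poetry run 01 --local --model <name>` would carry the user's choice through to whatever request is sent to LM Studio's server.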
To Reproduce Steps to reproduce the behavior:
- Start with this command: poetry run 01 --local --model "LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF"
- Choose LM Studio
- Hit space to record.
- LM Studio reports an error saying that model "gpt-4" is not available
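A defensive variant of the fix could also check the requested model against what the local server actually has loaded before falling back. The helper below is hypothetical (not code from the 01 repo); it operates on a plain list of model IDs, such as one obtained from LM Studio's OpenAI-compatible model listing.

```python
def select_model(requested, available, fallback="gpt-4"):
    """Return `requested` if the local server offers it; otherwise
    warn and fall back. Hypothetical sketch, not 01's actual logic."""
    if requested in available:
        return requested
    print(f"warning: model {requested!r} is not loaded; "
          f"falling back to {fallback!r}")
    return fallback

# Example: the user's requested GGUF model is loaded locally.
loaded = ["LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF"]
chosen = select_model(
    "LM Studio Community/Meta-Llama-3-8B-Instruct-GGUF", loaded
)
print(chosen)
```

Failing loudly (or at least warning) like this would make the current behavior, where "gpt-4" is silently requested, much easier to diagnose.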
Expected behavior I should be able to select a model interactively, or specify one on the command line and have it respected.
Desktop (please complete the following information): Mac (M2)