                        interpreter --os --model local
Describe the bug
not working like the interpreter local does
Reproduce
not sure
What did you run?
interpreter --os --model local
Expected behavior
should work
Screenshots
No response
Open Interpreter version
.2
Python version
.01
Operating System name and version
14
Additional context
No response
Windows? Linux? macOS? Which version?
We need this information in order to help. I do believe the --os mode is OpenAI-only for now... @KillianLucas can you confirm?
You did not provide an error log either; that would be extremely helpful in finding out where the error occurred.
--model needs a provider and a model name in order to work properly, e.g. --model openai/local
I suspect that's where you ran into the error first...
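For example, a command along these lines should point it at a local server (this assumes an OpenAI-compatible server such as LM Studio listening on localhost:1234; the URL, port, and dummy key are placeholders for your own setup):
# assumes a local OpenAI-compatible server; adjust api_base to match your setup
interpreter --model openai/local --api_base http://localhost:1234/v1 --api_key dummy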
I believe this project is awesome, but it promises things that won't work without a decent request router that knows which request to send to which model (as GPT-4's MoE architecture does). For this 0.2.0 update it would be imperative to have at least a coding AND a vision model, or the promise simply won't hold.
Maybe it would be a good idea to explain this in the readme? I think that would be fair, no?
@MikeBirdTech why is this closed, and why are other issues closed without any changes to either docs or code? Example: #1085
The "running locally" docs are extremely poor imo: https://docs.openinterpreter.com/guides/running-locally
What would you want to add to it?
That it's not possible at all? Only GPT-4 can currently be used for the functions you promote in your readme... Otherwise, point to demo code / instructions showing how to use local models.
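For anyone else landing here, a minimal sketch of what such demo code could look like via the Python API (the llm attribute names follow the 0.2.x docs; the server URL, model name, and dummy key are assumptions to be replaced with your own local setup):

from interpreter import interpreter

# Point Open Interpreter at a local OpenAI-compatible server (placeholder URL/port).
interpreter.llm.model = "openai/local"                 # LiteLLM provider/model format
interpreter.llm.api_base = "http://localhost:1234/v1"  # your local server's endpoint
interpreter.llm.api_key = "dummy"                      # many local servers ignore the key

interpreter.chat("Plot the first 20 Fibonacci numbers.")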