
interpreter --os --model local

jmanhype opened this issue 1 year ago · 2 comments

Describe the bug

Not working the way `interpreter local` does.

Reproduce

Not sure how to reproduce.

Expected behavior

It should work.

Screenshots

No response

Open Interpreter version

.2

Python version

.01

Operating System name and version

14

Additional context

No response

jmanhype avatar Jan 06 '24 10:01 jmanhype

What did you run?

interpreter --os --model local

Open Interpreter version

Run interpreter --version and paste the output here.

Python version

Run python --version and paste the output here.

Operating System name and version

Windows? Linux? macOS? Which version?

We need this information in order to help. I believe the --os mode is OpenAI-only for now; @KillianLucas, can you confirm? You also did not provide an error log, which would be extremely helpful in finding out where the error occurred. Note that --model needs a provider and a model name in order to work properly, e.g. --model openai/local.

I suspect that's where you first ran into the error...

Notnaton avatar Jan 06 '24 12:01 Notnaton
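As a rough illustration of the provider/model format Notnaton describes: the --model value is a single string in provider/model form, and the specific model names below (e.g. ollama/codellama) are hypothetical examples, not configurations confirmed anywhere in this thread.

```shell
# Hypothetical invocations showing the provider/model format
# (model names are illustrative, not verified for any particular setup):
#   interpreter --os --model openai/gpt-4
#   interpreter --model ollama/codellama

# The string splits into a provider and a model name at the first slash:
model="openai/local"
provider="${model%%/*}"   # text before the first slash
name="${model#*/}"        # text after the first slash
echo "$provider $name"
```

Passing a bare `local` with no provider prefix would leave the provider part ambiguous, which is consistent with Notnaton's guess about where the error came from.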

I believe this project is awesome, but it promises things that won't work without a decent request router that knows which request to send to which model (as GPT-4's MoE architecture does). For the 0.2.0 update it would be imperative to have at least a coding model AND a vision model, or the promise simply won't hold.

Maybe it would be a good idea to explain this in the README? I think that would be fair, no?

Morriz avatar Jan 08 '24 22:01 Morriz

@MikeBirdTech why is this closed, and why are other issues closed without any changes to either the docs or the code? Example: #1085

Morriz avatar Mar 31 '24 11:03 Morriz

The "running locally" docs are extremely poor, IMO: https://docs.openinterpreter.com/guides/running-locally.

Morriz avatar Mar 31 '24 11:03 Morriz

What would you want to add to it?

Notnaton avatar Apr 02 '24 20:04 Notnaton

That it's not possible at all? Only GPT-4 can currently be used for the functions you promote in your README. Otherwise, point to demo code / instructions showing how to use local models.

Morriz avatar Apr 04 '24 20:04 Morriz
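For reference, a hedged sketch of what such local-model instructions might look like. The flags (--api_base, --api_key) and the localhost port are assumptions based on the 0.2.x-era CLI and an OpenAI-compatible local server (e.g. LM Studio); they are not confirmed by this thread and should be checked against the current docs.

```shell
# Hypothetical: point Open Interpreter at an OpenAI-compatible local
# server. Flag names and the port are assumptions, not verified
# against any particular installed version.
cmd="interpreter --api_base http://localhost:1234/v1 --api_key dummy --model openai/local"
echo "$cmd"
```

If something like this is what the docs intend, spelling it out verbatim (with the server setup steps) would address the complaint above.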