
🌋 LLaVA: Large Language and Vision Assistant support

Open · ilteris opened this issue 1 year ago · 4 comments

Is your feature request related to a problem? Please describe.

OpenAI is cool but it's also very expensive.

Describe the solution you'd like

LLaVA could be a great candidate as an alternative to GPT-4V: https://huggingface.co/mys/ggml_llava-v1.5-7b. I was able to load it through LM Studio, but unfortunately it crashes, and it will require more work.

Describe alternatives you've considered

No response

Additional context

No response

ilteris avatar Nov 11 '23 01:11 ilteris

Hey there, @ilteris!

Do you have any more details about what you’re asking for from the project/community here?

Are you saying the model crashes when you try to run it in general or that it specifically crashes when you try to use it with Open Interpreter?

ericrallen avatar Nov 11 '23 13:11 ericrallen

Thank you for the response, @ericrallen. I am trying to run this local vision model and get Open Interpreter to interpret its output and turn it into code. I'm starting very basic: just understanding what the image is first.

Here's my prompt and output: [screenshot: Screenshot 2023-11-11 at 11 26 42 AM]

Output:

```
Traceback (most recent call last):
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/respond.py", line 49, in respond
    for chunk in interpreter._llm(messages_for_llm):
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/llm/convert_to_coding_llm.py", line 65, in coding_llm
    for chunk in text_llm(messages):
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/llm/setup_text_llm.py", line 130, in base_llm
    return openai.ChatCompletion.create(**params)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/openai/api_requestor.py", line 428, in handle_error_response
    error_code=error_data.get("code"),
AttributeError: 'str' object has no attribute 'get'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/ilteris/.venv/bin/interpreter", line 8, in <module>
    sys.exit(cli())
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/core.py", line 24, in cli
    cli(self)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/cli/cli.py", line 268, in cli
    interpreter.chat()
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/core.py", line 86, in chat
    for _ in self._streaming_chat(message=message, display=display):
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/core.py", line 106, in _streaming_chat
    yield from terminal_interface(self, message)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/terminal_interface/terminal_interface.py", line 115, in terminal_interface
    for chunk in interpreter.chat(message, display=False, stream=True):
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/core.py", line 127, in _streaming_chat
    yield from self._respond()
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/core.py", line 162, in _respond
    yield from respond(self)
  File "/Users/ilteris/.venv/lib/python3.11/site-packages/interpreter/core/respond.py", line 105, in respond
    raise Exception(
Exception: 'str' object has no attribute 'get'

Please make sure LM Studio's local server is running by following the steps above.

If LM Studio's local server is running, please try a language model with a different architecture.
```
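Reading the bottom frame of the first traceback: the legacy `openai` client calls `error_data.get("code")`, assuming the server's `error` field is a dict. If a local server like LM Studio returns that field as a plain string, the call raises the `AttributeError` seen above. A minimal sketch of the mismatch (the helper name is hypothetical, not part of the `openai` client):

```python
def extract_error_code(error_data):
    """Tolerant version of the lookup the legacy openai client performs.

    The client assumes `error_data` is a dict and calls error_data.get("code");
    a bare string payload triggers: AttributeError: 'str' object has no attribute 'get'.
    """
    if isinstance(error_data, dict):
        return error_data.get("code")
    # A string payload carries no structured code; return None instead of crashing.
    return None

# A dict-shaped error (what the client expects) yields its code:
print(extract_error_code({"code": "model_not_found", "message": "..."}))
# A string payload (what a local server may send) no longer raises:
print(extract_error_code("model not loaded"))
```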

ilteris avatar Nov 11 '23 16:11 ilteris

I am running LM Studio with the https://huggingface.co/mys/ggml_llava-v1.5-7b/resolve/main/ggml-model-q5_k.gguf model
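For what it's worth, when testing the model directly (outside Open Interpreter), an OpenAI-style vision request body is usually shaped like the sketch below. Whether LM Studio's local server accepts this image format is an assumption, and `llava-v1.5-7b` is only a placeholder model name:

```python
import base64
import json

def build_vision_payload(prompt, image_bytes, model="llava-v1.5-7b"):
    """Assemble an OpenAI-style chat payload with an inline base64 image.

    The list-of-parts message content follows OpenAI's GPT-4V request format;
    whether a given local server accepts it is an assumption.
    """
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {
                        "type": "image_url",
                        "image_url": {"url": "data:image/png;base64," + b64},
                    },
                ],
            }
        ],
    }

# POST the JSON body to the local server's /v1/chat/completions endpoint:
payload = build_vision_payload("What is in this image?", b"\x89PNG...")
print(json.dumps(payload)[:80])
```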

ilteris avatar Nov 11 '23 16:11 ilteris

Just to double-check, this prompt works when sent to the model directly without Open Interpreter?

ericrallen avatar Nov 12 '23 13:11 ericrallen

Closing this stale issue. Please create a new issue if the problem is not resolved or explained in the documentation. Thanks!

MikeBirdTech avatar Mar 18 '24 20:03 MikeBirdTech