
Open Interpreter does not work on ollama and local llm models

Open robik72 opened this issue 1 year ago • 6 comments

Describe the bug

I am using Open Interpreter on a macOS 12.7.3 notebook. After installing Ollama and downloading tinyllama and phi, I launched it with the `--model` flag. I can converse with the LLM, but it cannot execute any command.

Reproduce

Run `interpreter -y --model ollama/tinyllama --context_window 3000`. When the interpreter starts, use the example shown in the Open Interpreter demo video: "Can you set my system to dark mode?" It will try to generate an example script, then conclude with "Error: OpenInterpretor not found."
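For completeness, the reproduce steps above can be sketched as a shell session (this assumes Ollama and Open Interpreter 0.2.2 are already installed and the Ollama server is running):

```shell
# Download the small models the report uses
ollama pull tinyllama
ollama pull phi

# Launch Open Interpreter against the local model, auto-approving code (-y)
# and capping the context window, exactly as in the reproduce step above
interpreter -y --model ollama/tinyllama --context_window 3000
```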

Expected behavior

It should work the same way as when using OpenAI with an API key.

Screenshots

Screenshot 2024-03-16 at 18:52:46 (attached)

Open Interpreter version

0.2.2

Python version

3.10.13

Operating System name and version

macOS 12.7.3

Additional context

No response

robik72 avatar Mar 16 '24 18:03 robik72

Hi @robik72, sorry to hear that you're having issues. Are you able to try a more powerful model? The tiny models are very limited.

MikeBirdTech avatar Mar 17 '24 01:03 MikeBirdTech

Are there any "minimum requirements"? For instance, Mistral 7B? I would like to try it on a more powerful Windows 11 PC with a better model, but the installation there did not work; I have reported that too.


robik72 avatar Mar 17 '24 08:03 robik72

There are no official minimum requirements for local LLMs, but that sounds like something that would help people understand what performance to expect from a given model. Mistral 7B is popular, as is Mixtral. I've had success with OpenCodeInterpreter and Nous Hermes 2 Pro.
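As a hedged sketch, swapping in one of the stronger models suggested above only changes the model flag (exact model tags in the Ollama library may differ; check `ollama list` or the Ollama site for the current names):

```shell
# Mistral 7B, one of the models suggested above
ollama pull mistral

# Same invocation as the original report, pointed at the larger model
interpreter -y --model ollama/mistral --context_window 3000
```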

MikeBirdTech avatar Mar 17 '24 14:03 MikeBirdTech

@MikeBirdTech @KillianLucas The new Nous Hermes 2 Pro 7B really nails function calling. I could see Interpreter using it to save tokens on basic tasks. How to use its function calling is documented here: https://github.com/NousResearch/Hermes-Function-Calling
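As a rough illustration of the format that repo documents, Hermes 2 Pro emits tool calls as JSON wrapped in `<tool_call>` tags. A minimal stdlib-only parser for such completions could look like the sketch below (the `run_shell` payload is a made-up example, not Open Interpreter's actual schema):

```python
import json
import re

# Hermes 2 Pro style models wrap each tool call in <tool_call> tags
# containing a JSON object (per the Hermes-Function-Calling repo above).
# This is an illustrative parser, not Open Interpreter's own code.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(.*?)\s*</tool_call>", re.DOTALL)

def extract_tool_calls(completion: str) -> list[dict]:
    """Pull every <tool_call> JSON payload out of a model completion."""
    return [json.loads(payload) for payload in TOOL_CALL_RE.findall(completion)]

# Hypothetical completion from a Hermes-style model:
reply = (
    'Sure, switching now.\n'
    '<tool_call>\n'
    '{"name": "run_shell", "arguments": {"command": "osascript -e ..."}}\n'
    '</tool_call>'
)
calls = extract_tool_calls(reply)
print(calls[0]["name"])  # prints "run_shell"
```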

mysticaltech avatar Mar 18 '24 16:03 mysticaltech

Thanks. Do you mean the Nous-Hermes2 11b available for download from Ollama?


robik72 avatar Mar 18 '24 16:03 robik72

Or rather the latest Hermes 2 Pro - Mistral 7B? https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B


robik72 avatar Mar 18 '24 17:03 robik72

> Or rather the latest Hermes 2 Pro - Mistral 7B? https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B

Yes this one! It's heavily trained on function calling.

mysticaltech avatar Mar 19 '24 01:03 mysticaltech

Lots of great discussions in our Discord about local models and which ones perform best. I invite you to join! https://discord.gg/Hvz9Axh84z

MikeBirdTech avatar Mar 19 '24 21:03 MikeBirdTech

@MikeBirdTech I do not know where that discussion is happening. Here's a great video on Hermes Pro function calling and how accurate it is. https://www.youtube.com/watch?v=ViXURxck-HM

mysticaltech avatar Mar 19 '24 23:03 mysticaltech

@mysticaltech Thank you for the video! I have been using Hermes Pro and I am very impressed so far!

MikeBirdTech avatar Mar 20 '24 01:03 MikeBirdTech