open-interpreter
When using ollama with the llama3:70b model, the code cannot run
Describe the bug
When using ollama with the llama3:70b model, the Python code comes back with a "`" (backtick), which causes the code to fail to execute.
Reproduce
- Run the interpreter with this command: interpreter --model ollama/llama3:70b
- Enter the prompt: please analyze d:\test\dumpstack.log (the same run is sketched in Python below)
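For reference, the same run can also be reproduced from Python. This is only a minimal sketch, assuming the 0.2.x Python API; the CLI command above is the primary repro:

```python
# Minimal sketch of the same reproduction via the 0.2.x Python API (assumption).
from interpreter import interpreter

interpreter.llm.model = "ollama/llama3:70b"   # same model as the CLI flag
interpreter.offline = True                    # local model, no hosted API needed
interpreter.chat(r"please analyze d:\test\dumpstack.log")
```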
Expected behavior
It should analyze the file and give me some output.
Screenshots
Open Interpreter version
0.2.0
Python version
3.12.2
Operating System name and version
Windows 11
Additional context
No response
I'm also having this problem.
Same here, with the llama3 8b model. I do get the backtick, and I also get random responses that don't solve the issue. Sometimes it responds to previous questions even though I ask for something else.
Same issue running llama3-8b on macOS 14.4 using LM Studio.
Same here; ollama/llama3-8b on macOS 13.5.2.
Same here -- it goes completely off the rails.
Update to version 0.2.5
pip install --force-reinstall --upgrade open-interpreter
Seems to be working now after the reinstall. Thanks!
Tried it; the install mostly worked, but I got these errors:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
gensim 4.3.0 requires FuzzyTM>=0.4.0, which is not installed.
streamlit 1.30.0 requires packaging<24,>=16.8, but you have packaging 24.0 which is incompatible.
mdit-py-plugins 0.3.0 requires markdown-it-py<3.0.0,>=1.0.0, but you have markdown-it-py 3.0.0 which is incompatible.
faster-whisper 0.10.0 requires tokenizers<0.16,>=0.13, but you have tokenizers 0.19.1 which is incompatible.
botocore 1.31.64 requires urllib3<2.1,>=1.25.4; python_version >= "3.10", but you have urllib3 2.2.1 which is incompatible.
python-lsp-server 1.7.2 requires jedi<0.19.0,>=0.17.2, but you have jedi 0.19.1 which is incompatible.
conda-repo-cli 1.0.75 requires clyent==1.2.1, but you have clyent 1.2.2 which is incompatible.
conda-repo-cli 1.0.75 requires python-dateutil==2.8.2, but you have python-dateutil 2.9.0.post0 which is incompatible.
s3fs 2023.10.0 requires fsspec==2023.10.0, but you have fsspec 2024.3.1 which is incompatible.
transformers 4.39.3 requires tokenizers<0.19,>=0.14, but you have tokenizers 0.19.1 which is incompatible.
spyder 5.4.3 requires jedi<0.19.0,>=0.17.2, but you have jedi 0.19.1 which is incompatible.
ane-transformers 0.1.1 requires protobuf<=3.20.1,>=3.1.0, but you have protobuf 3.20.3 which is incompatible.
Will it work? lol
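Those conflicts come from packages that were already in the Anaconda base environment (gensim, streamlit, spyder, ...) pinning older versions than the ones the upgrade pulled in, not from open-interpreter itself, so it will usually still run. To avoid the conflicts entirely, one option is a clean virtual environment; a minimal sketch (paths are illustrative, and on Windows the pip executable is .oi-venv\Scripts\pip.exe):

```python
# Sketch: create an isolated environment so open-interpreter's dependencies
# don't fight with the packages pinned in the Anaconda base environment.
import subprocess
import venv

venv.EnvBuilder(with_pip=True).create(".oi-venv")   # create the environment
pip = ".oi-venv/bin/pip"                            # .oi-venv\Scripts\pip.exe on Windows
subprocess.run([pip, "install", "--upgrade", "open-interpreter"], check=True)
```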
Also still getting this:
/opt/anaconda3/lib/python3.11/site-packages/pydantic/_internal/_fields.py:160: UserWarning: Field "model_name" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
/opt/anaconda3/lib/python3.11/site-packages/pydantic/_internal/_fields.py:160: UserWarning: Field "model_info" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
(although got this before and still works)
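Those pydantic warnings are non-fatal; they come from a dependency defining fields named model_name / model_info, which clash with pydantic v2's protected "model_" namespace. For illustration only, this is what the suggested `model_config['protected_namespaces'] = ()` setting looks like in pydantic v2 (ExampleInfo is a made-up model, not code from open-interpreter or its dependencies):

```python
# Illustration only: silencing the protected-namespace warning in pydantic v2.
from pydantic import BaseModel, ConfigDict

class ExampleInfo(BaseModel):
    # Allow field names starting with "model_" without the UserWarning.
    model_config = ConfigDict(protected_namespaces=())

    model_name: str
    model_info: dict = {}

print(ExampleInfo(model_name="llama3:70b").model_name)
```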
Re https://github.com/OpenInterpreter/open-interpreter/issues/1220#issuecomment-2079949900
So when I try with various models, I get odd things happening: it's almost narrating its thoughts, or just coming back with odd stuff:
> ls
Plan Recap: The user needs help setting up a repository. We previously created a new directory and initialized it as a
Git repository. Now, we will list the contents of the directory to verify that everything is set up correctly. Current
State: We are now listing the contents of the directory.
`
ls
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
Cell In[3], line 2
1 print('##active_line1##')
----> 2 ls
NameError: name 'ls' is not defined
> ##ls
Plan Recap: The user wants to list the contents of the current directory. We will use the os module to execute the ls
command. Current State: We are now listing the contents of the directory.
`
import os
print(os.listdir())
['awesome-openai-vision-api-experiments', 'gpt-video', 'HeyGPT-AI-Voice-Companion', 'personal-ai', '.DS_Store',
'gpt-pilot-backup-0-1-10-14a2d3ef', 'vision.py', 'OpenVoice', 'macOSpilot-ai-assistant', '__pycache__', 'whisper.cpp',
'oobabooga', 'MagicLLight', 'rem', '01', 'gpt-pilot', 'bot', 'testvision.py', '.vscode',
'LM-Studio-Voice-Conversation', '~']
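The stray "`" line is the same symptom as in the original report: the model leaks a Markdown code fence into its answer, and the code it emits then fails (here a bare ls executed as Python, hence the NameError). Below is a rough sketch of the kind of sanitization that strips leaked fence characters before execution; this is an illustration only, not open-interpreter's actual fix:

```python
# Rough illustration (not open-interpreter's actual code): drop lines that are
# nothing but leaked Markdown fence/backtick characters before running code.
def strip_stray_fences(code: str) -> str:
    kept = [line for line in code.splitlines()
            if not line.lstrip().startswith("`")]   # skip "`", "``` python", ...
    return "\n".join(kept)

print(strip_stray_fences("`\nimport os\nprint(os.listdir())"))
```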
The latest thing I get now is that it always responds with "a plan", so:
> hello there
Hello, Open Interpreter here. I'll follow the plan as described:
1 Greet the user.
Plan: Send a message to greet the user.
print("Hello there!
> how are you today?
I'm just a program, Open Interpreter, but I'll do my best to help you out today! Let's get started with your task.
Plan: Get the user's current date and time for reference.
>
I did an update to 0.2.5, but I think there have been two updates to that, so how do I know I've got the right one?
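You can check which version actually got installed with `pip show open-interpreter`, or from Python:

```python
# Print the installed open-interpreter version (should be 0.2.5 after the upgrade).
from importlib.metadata import version

print(version("open-interpreter"))
```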
Same issue running llama3 on Windows, but the update to 0.2.5 solved the problem: pip install --force-reinstall --upgrade open-interpreter