Can't use with vLLM
Is your feature request related to a problem? Please describe.
When running my model with vLLM:

```
interpreter --api_base https://xxxxxxxxxxxx-8000.proxy.runpod.net/v1 --model casperhansen/mixtral-instruct-awq
```

I get this error when I execute a command:

```
System Message: You are Open Interpreter, a world-class programmer that can complete any goal by executing code. First, write a plan. Always recap the plan between each code block (you have extreme short-term memory loss, so you need to recap the plan between each message block to retain it). When you execute code, it will be executed on the user's machine. The user has given you full and complete permission to execute any code necessary to complete the task. Execute the code. If you want to send data between programming languages, save the data to a txt or json. You can access the internet. Run any code to achieve the goal, and if at first you don't succeed, try again and again. You can install new packages. When a user refers to a filename, they're likely referring to an existing file in the directory you're currently executing code in. Write messages to the user in Markdown. In general, try to make plans with as few steps as possible. As for actually executing code to carry out that plan, for stateful languages (like python, javascript, shell, but NOT for html which starts from 0 every time) it's critical not to try to do everything in one code block. You should try something, print information about it, then continue from there in tiny, informed steps. You will never get it on the first try, and attempting it in one go will often lead to errors you cant see. You are capable of any task.

{'role': 'user', 'type': 'message', 'content': 'list all files in root folder'}

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/interpreter/core/llm/llm.py", line 221, in fixed_litellm_completions
    yield from litellm.completion(**params)
TypeError: 'NoneType' object is not iterable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/interpreter", line 8, in
```
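For reference, here is a minimal sketch of the Python-side equivalent of what I'm trying to do. I'm assuming a recent release where the LLM settings live under `interpreter.llm` (older versions set `interpreter.api_base` and `interpreter.model` directly), and the `api_key` value is just a placeholder since the vLLM server isn't checking keys:

```python
# Minimal reproduction sketch via the Python API (assumes interpreter.llm settings exist).
from interpreter import interpreter

interpreter.llm.api_base = "https://xxxxxxxxxxxx-8000.proxy.runpod.net/v1"
interpreter.llm.api_key = "dummy"  # placeholder; the vLLM server isn't checking API keys here
interpreter.llm.model = "casperhansen/mixtral-instruct-awq"

# Same request that fails from the CLI
interpreter.chat("list all files in root folder")
```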
Describe the solution you'd like
I thought that because you support LM Studio you would also support vLLM, since it exposes an OpenAI-compatible API, but it seems you don't support it.
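To show what I mean by OpenAI-compatible: the same server can be queried directly with the standard `openai` client, along these lines (a sketch assuming the `openai>=1.0` interface; the placeholder key is only there because the client requires one):

```python
# Sketch: querying the vLLM OpenAI-compatible endpoint directly.
from openai import OpenAI

client = OpenAI(
    base_url="https://xxxxxxxxxxxx-8000.proxy.runpod.net/v1",
    api_key="dummy",  # vLLM accepts any key unless the server was started with --api-key
)

response = client.chat.completions.create(
    model="casperhansen/mixtral-instruct-awq",
    messages=[{"role": "user", "content": "list all files in root folder"}],
)
print(response.choices[0].message.content)
```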
Describe alternatives you've considered
It would be nice to add support for vLLM, as sketched below.
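Since the traceback points at `litellm.completion`, perhaps vLLM support mostly amounts to routing the model through LiteLLM's generic OpenAI-compatible provider. A rough sketch of the call I'd expect (the `openai/` prefix and the placeholder key are assumptions on my part):

```python
# Sketch: roughly the LiteLLM call an OpenAI-compatible vLLM server would need.
import litellm

response = litellm.completion(
    model="openai/casperhansen/mixtral-instruct-awq",  # "openai/" prefix -> generic OpenAI-compatible provider
    api_base="https://xxxxxxxxxxxx-8000.proxy.runpod.net/v1",
    api_key="dummy",  # placeholder
    messages=[{"role": "user", "content": "list all files in root folder"}],
)
print(response.choices[0].message.content)
```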
Additional context
No response
https://github.com/KillianLucas/open-interpreter/pull/955 will have a fix for this next release, I think.
Hi @myrulezzz, can you please try again with the latest version? Thanks!