
computer.browser.search is dead

Open • cfortune opened this issue 10 months ago • 5 comments

Describe the bug

All web searches freeze indefinitely.

Reproduce

Any web search.
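For illustration, a minimal sketch of the kind of call that hangs. This assumes the search goes through the `computer.browser.search` function named in the title and that it is reachable from the Python API as `interpreter.computer.browser.search`; the query string is arbitrary, any search shows the hang.

```python
# Minimal reproduction sketch (assumes Open Interpreter's Python API).
# In a normal session the model generates this call itself; invoking it
# directly shows the same behaviour: the call blocks and never returns.
from interpreter import interpreter

results = interpreter.computer.browser.search("frogs of the sahara")
print(results)  # never reached; the call hangs indefinitely
```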

Expected behavior

returns search results

Screenshots

No response

Open Interpreter version

0.2.4

Python version

3.11.5

Operating System name and version

Windows 10

Additional context

This started suddenly, sometime in the last two days or so. It was working fine before that.

cfortune • Apr 17 '24 21:04

Verbose output:

```
{'role': 'assistant', 'content': 'Sure, could you please tell me what specifically you would like me to search for?'}, {'role': 'user', 'content': 'web search frogs of the sahara'}]
Token Counter - using OpenAI token counter, for model=gpt-4
LiteLLM: Utils - Counting tokens for OpenAI model=gpt-4
Token Counter - using OpenAI token counter, for model=gpt-4
LiteLLM: Utils - Counting tokens for OpenAI model=gpt-4
Logging Details LiteLLM-Success Call
streaming complete
Looking up model=gpt-4 in model_cost_map
Success: model=gpt-4 in model_cost_map
prompt_tokens=750; completion_tokens=27
Returned custom cost for model=gpt-4 - prompt_tokens_cost_usd_dollar: 0.0225, completion_tokens_cost_usd_dollar: 0.0016200000000000001
final cost: 0.02412; prompt_tokens_cost_usd_dollar: 0.0225; completion_tokens_cost_usd_dollar: 0.0016200000000000001
```

It hangs here, indefinitely.

cfortune • Apr 17 '24 22:04

I have rewritten the system message on my side so that it does not use the computer module and only runs CLI or Python code upon a specific request such as "browse"; in that case the instruction is to write Python scraping code with libraries such as BeautifulSoup. All the tests I did with the computer module on 7B-parameter open-source LLMs fail due to poor system-message syntax. I will share my system message after I get confirmation on the submitted llama.cpp merge approval to main. Hint: change your system prompt, remove the 'execute' part, and leave the computer as an imaginary idea rather than a Python class. Tell Open Interpreter to output three backticks, then `CLI`, then two `\n`, then the Python code that performs the task, then another `\n` and three backticks. This constructs the placeholder for passing any $CLIcommand or $pycode_request, and the LLM will understand it and not produce errors. The final output of the execute markdown instruction is a fenced block of the form `CLI\n\n $your_input_if_its_CLI_Pycode \n`, and everything works smoothly and without errors. Even some 1.8B LLMs are able to function when the system message is clear. Who wants to send SMSs and call John, lol... it's a computer, not a phone... yet.
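For illustration, a rough sketch of how that kind of system message could be set from the Python API. The prompt wording is my own paraphrase of the format described above, and I am assuming `interpreter.system_message` is the right attribute to overwrite in your setup.

```python
# Sketch of the system-message idea above; not the project's official prompt.
# Assumes Open Interpreter's Python API and a writable `system_message` attribute.
from interpreter import interpreter

FENCE = "`" * 3  # three backticks, kept out of the string literal for readability

interpreter.system_message = (
    "You are a coding assistant. Treat the computer as an imaginary idea, not a "
    "Python class, and never call an `execute` function.\n"
    "When asked to perform a task, answer with exactly one fenced block:\n"
    f"{FENCE}CLI\n\n$your_input_if_its_CLI_Pycode\n{FENCE}\n"
    "Put the shell command or Python code inside the block and nothing else."
)

interpreter.chat("web search frogs of the sahara")
```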

6rz6 • Apr 19 '24 20:04

I had the same issue. Apparently it's been fixed already; run `--upgrade`.

legaltextai • Apr 24 '24 21:04

Same here, May 16 2024: `computer is not defined` (just after upgrading open-interpreter). Screenshot: Capture d’écran 2024-05-16 145808

onigetoc • May 16 '24 21:05

(screenshot attached; no text)

hossain666 • May 17 '24 06:05