
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

Open · yxl23 opened this issue 1 year ago · 10 comments

Describe the bug

(base) C:\Users\admin>interpreter --model ollama/llama2

▌ Model set to ollama/llama2

Open Interpreter will require approval before running code.

Use interpreter -y to bypass this.

Press CTRL-C to exit.

Help me create a file named my PDF on the C drive, and the contents of the file are bubbling code written in Python

We were unable to determine the context window of this model. Defaulting to 3000.

If your model can handle more, run interpreter --context_window {token limit} --max_tokens {max tokens per response}.

Continuing...

    Python Version: 3.11.5
    Pip Version: 23.2.1
    Open-interpreter Version: cmd:Interpreter, pkg: 0.2.0
    OS Version and Architecture: Windows-10-10.0.22631-SP0
    CPU Info: Intel64 Family 6 Model 165 Stepping 2, GenuineIntel
    RAM Info: 15.87 GB, used: 6.72, free: 9.16


    # Interpreter Info

    Vision: False
    Model: ollama/llama2
    Function calling: None
    Context window: None
    Max tokens: None

    Auto run: False
    API base: None
    Offline: False

    Curl output: Not local

    # Messages

    System Message: You are Open Interpreter, a world-class programmer that can complete any goal by executing code.

First, write a plan. Always recap the plan between each code block (you have extreme short-term memory loss, so you need to recap the plan between each message block to retain it). When you execute code, it will be executed on the user's machine. The user has given you full and complete permission to execute any code necessary to complete the task. Execute the code. If you want to send data between programming languages, save the data to a txt or json. You can access the internet. Run any code to achieve the goal, and if at first you don't succeed, try again and again. You can install new packages. When a user refers to a filename, they're likely referring to an existing file in the directory you're currently executing code in. Write messages to the user in Markdown. In general, try to make plans with as few steps as possible. As for actually executing code to carry out that plan, for stateful languages (like python, javascript, shell, but NOT for html which starts from 0 every time) it's critical not to try to do everything in one code block. You should try something, print information about it, then continue from there in tiny, informed steps. You will never get it on the first try, and attempting it in one go will often lead to errors you cant see. You are capable of any task.

    {'role': 'user', 'type': 'message', 'content': 'Help me create a file named my PDF on the C drive, and the contents of the file are bubbling code written in Python'}

    Traceback (most recent call last):
      File "D:\andconda\Lib\site-packages\httpcore\_exceptions.py", line 10, in map_exceptions
        yield
      File "D:\andconda\Lib\site-packages\httpcore\_backends\sync.py", line 206, in connect_tcp
        sock = socket.create_connection(
      File "D:\andconda\Lib\socket.py", line 851, in create_connection
        raise exceptions[0]
      File "D:\andconda\Lib\socket.py", line 836, in create_connection
        sock.connect(sa)
    ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "D:\andconda\Lib\site-packages\httpx\_transports\default.py", line 69, in map_httpcore_exceptions
        yield
      File "D:\andconda\Lib\site-packages\httpx\_transports\default.py", line 233, in handle_request
        resp = self._pool.handle_request(req)
      File "D:\andconda\Lib\site-packages\httpcore\_sync\connection_pool.py", line 268, in handle_request
        raise exc
      File "D:\andconda\Lib\site-packages\httpcore\_sync\connection_pool.py", line 251, in handle_request
        response = connection.handle_request(request)
      File "D:\andconda\Lib\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
        raise exc
      File "D:\andconda\Lib\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
        stream = self._connect(request)
      File "D:\andconda\Lib\site-packages\httpcore\_sync\connection.py", line 124, in _connect
        stream = self._network_backend.connect_tcp(**kwargs)
      File "D:\andconda\Lib\site-packages\httpcore\_backends\sync.py", line 205, in connect_tcp
        with map_exceptions(exc_map):
      File "D:\andconda\Lib\contextlib.py", line 155, in __exit__
        self.gen.throw(typ, value, traceback)
      File "D:\andconda\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
        raise to_exc(exc) from exc
    httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "<frozen runpy>", line 198, in _run_module_as_main
      File "<frozen runpy>", line 88, in _run_code
      File "D:\andconda\Scripts\interpreter.exe\__main__.py", line 7, in <module>
      File "D:\andconda\Lib\site-packages\interpreter\core\core.py", line 25, in start_terminal_interface
        start_terminal_interface(self)
      File "D:\andconda\Lib\site-packages\interpreter\terminal_interface\start_terminal_interface.py", line 684, in start_terminal_interface
        interpreter.chat()
      File "D:\andconda\Lib\site-packages\interpreter\core\core.py", line 86, in chat
        for _ in self._streaming_chat(message=message, display=display):
      File "D:\andconda\Lib\site-packages\interpreter\core\core.py", line 113, in _streaming_chat
        yield from terminal_interface(self, message)
      File "D:\andconda\Lib\site-packages\interpreter\terminal_interface\terminal_interface.py", line 135, in terminal_interface
        for chunk in interpreter.chat(message, display=False, stream=True):
      File "D:\andconda\Lib\site-packages\interpreter\core\core.py", line 148, in _streaming_chat
        yield from self._respond_and_store()
      File "D:\andconda\Lib\site-packages\interpreter\core\core.py", line 194, in _respond_and_store
        for chunk in respond(self):
      File "D:\andconda\Lib\site-packages\interpreter\core\respond.py", line 49, in respond
        for chunk in interpreter.llm.run(messages_for_llm):
      File "D:\andconda\Lib\site-packages\interpreter\core\llm\llm.py", line 193, in run
        yield from run_text_llm(self, params)
      File "D:\andconda\Lib\site-packages\interpreter\core\llm\run_text_llm.py", line 19, in run_text_llm
        for chunk in llm.completions(**params):
      File "D:\andconda\Lib\site-packages\interpreter\core\llm\llm.py", line 224, in fixed_litellm_completions
        raise first_error
      File "D:\andconda\Lib\site-packages\interpreter\core\llm\llm.py", line 205, in fixed_litellm_completions
        yield from litellm.completion(**params)
      File "D:\andconda\Lib\site-packages\litellm\llms\ollama.py", line 242, in ollama_completion_stream
        with httpx.stream(
      File "D:\andconda\Lib\contextlib.py", line 137, in __enter__
        return next(self.gen)
      File "D:\andconda\Lib\site-packages\httpx\_api.py", line 160, in stream
        with client.stream(
      File "D:\andconda\Lib\contextlib.py", line 137, in __enter__
        return next(self.gen)
      File "D:\andconda\Lib\site-packages\httpx\_client.py", line 870, in stream
        response = self.send(
      File "D:\andconda\Lib\site-packages\httpx\_client.py", line 914, in send
        response = self._send_handling_auth(
      File "D:\andconda\Lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
        response = self._send_handling_redirects(
      File "D:\andconda\Lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
        response = self._send_single_request(request)
      File "D:\andconda\Lib\site-packages\httpx\_client.py", line 1015, in _send_single_request
        response = transport.handle_request(request)
      File "D:\andconda\Lib\site-packages\httpx\_transports\default.py", line 232, in handle_request
        with map_httpcore_exceptions():
      File "D:\andconda\Lib\contextlib.py", line 155, in __exit__
        self.gen.throw(typ, value, traceback)
      File "D:\andconda\Lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
        raise mapped_exc(message) from exc
    httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it.

Reproduce

1. Run interpreter --model ollama/llama2
2. Enter the prompt: "Help me create a file named my PDF on the C drive, and the contents of the file are bubbling code written in Python"
3. The httpx.ConnectError: [WinError 10061] traceback shown above is raised.


Expected behavior

[screenshot]

Screenshots

[screenshot]

Open Interpreter version

0.2.0

Python version

3.11.5

Operating System name and version

Windows 11

Additional context

No response

yxl23 · Mar 07 '24

Have you enabled a VPN locally? Please disable the VPN first.
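
As a quick check, httpx (the client litellm uses here) honors the standard proxy environment variables, so printing them shows whether a VPN or proxy client has injected itself into HTTP traffic. A minimal diagnostic sketch:

    # Diagnostic sketch: httpx reads these standard proxy variables by default,
    # so a VPN/proxy client that sets them can redirect even localhost requests.
    import os

    for var in ("HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY"):
        print(var, "=", os.environ.get(var))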

Dragonchu · Mar 13 '24

No, I haven't started a VPN locally.

yxl23 · Mar 14 '24

Have you executed ollama run llama2?

You need to start the Ollama server before using Open Interpreter locally.
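
A minimal way to confirm the server is actually listening (a sketch assuming Ollama's default endpoint http://localhost:11434; httpx is already installed as an Open Interpreter dependency):

    # Probe the default Ollama endpoint; a WinError 10061 / ConnectError here
    # means nothing is listening on that port, i.e. the Ollama server isn't up.
    import httpx

    try:
        r = httpx.get("http://localhost:11434")
        print(r.status_code, r.text)  # a running server typically replies "Ollama is running"
    except httpx.ConnectError as exc:
        print("Ollama server not reachable:", exc)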

Dragonchu · Mar 14 '24

Yes, I have already started Ollama

yxl23 · Mar 15 '24

[screenshot]

yxl23 · Mar 15 '24

Add these flags:

interpreter --model ollama/model --api_base http://localhost:11434/v1 --api_key dummykey
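
For reference, the equivalent configuration when driving Open Interpreter from Python rather than the CLI looks roughly like this (a sketch against the 0.2.x library API; the llm attribute names are assumed to mirror the CLI flags above, and the key value is an arbitrary placeholder since Ollama ignores it):

    # Sketch of the library-side equivalent of the CLI flags suggested above.
    from interpreter import interpreter

    interpreter.llm.model = "ollama/llama2"
    interpreter.llm.api_base = "http://localhost:11434/v1"  # point litellm at the local Ollama server
    interpreter.llm.api_key = "dummykey"                    # placeholder; Ollama does not check it

    interpreter.chat("Write a bubble sort in Python and save it to a file")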

Notnaton · Mar 15 '24

What is Ollama's API key and how can I check it?

yxl23 · Mar 16 '24

You don't need one; just leave it as it is.

Notnaton · Mar 19 '24

@yxl23 Did this resolve your issue?

MikeBirdTech · Mar 27 '24

Yes, thank you.

yxl23 · Mar 28 '24