[BUG] Connection error with LiteLLM gateway API + local http_proxy
## Environment

- Platform (select one):
  - [ ] Anthropic API
  - [ ] AWS Bedrock
  - [ ] Google Vertex AI
  - [x] Other: LiteLLM + AWS Bedrock
- Claude CLI version: 1.0.25
- Operating System: macOS 15.5
- Terminal: iTerm2
## Bug Description

After setting the local `http_proxy` and `https_proxy` environment variables, Claude Code can no longer connect. Unsetting them restores normal operation. The issue only occurs when going through the LiteLLM gateway; direct calls to the AWS Bedrock API work without problems.
## Steps to Reproduce

1. Set the `http_proxy` / `https_proxy` environment variables to a local proxy.
2. Override the API endpoint with the LiteLLM gateway URL.
3. Start Claude Code and send a prompt.
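The reproduction setup can be sketched as a shell snippet. The proxy address and gateway URL below are placeholders, not the reporter's real values:

```shell
# Hypothetical values -- substitute your real local proxy port and gateway URL.
export http_proxy="http://127.0.0.1:7890"
export https_proxy="http://127.0.0.1:7890"

# Route Claude Code's API traffic to the LiteLLM gateway.
export ANTHROPIC_BASE_URL="https://litellm.example.com"

# Launching Claude Code now fails with:
#   [ERROR] Error streaming, falling back to non-streaming mode: Connection error.
```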
## Expected Behavior

Claude Code connects through the local proxy to the LiteLLM gateway and responds normally.
## Actual Behavior

```
[ERROR] Error streaming, falling back to non-streaming mode: Connection error.
```
## Additional Context

**Bug Description**

[ERROR] Error streaming, falling back to non-streaming mode: Connection error.

**Environment Info**

- Platform: darwin
- Terminal: iTerm.app
- Version: 1.0.25
- Feedback ID:
**Errors**

```json
[{"error":"Error: Command failed: security find-generic-password -a $USER -w -s \"Claude Code\"\nsecurity: SecKeychainSearchCopyNext: The specified item could not be found in the keychain.\n\n at genericNodeError (node:internal/errors:983:15)\n at wrappedFn (node:internal/errors:537:14)\n at checkExecSyncError (node:child_process:892:11)\n at execSync (node:child_process:964:15)\n at wZ (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:659:3921)\n at file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:582:8863\n at Q (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:526:17199)\n at IX (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:582:8009)\n at NS (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:582:7090)\n at T6 (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:582:11202)","timestamp":"2025-06-17T17:02:12.739Z"},
{"error":"RangeError [ERR_CHILD_PROCESS_STDIO_MAXBUFFER]: stdout maxBuffer length exceeded\n at Socket.onChildStdout (node:child_process:482:14)\n at Socket.emit (node:events:507:28)\n at Socket.emit (node:domain:489:12)\n at addChunk (node:internal/streams/readable:559:12)\n at readableAddChunkPushByteMode (node:internal/streams/readable:510:3)\n at Readable.push (node:internal/streams/readable:390:5)\n at Pipe.onStreamRead (node:internal/stream_base_commons:189:23)","timestamp":"2025-06-17T17:02:14.639Z"},
{"error":"Error: Connection error.\n at Sw.makeRequest (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1315:4386)","timestamp":"2025-06-17T17:05:15.513Z"},
{"error":"Error: Connection error.\n at Sw.makeRequest (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1315:4386)\n at runNextTicks (node:internal/process/task_queues:65:5)\n at process.processTimers (node:internal/timers:540:9)\n at async GE2.E11.showErrors (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:22049)\n at async E11 (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:12215)\n at async GE2 (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:21895)\n at async file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:17115\n at async ft1 (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:5163)\n at async Xu (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1668:17085)\n at async dO (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1884:16403)","timestamp":"2025-06-17T17:05:17.483Z"},
{"error":"Error: Connection error.\n at Sw.makeRequest (file:///opt/homebrew/lib/node_modules/@anthropic-ai/claude-code/cli.js:1315:4386)","timestamp":"2025-06-17T17:10:16.498Z"}]
```
@deanbear can you provide more details on your configuration:
- How are you passing the LiteLLM URL (which env var)?
- Which LiteLLM endpoints are you using - passthrough or unified?
- Does this work with just the HTTP proxy without LiteLLM? Does this work without the HTTP proxy with LiteLLM?
@ant-kurt Hi, thank you for looking into this issue.

- **How are you passing the LiteLLM URL (which env var)?**
  `export ANTHROPIC_BASE_URL=<my LiteLLM gateway URL>`
- **Which LiteLLM endpoints are you using - passthrough or unified?**
  Unified (`/v1/messages`).
- **Does this work with just the HTTP proxy without LiteLLM?**
  Yes. Using the HTTP proxy with an AWS access key, secret key, and the `CLAUDE_CODE_USE_BEDROCK` environment variable works fine.
- **Does this work without the HTTP proxy with LiteLLM?**
  Yes. Using only `ANTHROPIC_BASE_URL` through the LiteLLM gateway works fine.
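The working and failing combinations described in this thread can be sketched as follows. All keys, ports, and URLs below are placeholders:

```shell
# Working combination 1: local proxy + direct Bedrock (no LiteLLM).
export http_proxy="http://127.0.0.1:7890"    # hypothetical proxy port
export https_proxy="http://127.0.0.1:7890"
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"       # placeholder credentials
export AWS_SECRET_ACCESS_KEY="EXAMPLEKEY"
export CLAUDE_CODE_USE_BEDROCK=1

# Working combination 2: LiteLLM gateway, no proxy.
unset http_proxy https_proxy
unset CLAUDE_CODE_USE_BEDROCK AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
export ANTHROPIC_BASE_URL="https://litellm.example.com"  # placeholder URL

# Failing combination: http_proxy/https_proxy AND ANTHROPIC_BASE_URL set together.
```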
Does your LiteLLM live on localhost? In working to repro this, I noticed some weirdness when using both a local proxy and a local LiteLLM.
Are you expecting HTTP CONNECT calls to your proxy?
In my scenario, the proxy is local, while LiteLLM runs on a remote server reached through a public domain name. In my setup, all internet traffic goes through the local system-wide proxy.

I suspect it's some kind of compatibility issue between Claude Code's `ANTHROPIC_BASE_URL` handling and the proxy. Monitoring network requests, I don't see any access attempts to my LiteLLM service domain at all.

I've also tried setting my LiteLLM service domain to DIRECT access in my local proxy (ClashX), but I still get errors. Additionally, Claude Code explicitly states that it currently doesn't support the `NO_PROXY` environment variable.
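One way to check whether the client ever issues an HTTP CONNECT to the local proxy is to run curl verbosely through the same proxy. The proxy port and gateway domain below are placeholders:

```shell
proxy="http://127.0.0.1:7890"        # hypothetical ClashX HTTP proxy port
gateway_host="litellm.example.com"   # placeholder gateway domain

# For HTTPS through an HTTP proxy, curl's verbose output should show a
# "CONNECT litellm.example.com:443" line before the TLS handshake. If the
# proxy never logs a CONNECT for the gateway domain, the client is not
# reaching the proxy at all.
cmd="curl -v --proxy $proxy https://$gateway_host/v1/models"
echo "$cmd"
```

Comparing that output against what Claude Code produces would show whether the failure is in the CONNECT tunnel or earlier, in proxy detection.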
Hello, I'm currently on version 1.0.44, and the issue appears to have been fixed. Thank you.
This issue has been automatically locked since it was closed and has not had any activity for 7 days. If you're experiencing a similar issue, please file a new issue and reference this one if it's relevant.