
"Error: Connection error." following Github readme.md

Open nickschurch opened this issue 1 year ago • 2 comments

I installed llm via conda (from conda-forge) and set my OpenAI key (which worked fine). llm models returns a list of available models, but attempting to ask any model a question fails with this error.

Any ideas?
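
For reference, this is roughly the sequence I ran (the conda-forge package name and the test prompt here are just illustrative):

conda install -c conda-forge llm
llm keys set openai        # paste the OpenAI API key when prompted
llm models                 # lists the available models as expected
llm "hello"                # fails with: Error: Connection error.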

nickschurch avatar Jun 11 '24 14:06 nickschurch

I get the same.

Error: Connection error

Steps:

brew install llm
...

llm keys set openai
Enter key: <snip>

llm "count to 5 in spanish"
Error: Connection error.

LLM_OPENAI_SHOW_RESPONSES=1 llm -m 4o 'three word slogan for an otter-run bakery'
Error: Connection error.

nicpaimnt avatar Aug 08 '24 04:08 nicpaimnt

I am suddenly getting this too from all the API models after working without issues for a while (OpenAI and Anthropic are the only ones I am using). Local models still work, and it's not a general network issue on my machine, nor an issue with keys, credit, etc.

Would be great if this error gave a little more context.

Edit: this appears to be a genuine network error. In my case it was a DNS resolution failure for the API endpoint, caused by a quirky WSL environment setup that was not visible to the system as a whole.
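
A quick way to check for that kind of DNS failure from the same environment (a sketch only; it assumes the OpenAI endpoint, swap in the Anthropic host as needed):

nslookup api.openai.com        # or: getent hosts api.openai.com
curl -sS https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY" | head -c 200

If the lookup fails here but works outside WSL, that points at the WSL DNS configuration rather than at llm itself.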

yrebrac avatar Sep 07 '24 01:09 yrebrac

I have the same issue as @yrebrac: running llm inside WSL. It happens only with llm; I don't have any issues using the OpenAI/Anthropic SDKs.
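
A couple of hedged sanity checks that might narrow down why the SDKs work while llm does not (proxy environment variables and a mismatched Python environment are common culprits; exact paths will differ per machine):

env | grep -iE 'https?_proxy|all_proxy|no_proxy'
which llm && head -n 1 "$(which llm)"   # shows which Python interpreter llm is installed into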

frixaco avatar Oct 21 '24 13:10 frixaco