chat-ui
Request failed during generation: Server error: error trying to connect: Connection refused (os error 111)
Bug description
This error occurs when the LLM Server stops unexpectedly while chat-ui continues to send it queries; the unhandled error eventually crashes chat-ui as well. The specific error message is:
Error: Request failed during generation: Server error: error trying to connect: Connection refused (os error 111)
Steps to reproduce
- Ensure the LLM Server is running.
- Start the chat-ui and begin sending queries to the LLM Server.
- Stop the LLM Server abruptly (e.g. kill the process).
- Continue to send queries from the chat-ui to the LLM Server.
- Observe the error and subsequent crash of the chat-ui.
Context
Logs
file:///app/chat-ui/node_modules/@huggingface/inference/dist/index.js:371
throw new Error(data.error);
^
Error: Request failed during generation: Server error: error trying to connect: Connection refused (os error 111)
at streamingRequest (file:///app/chat-ui/node_modules/@huggingface/inference/dist/index.js:371:19)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async textGenerationStream (file:///app/chat-ui/node_modules/@huggingface/inference/dist/index.js:705:3)
at async generate (file:///app/chat-ui/build/server/chunks/_server.ts-82ba0bdc.js:558:20)
at async textGenerationWithoutTitle (file:///app/chat-ui/build/server/chunks/_server.ts-82ba0bdc.js:850:5)
Notes
The chat-ui should handle an LLM Server shutdown gracefully instead of crashing. Catching the connection error and surfacing it to the user, combined with a retry mechanism with backoff, would mitigate this issue.
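A minimal sketch of the suggested retry mechanism, in TypeScript. This is an illustration, not chat-ui's actual code: `withRetries` is a hypothetical helper that would wrap the call to `textGenerationStream`, retrying only connection-level failures with exponential backoff and rethrowing everything else.

```typescript
// Hypothetical retry wrapper: retries connection-level failures with
// exponential backoff so a transient "Connection refused" does not
// propagate as an unhandled rejection and crash the process.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Only retry connection failures; rethrow anything else immediately.
      if (!String(err).includes("Connection refused")) throw err;
      // Exponential backoff: 500ms, 1000ms, 2000ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
    }
  }
  throw lastError;
}
```

In chat-ui's case, the call site in `generate` would wrap the streaming request (e.g. `withRetries(() => textGenerationStream(...))`) and, once retries are exhausted, report the failure to the client rather than letting the rejection escape.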