DocsGPT
🐛 Bug Report: Empty response in the conversation after finishing streaming
📜 Description
When asking a question, the LLM's response streams in as it should, then disappears from the conversation once it is finished. It reappears if the page is refreshed.
👟 Reproduction steps
I am running DocsGPT 0.9.0 locally (./setup.sh with option 2). The environment is configured to use llama-cpp with huggingface_sentence-transformers/all-mpnet-base-v2 embeddings. To reproduce the bug, simply ask a question with this environment.
👍 Expected behavior
The response should remain on screen.
👎 Actual Behavior with Screenshots
Here is a recording of the issue: https://i.imgur.com/MBo3xtV.gif
💻 Operating system
Windows
What browsers are you seeing the problem on?
Firefox, Chrome
🤖 What development environment are you experiencing this bug on?
Docker
🔒 Did you set the correct environment variables in the right path? List the environment variable names (not values please!)
CELERY_BROKER_URL CELERY_RESULT_BACKEND EMBEDDINGS_NAME FLASK_APP FLASK_DEBUG LLM_NAME VITE_API_STREAMING
📃 Provide any additional context for the Bug.
The bug seems to occur when the token sent by the LLM is an empty string. With llama-cpp, the last token appears to always be an empty string. The truthiness check in https://github.com/arc53/DocsGPT/blob/8873428b4bbd44e4f0c1978cbd9f081030063dfe/frontend/src/conversation/conversationSlice.ts#L151 treats the empty string as a missing token, so execution falls into the else branch, which replaces the response with an empty one.
The fix is simple: check specifically for `undefined` instead of relying on truthiness.
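A minimal sketch of the change, with illustrative names rather than the exact DocsGPT source (the real check lives in the reducer in `conversationSlice.ts`):

```typescript
// Hypothetical helper illustrating the fix; names are assumptions,
// not the actual DocsGPT identifiers.
function appendToken(current: string, token: string | undefined): string {
  // Before: `if (token)` treated llama-cpp's final empty-string token
  // as "no token", fell into the else branch, and wiped the response.
  // After: only an undefined token is treated as missing, so an
  // empty-string token leaves the accumulated response intact.
  if (token !== undefined) {
    return current + token;
  }
  return current;
}
```

With this check, `appendToken("Hello", "")` keeps `"Hello"` instead of discarding it.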
📖 Relevant log output
No response
👀 Have you spent some time to check if this bug has been raised before?
- [X] I checked and didn't find a similar issue
🔗 Are you willing to submit PR?
Yes, I am willing to submit a PR!
🧑‍⚖️ Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Nice catch, thank you!