
🐛 Bug Report: Malformed request body error on /stream endpoint

Open guralover36 opened this issue 5 months ago • 10 comments

📜 Description

After installing and starting the server via Docker, any request sent from the local UI to the /stream endpoint fails with a "Malformed request body" error on the client side. The backend logs inside the Docker container show that the request data is received but then immediately rejected with the same error. I am using setup method 4, via an OpenAI API key.

👟 Reproduction steps

I followed the Quickstart guide from the DocsGPT docs for Windows.

👍 Expected behavior

The /stream endpoint should accept the JSON payload, process the request, and return a streamed response to the client without error.

👎 Actual Behavior with Screenshots

Client: displays an error banner or alert with the message:

Malformed request body

Backend logs (inside Docker container):

[2025-06-18 10:24:06,808] INFO in routes: /stream - request_data: {
  'question': 'What is DocsGPT?',
  'history': '[{"prompt":"What is DocsGPT?"}]',
  'conversation_id': None,
  'prompt_id': 'default',
  'chunks': '2',
  'token_limit': 2000,
  'isNoneDoc': False,
  'save_conversation': True,
  'retriever': 'duckduck_search'
}, source: {}, attachments: 0
[2025-06-18 10:24:06,808] ERROR in routes: /stream - error: Malformed request body
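
For anyone triaging: the payload from the log above can be rebuilt locally to confirm the JSON itself is well-formed and to inspect the field types the backend receives (a sketch; the field names and values are copied verbatim from the log, not from the DocsGPT source):

```python
import json

# Payload reconstructed from the backend log above (values copied verbatim).
payload = {
    "question": "What is DocsGPT?",
    "history": '[{"prompt":"What is DocsGPT?"}]',
    "conversation_id": None,
    "prompt_id": "default",
    "chunks": "2",  # note: a string, not an int
    "token_limit": 2000,
    "isNoneDoc": False,
    "save_conversation": True,
    "retriever": "duckduck_search",
}

# The payload serializes and round-trips cleanly, so the JSON is syntactically
# valid; a "Malformed request body" rejection is therefore more likely a
# type or field mismatch than broken JSON.
body = json.dumps(payload)
decoded = json.loads(body)
print(type(decoded["chunks"]).__name__)  # chunks arrives as a string
```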

💻 Operating system

Windows

What browsers are you seeing the problem on?

Firefox, Chrome

🤖 What development environment are you experiencing this bug on?

Docker

🔒 Did you set the correct environment variables in the right path? List the environment variable names (not values please!)

API_KEY LLM_NAME MODEL_NAME VITE_API_STREAMING

📃 Provide any additional context for the Bug.

No response

📖 Relevant log output

Connect Cloud API Provider
Choose your Cloud API Provider:
1) OpenAI
2) Google (Vertex AI, Gemini)
3) Anthropic (Claude)
4) Groq
5) HuggingFace Inference API
6) Azure OpenAI
7) Novita
b) Back to Main Menu

Choose option (1-7, or b): 1
Your API key will be stored locally in the .env file and will not be sent anywhere else
Please enter your API key: sk-...

Configuring for Cloud API Provider: OpenAI...
.env file configured for OpenAI.
Docker is not running. Attempting to start Docker Desktop...
Waiting for Docker to start   ...
Docker has started successfully!

Starting Docker Compose...
time="2025-06-18T13:14:44+03:00" level=warning msg="The \"LLM_PROVIDER\" variable is not set. Defaulting to a blank string."
time="2025-06-18T13:14:44+03:00" level=warning msg="The \"OPENAI_BASE_URL\" variable is not set. Defaulting to a blank string."
time="2025-06-18T13:14:44+03:00" level=warning msg="The \"LLM_PROVIDER\" variable is not set. Defaulting to a blank string."
Compose can now delegate builds to bake for better performance.
 To do so, set COMPOSE_BAKE=true.
[+] Building 3.8s (41/49)                                                                          docker:desktop-linux
 => [backend internal] load build definition from Dockerfile                                                       0.0s
 => => transferring dockerfile: 2.67kB                                                                             0.0s
 => WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 2)                                     0.0s
 => WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 46)                                    0.0s
 => [worker internal] load build definition from Dockerfile                                                        0.0s
 => => transferring dockerfile: 2.67kB                                                                             0.0s
 => WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 2)                                     0.0s
 => WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 46)                                    0.0s
 => [backend internal] load metadata for docker.io/library/ubuntu:24.04                                            1.3s
 => [worker auth] library/ubuntu:pull token for registry-1.docker.io                                               0.0s
 => [worker internal] load .dockerignore                                                                           0.0s
 => => transferring context: 2B                                                                                    0.0s
 => [backend internal] load .dockerignore                                                                          0.0s
 => => transferring context: 2B                                                                                    0.0s
 => [worker internal] load build context                                                                           0.1s
 => => transferring context: 515.32kB                                                                              0.1s
 => [backend builder 1/9] FROM docker.io/library/ubuntu:24.04@sha256:b59d21599a2b151e23eea5f6602f4af4d7d31c4e236d  0.1s
 => => resolve docker.io/library/ubuntu:24.04@sha256:b59d21599a2b151e23eea5f6602f4af4d7d31c4e236d22bf0b62b86d2e38  0.0s
 => [backend internal] load build context                                                                          0.1s
 => => transferring context: 515.32kB                                                                              0.1s
 => CACHED [worker final 2/9] RUN apt-get update &&     apt-get install -y software-properties-common &&     add-  0.0s
 => CACHED [worker final 3/9] WORKDIR /app                                                                         0.0s
 => CACHED [worker final 4/9] RUN groupadd -r appuser &&     useradd -r -g appuser -d /app -s /sbin/nologin -c "D  0.0s
 => CACHED [worker builder 2/9] RUN apt-get update &&     apt-get install -y software-properties-common &&     ad  0.0s
 => CACHED [worker builder 3/9] RUN if [ -f /usr/bin/python3.12 ]; then         ln -s /usr/bin/python3.12 /usr/bi  0.0s
 => CACHED [worker builder 4/9] RUN wget https://d3dg1063dc54p9.cloudfront.net/models/embeddings/mpnet-base-v2.zi  0.0s
 => CACHED [worker builder 5/9] RUN wget -q -O - https://sh.rustup.rs | sh -s -- -y                                0.0s
 => CACHED [worker builder 6/9] RUN apt-get remove --purge -y wget unzip && apt-get autoremove -y && rm -rf /var/  0.0s
 => CACHED [worker builder 7/9] COPY requirements.txt .                                                            0.0s
 => CACHED [worker builder 8/9] RUN python3.12 -m venv /venv                                                       0.0s
 => CACHED [worker builder 9/9] RUN pip install --no-cache-dir --upgrade pip &&     pip install --no-cache-dir ti  0.0s
 => CACHED [worker final 5/9] COPY --from=builder /venv /venv                                                      0.0s
 => CACHED [worker final 6/9] COPY --from=builder /models /app/models                                              0.0s
 => CACHED [worker final 7/9] COPY . /app/application                                                              0.0s
 => CACHED [worker final 8/9] RUN mkdir -p /app/application/inputs/local                                           0.0s
 => CACHED [backend final 9/9] RUN chown -R appuser:appuser /app                                                   0.0s
 => [worker] exporting to image                                                                                    0.3s
 => => exporting layers                                                                                            0.0s
 => => exporting manifest sha256:41b7093d7c6495fa315a26bd3dd38899e206f2c598537a8b330315314bc032b2                  0.0s
 => => exporting config sha256:30dc4530fd3304e02b240334171db83926c4dbc7a61eef3cd81040470863a63f                    0.0s
 => => exporting attestation manifest sha256:8642fd8c36272cbe5e35273e3ec200818982cd8aa99d6f6ef29dfbca359c12c4      0.1s
 => => exporting manifest list sha256:bd741fd1b3ed6ffb0be074b266f5a025c30e52181893f5538f44c270bbd39556             0.0s
 => => naming to docker.io/library/docsgpt-oss-worker:latest                                                       0.0s
 => => unpacking to docker.io/library/docsgpt-oss-worker:latest                                                    0.0s
 => [backend] exporting to image                                                                                   0.3s
 => => exporting layers                                                                                            0.0s
 => => exporting manifest sha256:bccc8b517ec8e4ba23efc3e1b71a16a109a80bdac942f014ee5f8ed439042d27                  0.0s
 => => exporting config sha256:0169b06a2dfe81ce52971e7c5c2a7282d9ba2969482801ac344fdce75d7e7f60                    0.0s
 => => exporting attestation manifest sha256:31ede6053a59ccc31f855dc1243bfacac709a536815cd98ac98ae64546d31f74      0.1s
 => => exporting manifest list sha256:f2b33ac9a260b5186f864e11159f942d6139b1c992ecffde6d42262a7e1f2e56             0.0s
 => => naming to docker.io/library/docsgpt-oss-backend:latest                                                      0.0s
 => => unpacking to docker.io/library/docsgpt-oss-backend:latest                                                   0.0s
 => [worker] resolving provenance for metadata file                                                                0.0s
 => [backend] resolving provenance for metadata file                                                               0.0s
 => [frontend internal] load build definition from Dockerfile                                                      0.0s
 => => transferring dockerfile: 201B                                                                               0.0s
 => [frontend internal] load metadata for docker.io/library/node:20.6.1-bullseye-slim                              0.9s
 => [frontend auth] library/node:pull token for registry-1.docker.io                                               0.0s
 => [frontend internal] load .dockerignore                                                                         0.0s
 => => transferring context: 2B                                                                                    0.0s
 => [frontend 1/5] FROM docker.io/library/node:20.6.1-bullseye-slim@sha256:ee905d8492c443aebe41f4cc525ebabefef757  0.1s
 => => resolve docker.io/library/node:20.6.1-bullseye-slim@sha256:ee905d8492c443aebe41f4cc525ebabefef757df43556c4  0.1s
 => [frontend internal] load build context                                                                         0.2s
 => => transferring context: 2.46MB                                                                                0.2s
 => CACHED [frontend 2/5] WORKDIR /app                                                                             0.0s
 => CACHED [frontend 3/5] COPY package*.json ./                                                                    0.0s
 => CACHED [frontend 4/5] RUN npm install                                                                          0.0s
 => CACHED [frontend 5/5] COPY . .                                                                                 0.0s
 => [frontend] exporting to image                                                                                  0.1s
 => => exporting layers                                                                                            0.0s
 => => exporting manifest sha256:aa77fd68e595b54e232fe831640972c18c311e4763d4738c164cdcf82959ac9b                  0.0s
 => => exporting config sha256:6ea5e684bb76995f8340c9a0fecc7bc8a26cb60d9faf95f1572fb351c75d8412                    0.0s
 => => exporting attestation manifest sha256:66a043a6ab89c95dea956e195fe064d6bddffd6ffe1c417fe3bc522976f4de96      0.0s
 => => exporting manifest list sha256:765ed4b848381114866573838f0c96b91a52f9ccf69639c6a149ae73e30252b5             0.0s
 => => naming to docker.io/library/docsgpt-oss-frontend:latest                                                     0.0s
 => => unpacking to docker.io/library/docsgpt-oss-frontend:latest                                                  0.0s
 => [frontend] resolving provenance for metadata file                                                              0.0s
[+] Running 8/8
 ✔ backend                           Built                                                                         0.0s
 ✔ frontend                          Built                                                                         0.0s
 ✔ worker                            Built                                                                         0.0s
 ✔ Container docsgpt-oss-redis-1     Started                                                                       1.3s
 ✔ Container docsgpt-oss-mongo-1     Started                                                                       1.4s
 ✔ Container docsgpt-oss-worker-1    Started                                                                       1.8s
 ✔ Container docsgpt-oss-backend-1   Started                                                                       1.8s
 ✔ Container docsgpt-oss-frontend-1  Started                                                                       1.7s

DocsGPT is now configured to use OpenAI on http://localhost:5173
You can stop the application by running: docker compose -f "K:\DocsGPT\deployment\docker-compose.yaml" down

DocsGPT Setup Complete.

👀 Have you spent some time to check if this bug has been raised before?

  • [x] I checked and didn't find a similar issue

🔗 Are you willing to submit PR?

No

🧑‍⚖️ Code of Conduct

  • [x] I agree to follow this project's Code of Conduct

guralover36 avatar Jun 18 '25 10:06 guralover36

Same here, on Ubuntu

deltacodepl avatar Jul 14 '25 07:07 deltacodepl

I met the same error too

XiYuXu avatar Jul 17 '25 07:07 XiYuXu

I think I'll try to add more details to this error to make it easier to debug.

dartpain avatar Jul 20 '25 11:07 dartpain

I ran into the same error on Windows 11 after a fresh install @dartpain.

  • Serve Local (with Ollama)
  • serve Ollama with CPU

Another issue I had to solve: setup.ps1 creates a .env file that is UTF-8-BOM encoded instead of plain UTF-8, which leads to the error: unexpected character "\ufeff" in variable name "\ufeffAPI_KEY=xxxx\r".
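
A quick way to fix that encoding (a sketch; the path is hypothetical, and it relies on Python's utf-8-sig codec, which decodes like UTF-8 but silently drops a leading BOM):

```python
from pathlib import Path

def strip_bom(path: str) -> None:
    """Re-encode a file as plain UTF-8, dropping a leading UTF-8 BOM if present."""
    p = Path(path)
    # utf-8-sig decodes like utf-8 but removes a leading BOM if one exists.
    text = p.read_text(encoding="utf-8-sig")
    p.write_text(text, encoding="utf-8")

# Example (hypothetical path): strip_bom("K:/DocsGPT/.env")
```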

Anzge avatar Jul 23 '25 09:07 Anzge

I encountered the same problem, on Ubuntu.

Simon0401 avatar Jul 27 '25 09:07 Simon0401

Same problem. I asked an AI and it said the 'chunks' parameter expected by the backend should be a number rather than a string; maybe that could fix the problem?

Cook1ez avatar Aug 04 '25 02:08 Cook1ez

If anyone still has a similar issue, please check your backend logs, as these errors may happen for different reasons. Check the line in the logs that starts with /stream - error. Also, please make sure you are using the latest version by running git pull.

dartpain avatar Aug 04 '25 10:08 dartpain

I had this error too when choosing Ollama as my inference engine.

I started with this fix from https://docs.docsgpt.cloud/Models/local-inference, which says: "The OPENAI_BASE_URL examples above use http://localhost. If you are running DocsGPT within Docker and your local inference engine is running on your host machine (outside of Docker), you will likely need to replace localhost with http://host.docker.internal to ensure Docker can correctly access your host's services. For example, http://host.docker.internal:11434/v1 for Ollama."
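
For reference, the relevant .env line would then look something like this (a sketch based on the linked docs; 11434 is Ollama's default port):

```
OPENAI_BASE_URL=http://host.docker.internal:11434/v1
```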

In the deployment/docker-compose.yaml file, MODEL_NAME is missing from the backend environment. I also changed LLM_PROVIDER during troubleshooting, since it did not exist in the .env file:

  backend:
    user: root
    build: ../application
    environment:
      - API_KEY=$API_KEY
      - EMBEDDINGS_KEY=$API_KEY
      - LLM_PROVIDER=$LLM_NAME
      - LLM_NAME=$LLM_NAME
      - MODEL_NAME=$MODEL_NAME

Added MODEL_NAME to the Settings class in application/core/settings.py, at line 18:

  MODEL_NAME: Optional[str] = None

Updated the get_gpt_model() function in application/utils.py, at line 31:

  return settings.MODEL_NAME or settings.LLM_NAME or model_map.get(settings.LLM_PROVIDER, "")
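
The fallback order described above can be sketched in isolation like this (a minimal stand-in for the Settings object and model_map; the names mirror the quoted lines, but this is not the actual DocsGPT code):

```python
from typing import Optional

# Hypothetical stand-in for application/core/settings.py.
class Settings:
    MODEL_NAME: Optional[str] = None   # the field added by the fix above
    LLM_NAME: Optional[str] = "openai"
    LLM_PROVIDER: str = "openai"

settings = Settings()
model_map = {"openai": "gpt-4o-mini"}  # illustrative mapping, not the real one

def get_gpt_model() -> str:
    # Prefer an explicit MODEL_NAME, then LLM_NAME, then the provider default.
    return settings.MODEL_NAME or settings.LLM_NAME or model_map.get(settings.LLM_PROVIDER, "")

print(get_gpt_model())  # MODEL_NAME is unset here, so LLM_NAME wins
```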

toddp0 avatar Aug 07 '25 22:08 toddp0

Hey @dartpain, if this issue remains unfixed, can you please assign it to me? I would be happy to work on it.

shivansh-bhatnagar18 avatar Oct 07 '25 18:10 shivansh-bhatnagar18

I have been using the /stream endpoint for quite a while and it seems to work fine. I am attaching some screenshots.

Image

The API also works fine. I think there might be a mismatch in the JSON payload that is causing your error. Can anyone help me reproduce it?

shivansh-bhatnagar18 avatar Oct 08 '25 19:10 shivansh-bhatnagar18