olmocr
[vLLM] Response did not finish with reason code 'stop', cannot use this response
🐛 Describe the bug
olmocr version: 0.4.4
vLLM version: v0.10.1.1
installation method: uv add olmocr
command: python -m olmocr.pipeline ./olm --server http://localhost:8114/v1 --markdown --pdfs ./test/Extract[955-968].pdf --model vllm
other info: No errors in the vLLM endpoint logs; the model appears to keep generating without ever stopping.
Versions
0.4.4
logs
2025-11-13 09:00:47,969 - __main__ - WARNING - ValueError on attempt 3 for ./test/Extract[6].pdf-1: <class 'ValueError'> - Response did not finish with reason code 'stop', cannot use this response
2025-11-13 09:00:54,982 - __main__ - INFO - Queue remaining: 0
2025-11-13 09:00:54,982 - __main__ - INFO -
Metric Name Lifetime (tokens/sec) Recently (tokens/sec)
----------------------------------------------------------------------------------
2025-11-13 09:00:54,982 - __main__ - INFO -
Worker ID | started
----------+--------
0 | 1
2025-11-13 09:01:04,990 - __main__ - INFO - Queue remaining: 0
2025-11-13 09:01:04,990 - __main__ - INFO -
Metric Name Lifetime (tokens/sec) Recently (tokens/sec)
----------------------------------------------------------------------------------
Is this a ValueError problem?
It's probably a PDF whose text is too long or otherwise not parseable by the model. Can you send the PDF you are using?
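For context on what triggers this error: OpenAI-compatible servers such as vLLM attach a finish_reason to each completion, and it is "length" (rather than "stop") when generation is cut off at the token limit, which is what typically happens on overly dense pages. A minimal sketch of this check, not olmocr's actual code and with an illustrative function name, looks like this:

```python
# Sketch of the finish-reason check that produces the error in this issue.
# An OpenAI-compatible server (e.g. vLLM) reports finish_reason == "length"
# when generation hits the max-token limit instead of stopping naturally.

def check_finish_reason(choice: dict) -> str:
    """Return the completion text if the model stopped cleanly, else raise."""
    reason = choice.get("finish_reason")
    if reason != "stop":
        # Mirrors the ValueError seen in the pipeline logs above.
        raise ValueError(
            f"Response did not finish with reason code 'stop', got {reason!r}"
        )
    return choice["message"]["content"]
```

If finish_reason comes back as "length", raising the server's max-token limit or splitting unusually dense pages may avoid the retries seen in the logs.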