
[Bug]: Question about logprobs output being 0.0 when using `vllm` sampling params

Open · Violettttee opened this issue 8 months ago · 2 comments

Your current environment

The output of `python collect_env.py`
[pip3] numpy==1.26.4
[pip3] nvidia-cublas-cu12==12.4.5.8
[pip3] nvidia-cuda-cupti-cu12==12.4.127
[pip3] nvidia-cuda-nvrtc-cu12==12.4.127
[pip3] nvidia-cuda-runtime-cu12==12.4.127
[pip3] nvidia-cudnn-cu12==9.1.0.70
[pip3] nvidia-cufft-cu12==11.2.1.3
[pip3] nvidia-curand-cu12==10.3.5.147
[pip3] nvidia-cusolver-cu12==11.6.1.9
[pip3] nvidia-cusparse-cu12==12.3.1.170
[pip3] nvidia-dali-cuda120==1.35.0
[pip3] nvidia-ml-py==12.560.30
[pip3] nvidia-nccl-cu12==2.21.5
[pip3] nvidia-nvjitlink-cu12==12.4.127
[pip3] nvidia-nvtx-cu12==12.4.127
[pip3] nvidia-pyindex==1.0.9
[pip3] onnx==1.15.0rc2
[pip3] optree==0.10.0
[pip3] pynvml==11.4.1
[pip3] pytorch-quantization==2.1.2
[pip3] pytorch-triton==2.2.0+e28a256d7
[pip3] pyzmq==25.1.2
[pip3] torch==2.5.1
[pip3] torch-tensorrt==2.3.0a0
[pip3] torchdata==0.7.1a0
[pip3] torchtext==0.17.0a0
[pip3] torchvision==0.20.1
[pip3] transformers==4.48.0
[pip3] triton==3.1.0
[conda] Could not collect
ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: 0.6.6

🐛 Describe the bug


Hi,

Thanks for the amazing project!

I have a question about the logprobs output when using vLLM's sampling parameters.
Specifically, when I pass the `logprobs` parameter in the request, many tokens in the returned response have a logprob of exactly 0.0.
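
For reference, here is a minimal sketch of the kind of request that produces this, using the offline API. The model name and prompt are placeholders, not the exact setup from my run:

```python
from vllm import LLM, SamplingParams

# Any small model reproduces the question; this one is a placeholder.
llm = LLM(model="facebook/opt-125m")

# Request the logprob of each sampled token (top-1).
sampling_params = SamplingParams(temperature=0.0, max_tokens=16, logprobs=1)

outputs = llm.generate(["The capital of France is"], sampling_params)

# CompletionOutput.logprobs is a list (one entry per generated token)
# of dicts mapping token id -> Logprob; many .logprob values come back 0.0.
for token_logprobs in outputs[0].outputs[0].logprobs:
    for token_id, info in token_logprobs.items():
        print(token_id, info.logprob)
```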

Is this expected behavior?
It seems unusual because the majority of generated tokens have a logprob of 0.0; I would have expected negative log probabilities instead.
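
One scenario where 0.0 would be mathematically expected (my own guess, not something the docs confirm): if the reported logprob is computed from the distribution after temperature scaling, then greedy decoding (temperature close to 0) pushes the sampled token's probability to 1.0, and log(1.0) = 0.0. A standalone illustration, independent of vLLM:

```python
import numpy as np

logits = np.array([4.0, 2.0, 1.0])

# As temperature shrinks, the softmax collapses onto the argmax token,
# so its log-probability approaches 0.0.
for temperature in (1.0, 0.1, 0.01):
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    print(f"T={temperature}: logprob of top token = {np.log(probs[0]):.6f}")
```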

[Image attachment]

Would you mind helping clarify under what conditions this might happen?
Thanks a lot in advance!

Before submitting a new issue...

  • [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

Violettttee · Apr 28 '25 06:04

I have this issue too. My vLLM version is 0.6.6.post1.

huangjj67 · May 18 '25 01:05

> I have this issue too. My vLLM version is 0.6.6.post1.

It seems to be fixed in v0.8.

Violettttee · May 27 '25 14:05

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

github-actions[bot] · Aug 26 '25 02:08