
New feature request: Top-k log probabilities with token IDs for each position of the input sequence.

Open · salaki opened this issue 1 year ago

System Info

H100

Who can help?

@ncomly-nvidia

Information

  • [X] The official example scripts
  • [ ] My own modified scripts

Tasks

  • [ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • [X] My own task or dataset (give details below)

Reproduction

Not a bug

Expected behavior

Add optional fields to the output dict containing the top-k log probabilities and token IDs for each position of the input sequence. For illustration, the requested fields might look like the sketch after this line.
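
A hypothetical sketch of the requested output layout; the key names here are illustrative only, not an existing API, and just show the intended shape (one top-k entry per input-sequence position):

# Hypothetical output layout; "prompt_topk_logprobs" and the other keys
# are illustrative names, not part of any current TensorRT-LLM API.
output = {
    "output_text": "generated text",
    "prompt_topk_logprobs": [
        # one entry per position of the input sequence: the top-k candidate
        # token IDs and their log probabilities at that position
        {"token_ids": [1012, 2054, 2003], "logprobs": [-0.12, -2.31, -3.05]},  # position 0
        {"token_ids": [2003, 2001, 2024], "logprobs": [-0.40, -1.87, -2.96]},  # position 1
    ],
}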

Actual behavior

None

Additional notes

This is a feature request.

salaki avatar Feb 23 '24 19:02 salaki

cc @AdamzNV @ncomly-nvidia @laikhtewari for vis.

hello-11 avatar Nov 15 '24 10:11 hello-11

@salaki, is this issue still relevant to you? By the way, there have been some updates since then, and the feature is now available with the TensorRT backend:

from tensorrt_llm import SamplingParams

sampling_params = SamplingParams(
    prompt_logprobs=5,  # now this will work: top-5 log probabilities per input-sequence position
    max_tokens=20,
)

However, this feature is not yet available in the PyTorch workflow 😄
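
A minimal end-to-end sketch with the LLM API, assuming a TensorRT engine backend; the model name is just an example, and the prompt_logprobs attribute on the returned output is an assumption mirroring the SamplingParams field, so check the output fields in your version:

from tensorrt_llm import LLM, SamplingParams

# Model name is illustrative; use any checkpoint supported by TensorRT-LLM.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")

sampling_params = SamplingParams(
    prompt_logprobs=5,  # top-5 log probabilities per input-sequence position
    max_tokens=20,
)

outputs = llm.generate(["The capital of France is"], sampling_params)
for output in outputs:
    # Assumed attribute name: per-position log probabilities for the prompt
    # tokens (exact layout may differ between versions).
    print(output.prompt_logprobs)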

karljang avatar Sep 17 '25 22:09 karljang

Issue has not received an update in over 14 days. Adding stale label.

github-actions[bot] avatar Oct 07 '25 03:10 github-actions[bot]

This issue was closed because it had been 14 days without activity since it was marked as stale.

github-actions[bot] avatar Oct 21 '25 03:10 github-actions[bot]