TensorRT-LLM/tests/unittest/api_stability

Latest commit 0cfd08745c by Stefan Niebler, 2026-01-16 10:52:41 -08:00
[TRTLLM-9735][feat] Add processed logprobs functionality to TorchSampler (#9675)
Signed-off-by: Stefan Niebler <82932102+stnie@users.noreply.github.com>
Signed-off-by: Yuan Tong <13075180+tongyuantongyu@users.noreply.github.com>
Signed-off-by: Erin Ho <14718778+hchings@users.noreply.github.com>
Co-authored-by: Yuan Tong <13075180+tongyuantongyu@users.noreply.github.com>
Co-authored-by: Erin Ho <14718778+hchings@users.noreply.github.com>
references             [TRTLLM-9735][feat] Add processed logprobs functionality to TorchSampler (#9675)                              2026-01-16 10:52:41 -08:00
references_committed   [None][chore] set the default value of max_num_tokens explicitly (#8208)                                       2025-10-14 23:03:02 -07:00
api_stability_core.py  [TRTLLM-9735][feat] Add processed logprobs functionality to TorchSampler (#9675)                              2026-01-16 10:52:41 -08:00
test_llm_api.py        [None][feat] Support ignored prompt length for penalties via new sampling config parameter (#8127)             2025-10-27 13:12:31 -04:00