TensorRT-LLM/tensorrt_llm/serve
Latest commit: 6eef19297f by Kaiyu Xie, 2025-09-16 16:07:13 +08:00
[None] [chore] cherry pick changes on slurm scripts from release/1.1.0rc2 (#7750)
Signed-off-by: Kaiyu Xie <26294424+kaiyux@users.noreply.github.com>
Name                     Last commit message                                                                             Last commit date
scripts/                 [None] [chore] cherry pick changes on slurm scripts from release/1.1.0rc2 (#7750)               2025-09-16 16:07:13 +08:00
__init__.py              Update TensorRT-LLM (#2820)                                                                     2025-02-25 21:21:49 +08:00
chat_utils.py            [TRTLLM-6771][feat] Support MMMU for multimodal models (#6828)                                  2025-08-21 08:54:12 +08:00
harmony_adapter.py       [TRTLLM-7779][feat] Support multiple postprocess workers for chat completions API (#7508)       2025-09-08 11:11:35 +08:00
metadata_server.py       feat: Add integration of etcd (#3738)                                                           2025-06-03 20:01:44 +08:00
openai_disagg_server.py  [None][chore] Mass integration of release/1.0 - 3rd (#7519)                                     2025-09-08 14:03:04 +08:00
openai_protocol.py       [TRTLLM-1302][feat] Topk logprobs for TRT backend and top1 logprob for PyT backend (#6097)      2025-09-12 15:32:34 +08:00
openai_server.py         [None][fix] using arrival time in llmapi when creating LlmRequest in pytorch workflow (#7553)   2025-09-15 07:26:01 -04:00
postprocess_handlers.py  [TRTLLM-1302][feat] Topk logprobs for TRT backend and top1 logprob for PyT backend (#6097)      2025-09-12 15:32:34 +08:00
responses_utils.py       [TRTLLM-7208][feat] Implement basic functionalities for Responses API (#7341)                   2025-09-02 07:08:22 -04:00
router.py                feat: Dynamically remove servers in PD (#5270)                                                  2025-06-25 09:50:04 +08:00