TensorRT-LLM/tensorrt_llm/executor
Latest commit: chore: Partition LlmArgs into TorchLlmArgs and TrtLlmArgs (#3823)
Author: Yan Chunwei, commit 4798d088d9, 2025-05-22 09:40:56 +08:00

* partition LlmArgs
* update backend

Signed-off-by: Superjomn <328693+Superjomn@users.noreply.github.com>
File                Last commit                                                                 Date
__init__.py         Update TensorRT-LLM (#2873)                                                 2025-03-11 21:13:42 +08:00
executor.py         feat: Support Top-K logprobs and prompt_logprobs in LLMAPI (#3388)          2025-05-01 12:47:14 -04:00
ipc.py              fix: llmapi-launch add add trtllm-bench test with engine building (#4091)   2025-05-21 10:18:01 +08:00
postproc_worker.py  feat: return logits in PyTorch flow (#3221)                                 2025-04-24 16:56:03 -07:00
proxy.py            feat: Support Top-K logprobs and prompt_logprobs in LLMAPI (#3388)          2025-05-01 12:47:14 -04:00
request.py          feat: Add multimodal embedding field in LlmRequest (#3855)                  2025-05-01 12:23:30 +08:00
result.py           feat: Support Top-K logprobs and prompt_logprobs in LLMAPI (#3388)          2025-05-01 12:47:14 -04:00
utils.py            fix: llmapi-launch add add trtllm-bench test with engine building (#4091)   2025-05-21 10:18:01 +08:00
worker.py           chore: Partition LlmArgs into TorchLlmArgs and TrtLlmArgs (#3823)           2025-05-22 09:40:56 +08:00
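Several entries above (executor.py, proxy.py, result.py) reference #3388, which added Top-K logprobs and prompt_logprobs to the LLM API that this executor module serves. The sketch below shows roughly how those results surface through the high-level API; it is a minimal illustration, assuming SamplingParams exposes integer logprobs/prompt_logprobs fields as the PR title suggests, and the model name plus the per-completion .logprobs attribute are illustrative assumptions rather than details taken from this revision.

# Minimal sketch of driving the executor through the high-level LLM API.
# Assumes the logprobs fields from #3388; model name and exact attribute
# names are illustrative assumptions, not verified against this revision.
from tensorrt_llm import LLM, SamplingParams

llm = LLM(model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")  # hypothetical model choice

# Request top-3 logprobs per generated token and top-1 prompt logprobs
# (assumed integer semantics, mirroring the #3388 PR title).
params = SamplingParams(max_tokens=32, logprobs=3, prompt_logprobs=1)

for output in llm.generate(["The executor module dispatches requests to"], params):
    completion = output.outputs[0]
    print(completion.text)
    print(completion.logprobs)  # assumed per-token logprob records assembled in result.py

Under the hood, requests like these are serialized by request.py, routed through proxy.py/ipc.py to worker.py, and the streamed results are assembled for the caller by result.py and postproc_worker.py.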