TensorRT-LLM/tensorrt_llm/_torch/auto_deploy
Latest commit: 9879400479 by Lucas Liebenwein
[#10642][feat] AutoDeploy: optimized canonicalize_graph utilities [1/2] (#10675)
Signed-off-by: Lucas Liebenwein <11156568+lucaslie@users.noreply.github.com>
2026-01-18 13:42:30 -05:00
Name        | Last commit message                                                                                                              | Date
compile     | [None][feat] AutoDeploy: prepare_metadata revisited (#9764)                                                                      | 2025-12-12 20:14:14 +08:00
config      | [None][fix] AutoDeploy: Fix the nvfp4 fused_moe (#10727)                                                                         | 2026-01-16 12:04:40 -08:00
custom_ops  | [#10696][fix] AutoDeploy prevent torch.export from specializing batch dimension when max_batch_size=1 (#10697)                   | 2026-01-18 10:42:49 +02:00
distributed | [None][refactor] Unify the usage of MPIDist and TorchDist. (#10380)                                                              | 2026-01-14 14:05:47 +08:00
export      | [https://nvbugs/5732942][fix] AutoDeploy: handle transformers 4.57.1 upgrade fixes (#10466)                                      | 2026-01-06 19:55:49 -05:00
models      | [None][fix] AutoDeploy: Fix the nvfp4 fused_moe (#10727)                                                                         | 2026-01-16 12:04:40 -08:00
shim        | [None][feat] Auto download speculative models from HF for pytorch backend, add speculative_model field alias (#10099)            | 2026-01-14 21:06:07 -08:00
transform   | [#10642][feat] AutoDeploy: optimized canonicalize_graph utilities [1/2] (#10675)                                                 | 2026-01-18 13:42:30 -05:00
utils       | [#10642][feat] AutoDeploy: optimized canonicalize_graph utilities [1/2] (#10675)                                                 | 2026-01-18 13:42:30 -05:00
__init__.py | [AutoDeploy] merge feat/ad-2025-07-07 (#6196)                                                                                    | 2025-07-23 05:11:04 +08:00
llm_args.py | [#10688][fix] AutoDeploy Fix CUDA graph batch sizes exceeding max_batch_size (#10687)                                            | 2026-01-18 13:31:01 -05:00
llm.py      | [TRTLLM-9065][chore] remove PyTorchConfig completely (#8856)                                                                     | 2025-11-06 22:37:03 -08:00