| Name | Last commit | Last updated |
| --- | --- | --- |
| fla | [TRTLLM-9432][feat] Reduce synchronization and recompilation for qwen3-next (#9691) | 2025-12-23 10:14:29 +08:00 |
| fused_moe | [None][chore] Pass without_comm to cutlass and deepgemm (#11229) | 2026-02-05 02:07:59 -05:00 |
| mamba | [None][fix] Fix selective_state_update perf regression for T=1 decode path (#11194) | 2026-02-04 09:01:34 +02:00 |
| __init__.py | | |
| attention.py | [TRTLLM-9457][feat] Add cute dsl fp8 gemm for Blackwell (#10130) | 2026-02-06 09:49:30 +08:00 |
| decoder_layer.py | chore: Change the type annotations of input_ids and position_ids to int32. (#4632) | 2025-06-07 16:10:47 +08:00 |
| embedding.py | [TRTLLM-7073][feat] Support torch compile for PP for Llama and DeepSeekV3 (#7838) | 2025-12-04 13:32:11 +08:00 |
| gated_mlp.py | [None][feat] spark cublas LUT table for llama-8b-bf16 perf (#9811) | 2025-12-12 22:37:56 -05:00 |
| layer_norm.py | [TRTLLM-9259][perf] Use torch.compile to fuse copy + layernorm within the LayerNorm module (#9052) | 2025-11-11 18:11:00 -08:00 |
| linear.py | [https://nvbugs/5800646][fix] Fix hang issue by avoid exposing UB buf… (#10842) | 2026-02-09 23:53:40 +08:00 |
| logits_processor.py | feat: LogitsProcessor in PyTorch backend (#3145) | 2025-05-01 14:15:30 -07:00 |
| mlp.py | [None][fix] Enable AttentionDP on Qwen3-VL and fix test (#10435) | 2026-01-10 00:13:26 +09:00 |
| multi_stream_utils.py | [None][refactor] Refactor Torch Compile Backend, MoeLoadBalancer and warmup Logic (#6615) | 2025-08-19 09:58:44 +08:00 |
| qk_norm_attention.py | [TRTLLM-8310][feat] Add Qwen3-VL-MoE (#9689) | 2025-12-15 20:05:20 -08:00 |
| rms_norm.py | [None][feat] Integrate cuda.tile RMS norm kernels (#9725) | 2026-02-02 19:44:27 +08:00 |
| rotary_embedding.py | [TRTLLM-8310][feat] Add Qwen3-VL-MoE (#9689) | 2025-12-15 20:05:20 -08:00 |
| swiglu.py | [None][chore] Wrap the swiglu into custom op to avoid redundant device copy. (#7021) | 2025-08-27 13:02:10 +08:00 |
| triton_linear.py | [https://nvbugs/5761391][fix] Include triton-kernels as a packaged dependency (#10471) | 2026-01-28 19:56:32 -08:00 |