TensorRT-LLM/tests/unittest/_torch/modules
Latest commit 6021a439ab by Li Min: Make moe permute and final as custom op (#5412), 2025-06-27 15:48:33 -07:00
Signed-off-by: Mindy Li <11663212+limin2021@users.noreply.github.com>
| Name | Last commit | Date |
|---|---|---|
| tests_lora_modules | added loraOp into lora layer + test for mlp and comparison to lora plugin (#3455) | 2025-04-17 12:48:27 +08:00 |
| test_fused_moe.py | Make moe permute and final as custom op (#5412) | 2025-06-27 15:48:33 -07:00 |
| test_moe_host_sharer.py | feat: large-scale EP (part 6: Online EP load balancer integration for GB200 nvfp4) (#4818) | 2025-06-08 10:25:18 +08:00 |
| test_moe_load_balancer.py | feat: large-scale EP (part 8: Online EP load balancer integration for PCIe fp8) (#5226) | 2025-06-25 22:25:13 -07:00 |
| test_moe_routing.py | [https://nvbugspro.nvidia.com/bug/5332927][fix] Fix the bug in the routing unit test (#5065) | 2025-06-11 09:44:35 +08:00 |