TensorRT-LLM/tests/unittest/_torch/attention

Latest commit: 0d20a8fd61 by Fanrong Li (2025-10-14 08:23:16 -07:00)
[TRTLLM-8536][feat] Add the sparse attention framework and one use case--RocketKV support (#8086)
Signed-off-by: Fanrong Li <23290157+lfr-0531@users.noreply.github.com>
Signed-off-by: yuhangh <58161490+heyuhhh@users.noreply.github.com>
Co-authored-by: yuhangh <58161490+heyuhhh@users.noreply.github.com>
Name                           Last commit                                                                                          Date
sparse                         [TRTLLM-8536][feat] Add the sparse attention framework and one use case--RocketKV support (#8086)    2025-10-14 08:23:16 -07:00
test_attention_mla.py          [https://nvbugs/5453806][unwaive] Unwaive fp8 kvcache attention test (#7243)                          2025-09-05 12:13:57 -04:00
test_attention_no_cache.py     [None][ci] move unittests to sub-directories (#6635)                                                  2025-08-20 05:42:22 -04:00
test_attention.py              [None][ci] move unittests to sub-directories (#6635)                                                  2025-08-20 05:42:22 -04:00
test_flashinfer_attention.py   [None][ci] move unittests to sub-directories (#6635)                                                  2025-08-20 05:42:22 -04:00
test_flashinfer_star_attn.py   [None][ci] move unittests to sub-directories (#6635)                                                  2025-08-20 05:42:22 -04:00
test_vanilla_attention.py      [None][ci] move unittests to sub-directories (#6635)                                                  2025-08-20 05:42:22 -04:00