TensorRT-LLM/tensorrt_llm/_torch/attention_backend

Latest commit: [fix] Fix flashinfer + speculation issues (#3686)
Author: Mike Iovine (e534bf09cc)
Signed-off-by: Mike Iovine <6158008+mikeiovine@users.noreply.github.com>
Date: 2025-04-28 14:34:22 -04:00
File                 Last commit message                                 Date
__init__.py          Update (#2978)                                      2025-03-23 16:39:35 +08:00
flashinfer.py        [fix] Fix flashinfer + speculation issues (#3686)   2025-04-28 14:34:22 -04:00
interface.py         Fix create_weights in attention (#3692)             2025-04-24 07:30:00 +08:00
star_flashinfer.py   Remove dummy forward path (#3669)                   2025-04-18 16:17:50 +08:00
trtllm.py            Fix create_weights in attention (#3692)             2025-04-24 07:30:00 +08:00
utils.py             Fix create_weights in attention (#3692)             2025-04-24 07:30:00 +08:00
vanilla.py           Remove dummy forward path (#3669)                   2025-04-18 16:17:50 +08:00