kanshan / TensorRT-LLMs
mirror of https://github.com/NVIDIA/TensorRT-LLM.git (synced 2026-01-14 06:27:45 +08:00)
Tree 47893cfd0c: TensorRT-LLMs / cpp / tensorrt_llm / kernels / decoderMaskedMultiheadAttention / instantiation
Latest commit: 385626572d — Update TensorRT-LLM (#2502), by Kaiyu Xie, 2024-11-26 16:51:34 +08:00
Co-authored-by: 岑灿 <yunyi.hyy@alibaba-inc.com>
decoderMaskedMultiheadAttention32_bf16_implicit_relative_attn.cu
decoderMaskedMultiheadAttention32_bf16.cu
decoderMaskedMultiheadAttention32_float_implicit_relative_attn.cu
decoderMaskedMultiheadAttention32_float.cu
decoderMaskedMultiheadAttention32_half_implicit_relative_attn.cu
decoderMaskedMultiheadAttention32_half.cu
decoderMaskedMultiheadAttention48_bf16.cu
decoderMaskedMultiheadAttention48_float.cu
decoderMaskedMultiheadAttention48_half.cu
decoderMaskedMultiheadAttention64_bf16_implicit_relative_attn.cu
decoderMaskedMultiheadAttention64_bf16.cu
decoderMaskedMultiheadAttention64_float_implicit_relative_attn.cu
decoderMaskedMultiheadAttention64_float.cu
decoderMaskedMultiheadAttention64_half_implicit_relative_attn.cu
decoderMaskedMultiheadAttention64_half.cu
decoderMaskedMultiheadAttention80_bf16.cu
decoderMaskedMultiheadAttention80_float.cu
decoderMaskedMultiheadAttention80_half.cu
decoderMaskedMultiheadAttention96_bf16.cu
decoderMaskedMultiheadAttention96_float.cu
decoderMaskedMultiheadAttention96_half.cu
decoderMaskedMultiheadAttention104_bf16.cu
decoderMaskedMultiheadAttention104_float.cu
decoderMaskedMultiheadAttention104_half.cu
decoderMaskedMultiheadAttention112_bf16.cu
decoderMaskedMultiheadAttention112_float.cu
decoderMaskedMultiheadAttention112_half.cu
decoderMaskedMultiheadAttention128_bf16_block_sparse_attn.cu
decoderMaskedMultiheadAttention128_bf16_implicit_relative_attn.cu
decoderMaskedMultiheadAttention128_bf16_qk_tanh_scale.cu
decoderMaskedMultiheadAttention128_bf16.cu
decoderMaskedMultiheadAttention128_float_block_sparse_attn.cu
decoderMaskedMultiheadAttention128_float_implicit_relative_attn.cu
decoderMaskedMultiheadAttention128_float_qk_tanh_scale.cu
decoderMaskedMultiheadAttention128_float.cu
decoderMaskedMultiheadAttention128_half_block_sparse_attn.cu
decoderMaskedMultiheadAttention128_half_implicit_relative_attn.cu
decoderMaskedMultiheadAttention128_half_qk_tanh_scale.cu
decoderMaskedMultiheadAttention128_half.cu
decoderMaskedMultiheadAttention144_bf16.cu
decoderMaskedMultiheadAttention144_float.cu
decoderMaskedMultiheadAttention144_half.cu
decoderMaskedMultiheadAttention160_bf16.cu
decoderMaskedMultiheadAttention160_float.cu
decoderMaskedMultiheadAttention160_half.cu
decoderMaskedMultiheadAttention192_bf16.cu
decoderMaskedMultiheadAttention192_float.cu
decoderMaskedMultiheadAttention192_half.cu
decoderMaskedMultiheadAttention224_bf16.cu
decoderMaskedMultiheadAttention224_float.cu
decoderMaskedMultiheadAttention224_half.cu
decoderMaskedMultiheadAttention256_bf16_qk_tanh_scale.cu
decoderMaskedMultiheadAttention256_bf16.cu
decoderMaskedMultiheadAttention256_float_qk_tanh_scale.cu
decoderMaskedMultiheadAttention256_float.cu
decoderMaskedMultiheadAttention256_half_qk_tanh_scale.cu
decoderMaskedMultiheadAttention256_half.cu