# Integration for OpenAI Triton
The typical approach to integrating a custom kernel into TensorRT LLM is to create a TensorRT plugin.
Specifically, for integrating OpenAI Triton kernels, there are two methods:
1. Create the TensorRT plugin manually; see the [manual plugin example](./manual_plugin/) for details.
2. Generate the TensorRT plugin automatically; see the [automatic plugin example](./plugin_autogen/) for details.