
# Recommended LLM API Configuration Settings

This directory contains recommended LLM API performance settings for popular models. These configurations can be used out of the box with `trtllm-serve` via the `--extra_llm_api_options` CLI flag, or adjusted to fit your specific use case.
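As a sketch of what such a configuration file looks like, here is a minimal example. The option names shown (`kv_cache_config`, `enable_chunked_prefill`) are common LLM API settings, but the specific values are illustrative assumptions, not taken from this repository; verify option names and defaults against your TensorRT-LLM version.

```yaml
# Hypothetical extra LLM API options file (values are illustrative).
kv_cache_config:
  # Fraction of free GPU memory to reserve for the KV cache.
  free_gpu_memory_fraction: 0.9
# Split long prompts into chunks during prefill to reduce latency spikes.
enable_chunked_prefill: true
```

You would then pass the file when launching the server, e.g. `trtllm-serve <model> --extra_llm_api_options config.yml`.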

For model-specific deployment guides, refer to the official TensorRT-LLM documentation.