# Recommended LLM API Configuration Settings

This directory contains recommended LLM API performance settings for popular models. They can be used out of the box with `trtllm-serve` via the `--config` CLI flag, or adjusted to your specific use case.
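For example, a server can be launched with one of these configs as follows (the model name and config file name below are illustrative, not actual files in this directory):

```shell
# Launch an OpenAI-compatible server with a recommended config.
# Replace the model and config path with the ones for your deployment.
trtllm-serve meta-llama/Llama-3.1-8B-Instruct \
  --config examples/configs/llama-3.1-8b.yaml
```

Settings in the config file can also be overridden for experimentation by editing a local copy of the YAML before passing it to `--config`.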

For model-specific deployment guides, please refer to the official documentation.