TensorRT-LLM/examples/serve
Latest commit: e84dc6b3c7 by pansicheng (2025-05-06 08:13:04 +08:00)

feat: add deepseek-r1 reasoning parser to trtllm-serve (#3354)

* add deepseek-r1 reasoning parser
* fix test

Signed-off-by: pansicheng <sicheng.pan.chn@gmail.com>
Signed-off-by: Pengyun Lin <81065165+LinPoly@users.noreply.github.com>
Co-authored-by: Pengyun Lin <81065165+LinPoly@users.noreply.github.com>
| File | Last commit message | Last commit date |
| --- | --- | --- |
| curl_chat_client_for_multimodal.sh | feat: trtllm-serve multimodal support (#3590) | 2025-04-19 05:01:28 +08:00 |
| curl_chat_client.sh | feat: trtllm-serve multimodal support (#3590) | 2025-04-19 05:01:28 +08:00 |
| curl_completion_client.sh | feat: trtllm-serve multimodal support (#3590) | 2025-04-19 05:01:28 +08:00 |
| deepseek_r1_reasoning_parser.sh | feat: add deepseek-r1 reasoning parser to trtllm-serve (#3354); see the sketch after this table | 2025-05-06 08:13:04 +08:00 |
| genai_perf_client.sh | doc: add genai-perf benchmark & slurm multi-node for trtllm-serve doc (#3407) | 2025-04-16 00:11:58 +08:00 |
| openai_chat_client_for_multimodal.py | feat: trtllm-serve multimodal support (#3590) | 2025-04-19 05:01:28 +08:00 |
| openai_chat_client.py | doc: refactor trtllm-serve examples and doc (#3187) | 2025-04-04 11:40:43 +08:00 |
| openai_completion_client.py | doc: refactor trtllm-serve examples and doc (#3187) | 2025-04-04 11:40:43 +08:00 |
| README.md | doc: refactor trtllm-serve examples and doc (#3187) | 2025-04-04 11:40:43 +08:00 |
| requirements.txt | doc: add genai-perf benchmark & slurm multi-node for trtllm-serve doc (#3407) | 2025-04-16 00:11:58 +08:00 |
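The deepseek_r1_reasoning_parser.sh script exercises the reasoning parser added in the latest commit above. Below is a minimal Python sketch of the same idea, not the shipped script: it assumes a server is already running on localhost:8000 with the DeepSeek-R1 reasoning parser enabled, and the model id and the `reasoning_content` field name are assumptions; check deepseek_r1_reasoning_parser.sh for the exact launch flags and request.

```python
# Minimal sketch (assumptions noted below), not the shipped example script.
# Assumptions: trtllm-serve is running at localhost:8000 with the DeepSeek-R1
# reasoning parser enabled, the model id is "deepseek-ai/DeepSeek-R1", and the
# parsed chain of thought is surfaced in a `reasoning_content` field.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="dummy")

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1",
    messages=[{"role": "user", "content": "What is 9.11 minus 9.8?"}],
    max_tokens=256,
)

message = response.choices[0].message
# With the parser enabled, the <think>...</think> block is expected to be
# split out of the final answer rather than returned inline.
print("reasoning:", getattr(message, "reasoning_content", None))
print("answer:", message.content)
```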

# Online Serving Examples with trtllm-serve

We provide a CLI command, `trtllm-serve`, to launch a FastAPI server compatible with the OpenAI API. This directory contains client examples for querying that server; you can check the source code here, or refer to the command documentation and examples for detailed information and usage guidelines.
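As a quick orientation, the pattern shared by the Python client examples is to point the standard `openai` client at the server's OpenAI-compatible endpoint. The sketch below is illustrative rather than a copy of openai_chat_client.py; the host, port, model name, and API key are placeholders, so match them to whatever you passed to `trtllm-serve`.

```python
# Minimal sketch: query a running trtllm-serve instance through its
# OpenAI-compatible /v1/chat/completions endpoint.
# The server is assumed to have been started beforehand, e.g. with
# `trtllm-serve <model>` (see the command documentation for exact options).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed default host/port
    api_key="dummy",  # placeholder; adjust if your deployment enforces auth
)

response = client.chat.completions.create(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Where is New York?"},
    ],
    max_tokens=64,
)
print(response.choices[0].message.content)
```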