[None][chore] Change trt-server to trtllm-server in opentelemetry readme (#9173)

Signed-off-by: Stanley Sun <stsun@nvidia.com>
Co-authored-by: Larry Xu <197874197+LarryXFly@users.noreply.github.com>
This commit is contained in:
Stanley Sun 2025-11-18 14:02:24 +08:00 committed by GitHub
parent 5e5300898b
commit 96cfdd8a72


@@ -49,7 +49,7 @@ export JAEGER_IP=$(docker inspect --format '{{ .NetworkSettings.IPAddress }}' ja
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=grpc
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT=grpc://$JAEGER_IP:4317
export OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
-export OTEL_SERVICE_NAME="trt-server"
+export OTEL_SERVICE_NAME="trtllm-server"
```
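Assuming the exports above have been run, the exporter settings can be sanity-checked before launching the server. The sketch below is not part of the README diff; the fallback `JAEGER_IP` value is a placeholder standing in for the `docker inspect` output:

```shell
# Sanity-check the OTLP exporter settings before launching trtllm-serve.
# JAEGER_IP falls back to a placeholder here; normally it comes from the
# `docker inspect` command shown earlier.
JAEGER_IP="${JAEGER_IP:-172.17.0.2}"
OTEL_EXPORTER_OTLP_TRACES_PROTOCOL=grpc
OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="grpc://$JAEGER_IP:4317"
OTEL_EXPORTER_OTLP_TRACES_INSECURE=true
OTEL_SERVICE_NAME="trtllm-server"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL OTEL_EXPORTER_OTLP_TRACES_ENDPOINT \
       OTEL_EXPORTER_OTLP_TRACES_INSECURE OTEL_SERVICE_NAME

# Jaeger's default OTLP gRPC port is 4317; flag anything that deviates.
case "$OTEL_EXPORTER_OTLP_TRACES_ENDPOINT" in
  grpc://*:4317) echo "OTLP endpoint looks well formed" ;;
  *) echo "unexpected OTLP endpoint: $OTEL_EXPORTER_OTLP_TRACES_ENDPOINT" >&2; exit 1 ;;
esac
```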
Then run TensorRT-LLM with OpenTelemetry, and make sure to set `return_perf_metrics` to true in the model configuration:
@@ -61,7 +61,7 @@ trtllm-serve models/Qwen3-8B/ --otlp_traces_endpoint="$OTEL_EXPORTER_OTLP_TRACES
## Send requests and find traces in Jaeger
You can send a request to the server and view the traces in [Jaeger UI](http://localhost:16686/).
-The traces should be visible under the service name "trt-server".
+The traces should be visible under the service name "trtllm-server".
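Once the server is up, a request can be sent along these lines, after which the trace appears in Jaeger under "trtllm-server". This is a sketch: the port 8000 base URL and the OpenAI-compatible `/v1/completions` route are assumptions, not taken from this diff.

```shell
# Sketch of a request to a running trtllm-serve instance. BASE_URL and the
# completions route are assumptions; adjust them to match your deployment.
BASE_URL="${TRTLLM_BASE_URL:-http://localhost:8000}"
PAYLOAD='{"model": "models/Qwen3-8B/", "prompt": "Hello", "max_tokens": 16}'

echo "POST $BASE_URL/v1/completions"
echo "$PAYLOAD"
# Uncomment once the server is actually running:
# curl -s "$BASE_URL/v1/completions" \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```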
## Configuration for Disaggregated Serving