
Docs

This directory contains the files for building the static HTML documentation with Sphinx.

Build the docs

First, install Sphinx and the other system dependencies:

apt-get install python3-sphinx doxygen python3-pip graphviz

Next, install the required Python packages:

python3 -m pip install -r ./requirements.txt
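
Optionally, you can keep these packages isolated in a virtual environment (a sketch using the standard venv module; the directory name .venv is just an example):

python3 -m venv .venv        # create the environment
. .venv/bin/activate         # activate it in the current shell
python3 -m pip install -r ./requirements.txt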

Then build the docs:

doxygen Doxygen # build C++ docs

make html

The generated HTML pages will be located in the build/html directory.
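
For convenience, the steps above can be collected into a small shell script (a sketch; it assumes you run it from this docs directory and have permission to install system packages):

#!/usr/bin/env bash
set -euo pipefail                                               # stop on the first failing step

# System dependencies: Sphinx, Doxygen, pip, and Graphviz
apt-get install -y python3-sphinx doxygen python3-pip graphviz

# Python packages used by the Sphinx build
python3 -m pip install -r ./requirements.txt

# Generate the C++ API docs, then the HTML pages (output ends up in build/html)
doxygen Doxygen
make html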

Preview the docs locally

The simplest way to preview the docs is with Python's built-in http.server module:

cd build/html

python3 -m http.server 8081

You can then view the pages in your web browser at http://localhost:8081.
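
Alternatively, you can serve the output without changing directories by passing --directory (available in Python 3.7 and later):

python3 -m http.server 8081 --directory build/html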