Description

This folder contains test definitions that are consumed by the trt-test-db tool, which selects tests based on system specifications.

Installation

Install trt-test-db using the following command:

pip3 install --extra-index-url https://urm.nvidia.com/artifactory/api/pypi/sw-tensorrt-pypi/simple --ignore-installed trt-test-db==1.8.5+bc6df7
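
To confirm that the package is available, a standard pip query (not specific to trt-test-db) prints the installed version:

pip3 show trt-test-db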

Test Definition

Test definitions are stored in YAML files located in ${TRT_LLM_ROOT}/tests/integration/test_lists/test-db/. These files define test conditions and the tests to be executed.

Example YAML Structure

version: 0.0.1
l0_e2e:
  - condition:
      terms:
        supports_fp8: true
      ranges:
        system_gpu_count:
          gte: 4
          lte: 4
      wildcards:
        gpu:
          - '*h100*'
        linux_distribution_name: ubuntu*
    tests:
      - examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-8b-enable_fp8]
      - examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-70b-enable_fp8]
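
In this entry, the condition requires fp8 support (terms), exactly four GPUs (the gte/lte bounds under ranges), a GPU name matching the glob pattern '*h100*' (wildcards), and an Ubuntu distribution. Because each context maps to a list, one file can hold several such entries. As a sketch only, an entry targeting a single-GPU A10 machine (similar to the system in the --match-exact example below) might look as follows; the test name is a hypothetical placeholder, not a real test:

version: 0.0.1
l0_e2e:
  - condition:
      ranges:
        system_gpu_count:
          gte: 1
          lte: 1
      wildcards:
        gpu:
          - '*a10*'
        linux_distribution_name: ubuntu*
    tests:
      # hypothetical placeholder, not an actual test
      - examples/test_example.py::test_single_gpu_case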

Generating Test Lists

Use trt-test-db to generate a test list based on the system configuration:

trt-test-db -d /TensorRT-LLM/src/tests/integration/test_lists/test-db \
            --context l0_e2e \
            --test-names \
            --output /TensorRT-LLM/src/l0_e2e.txt \
            --match-exact '{"chip":"ga102gl-a","compute_capability":"8.6","cpu":"x86_64","gpu":"A10","gpu_memory":"23028.0","host_mem_available_mib":"989937","host_mem_total_mib":"1031949","is_aarch64":false,"is_linux":true,"linux_distribution_name":"ubuntu","linux_version":"22.04","supports_fp8":false,"supports_int8":true,"supports_tf32":true,"sysname":"Linux","system_gpu_count":"1",...}'

This command generates a test list file (l0_e2e.txt) based on the specified context and system configuration.
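
The output is a plain-text list with one test per line, in the form consumed by pytest's --test-list option. For a machine matching the 4-GPU H100 entry shown earlier, it would be expected to contain lines such as:

examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-8b-enable_fp8]
examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-70b-enable_fp8]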

Running Tests

Execute the tests using pytest with the generated test list:

pytest -v --test-list=/TensorRT-LLM/src/l0_e2e.txt --output-dir=/tmp/logs

This command runs the tests specified in the test list and outputs the results to the specified directory.
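
The two steps can also be chained in a small wrapper script. The following is a minimal sketch using only the commands shown above; SYSTEM_INFO_JSON is an assumed environment variable holding the system-description JSON, not something trt-test-db provides:

#!/bin/bash
set -e

# Paths as used in the examples above.
TEST_DB_DIR=/TensorRT-LLM/src/tests/integration/test_lists/test-db
TEST_LIST=/TensorRT-LLM/src/l0_e2e.txt

# Generate the machine-specific test list for the l0_e2e context.
trt-test-db -d "${TEST_DB_DIR}" \
            --context l0_e2e \
            --test-names \
            --output "${TEST_LIST}" \
            --match-exact "${SYSTEM_INFO_JSON}"

# Run the selected tests and write logs to /tmp/logs.
pytest -v --test-list="${TEST_LIST}" --output-dir=/tmp/logs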

Additional Information

  • The --context parameter in the trt-test-db command specifies which context to search in the YAML files.
  • The --match-exact parameter provides system information used to filter tests based on the conditions defined in the YAML files.
  • Modify the YAML files to add or update test conditions and test cases as needed (an example sketch follows this list).
  • For more detailed information on trt-test-db and pytest usage, refer to their respective documentation.
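
For example, adding a hypothetical test case to the l0_e2e entry shown earlier only requires appending it under the tests key (the new test name below is a placeholder):

    tests:
      - examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-8b-enable_fp8]
      - examples/test_llama.py::test_llm_llama_v3_1_1node_multi_gpus[llama-3.1-70b-enable_fp8]
      # hypothetical placeholder for a newly added test
      - examples/test_llama.py::test_new_feature_case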