| Name | Last commit | Date |
| --- | --- | --- |
| utils/ | refactor: Speculative decoding buffers part 2 (#5316) | 2025-06-27 17:41:48 +02:00 |
| allocateKvCache.cpp | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| assignReqSeqSlots.cpp | refactor: Remove enforced sorted order of batch slots (#3502) | 2025-07-14 17:23:02 +02:00 |
| cacheFormatter.cpp | [None][fix] revert kvcache transfer (#6709) | 2025-08-08 07:18:53 -04:00 |
| cacheFormatter.h | [None][feat] move kv cache measure into transfer session (#6633) | 2025-08-08 17:49:22 +08:00 |
| cacheTransBuffer.cpp | chore: [BREAKING CHANGE] use cacheTransceiverConfig as knobs for disagg service (#5234) | 2025-07-17 17:42:07 +08:00 |
| cacheTransBuffer.h | chore: [BREAKING CHANGE] use cacheTransceiverConfig as knobs for disagg service (#5234) | 2025-07-17 17:42:07 +08:00 |
| cacheTransceiver.cpp | [None][chore] ucx establish connection with zmq (#6090) | 2025-08-05 02:50:45 -04:00 |
| capacityScheduler.cpp | refactor: Scheduling based on KV cache state (#4865) | 2025-06-16 08:14:58 +02:00 |
| CMakeLists.txt | feat: Support structural tag in C++ runtime and upgrade xgrammar to 0.1.21 (#6408) | 2025-07-31 09:53:52 +08:00 |
| contextProgress.cpp | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| createNewDecoderRequests.cpp | [TRTLLM-6785][feat] BREAKING CHANGE Enable TRTLLM sampler by default (#6216) | 2025-08-07 22:19:37 -04:00 |
| dataTransceiver.cpp | [None][feat] move kv cache measure into transfer session (#6633) | 2025-08-08 17:49:22 +08:00 |
| dataTransceiver.h | [None][feat] move kv cache measure into transfer session (#6633) | 2025-08-08 17:49:22 +08:00 |
| dataTransceiverImpl.cpp | [None][feat] move kv cache measure into transfer session (#6633) | 2025-08-08 17:49:22 +08:00 |
| dataTransceiverImpl.h | [None][feat] move kv cache measure into transfer session (#6633) | 2025-08-08 17:49:22 +08:00 |
| decoderBuffers.cpp | refactor: Enhanced handling of decoder requests and logits within the batch manager (#6055) | 2025-07-18 12:12:08 +02:00 |
| encoderBuffers.cpp | Feat: Variable-Beam-Width-Search (VBWS) part4 (#3979) | 2025-05-12 22:32:29 +02:00 |
| encoderBuffers.h | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| evictionPolicy.cpp | [JIRA-5226219][fix] Fix Bug in KV cache manager (#4596) | 2025-05-29 22:03:20 -07:00 |
| guidedDecoder.cpp | [TRTLLM-6637][feat] Resolve KV cache divergence issue (#6628) | 2025-08-09 23:15:04 +08:00 |
| handleContextLogits.cpp | refactor: Enhanced handling of decoder requests and logits within the batch manager (#6055) | 2025-07-18 12:12:08 +02:00 |
| handleGenerationLogits.cpp | refactor: Enhanced handling of decoder requests and logits within the batch manager (#6055) | 2025-07-18 12:12:08 +02:00 |
| kvCacheEventManager.cpp | [TRTLLM-6881][feat] Include attention dp rank info with KV cache events (#6563) | 2025-08-07 14:17:07 +02:00 |
| kvCacheManager.cpp | [None][chore][kv cache manager] Dead code elimination, we no longer record/fetch through WindowBlockManager::mContextBlocksByHash (#6249) | 2025-08-10 09:10:10 -04:00 |
| kvCacheTransferManager.cpp | [fix] Fix illegal mem access and possible accuracy loss. Cherry-pick … (#5017) | 2025-06-09 17:50:57 +08:00 |
| llmRequest.cpp | [TRTLLM-6683][feat] Support LoRA reload CPU cache evicted adapter (#6510) | 2025-08-07 09:05:36 +03:00 |
| logitsPostProcessor.cpp | refactor: Enhanced handling of decoder requests and logits within the batch manager (#6055) | 2025-07-18 12:12:08 +02:00 |
| loraBuffers.cpp | fix: [nvbugs/5287097] Align PP layer distribution between pytorch and TRT flow. (#4399) | 2025-05-19 14:25:36 -07:00 |
| loraBuffers.h | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| makeDecodingBatchInputOutput.cpp | refactor: Enhanced handling of decoder requests and logits within the batch manager (#6055) | 2025-07-18 12:12:08 +02:00 |
| medusaBuffers.cpp | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| microBatchScheduler.cpp | [nvbugs/5274894] fix: Sort requests for functional correctness and performance (adapted from #4608) (#4621) | 2025-05-26 17:10:55 +08:00 |
| mlaCacheFormatter.cpp | [None][fix] revert kvcache transfer (#6709) | 2025-08-08 07:18:53 -04:00 |
| mlaCacheFormatter.h | [TRTLLM-6549] chore: record delay introduced by disaggregated serving in kv cache measure (#6135) | 2025-07-30 10:39:40 +08:00 |
| pauseRequests.cpp | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| peftCacheManager.cpp | [TRTLLM-6683][feat] Support LoRA reload CPU cache evicted adapter (#6510) | 2025-08-07 09:05:36 +03:00 |
| promptTuningBuffers.cpp | perf: Removing initializing ptuning buffers to zero (#4915) | 2025-06-09 21:57:21 -04:00 |
| rnnStateBuffers.cpp | [TRTLLM-5171] chore: Remove GptSession/V1 from TRT workflow (#4092) | 2025-05-14 23:10:04 +02:00 |
| rnnStateBuffers.h | Update TensorRT-LLM (#2873) | 2025-03-11 21:13:42 +08:00 |
| rnnStateManager.cpp | fix: [nvbugs/5287097] Align PP layer distribution between pytorch and TRT flow. (#4399) | 2025-05-19 14:25:36 -07:00 |
| runtimeBuffers.cpp | Revert "feat: nanobind bindings (#5961)" (#6160) | 2025-07-18 10:12:54 +08:00 |
| scheduledBlocksManager.h | refactor: Scheduling based on KV cache state (#4865) | 2025-06-16 08:14:58 +02:00 |
| sequenceSlotManager.cpp | refactor: Remove enforced sorted order of batch slots (#3502) | 2025-07-14 17:23:02 +02:00 |
| transformerBuffers.cpp | refactor: remove batch_manager::KvCacheConfig and use executor::KvCacheConfig instead (#5384) | 2025-06-26 19:45:52 +08:00 |
| trtEncoderModel.cpp | refactor: remove TrtGptModelOptionalParams (#5165) | 2025-06-20 10:31:40 +02:00 |
| trtEncoderModel.h | refactor: remove TrtGptModelOptionalParams (#5165) | 2025-06-20 10:31:40 +02:00 |
| trtGptModel.h | refactor: remove TrtGptModelOptionalParams (#5165) | 2025-06-20 10:31:40 +02:00 |
| trtGptModelFactory.h | refactor: remove TrtGptModelOptionalParams (#5165) | 2025-06-20 10:31:40 +02:00 |
| trtGptModelInflightBatching.cpp | [None][chore][kv cache manager] Dead code elimination, we no longer record/fetch through WindowBlockManager::mContextBlocksByHash (#6249) | 2025-08-10 09:10:10 -04:00 |
| trtGptModelInflightBatching.h | [nvbug/5374773] chore: Add a runtime flag to enable fail fast when attn window is too large to fit at least one sequence in KV cache (#5974) | 2025-07-25 18:10:40 -04:00 |
| updateDecoderBuffers.cpp | refactor: Speculative decoding buffers part 2 (#5316) | 2025-06-27 17:41:48 +02:00 |