From 68e774ff9ed0d1e96e3667f6c308712d40838353 Mon Sep 17 00:00:00 2001
From: Mike Iovine
Date: Fri, 25 Apr 2025 10:04:24 -0700
Subject: [PATCH] [chore] Add Llama 4 Maverick to quickstart README (#3848)

Signed-off-by: Mike Iovine <6158008+mikeiovine@users.noreply.github.com>
---
 examples/pytorch/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/examples/pytorch/README.md b/examples/pytorch/README.md
index a142b8e51e..5441cbd29f 100644
--- a/examples/pytorch/README.md
+++ b/examples/pytorch/README.md
@@ -51,7 +51,7 @@ python3 quickstart_multimodal.py --model_dir Efficient-Large-Model/NVILA-8B --mo
 | `LlavaLlamaModel` | VILA | `Efficient-Large-Model/NVILA-8B` | L + V |
 | `LlavaNextForConditionalGeneration` | LLaVA-NeXT | `llava-hf/llava-v1.6-mistral-7b-hf` | L + V |
 | `LlamaForCausalLM` | Llama 3.1, Llama 3, Llama 2, LLaMA | `meta-llama/Meta-Llama-3.1-70B` | L |
-| `Llama4ForConditionalGeneration` | Llama 4 | `meta-llama/Llama-4-Scout-17B-16E-Instruct` | L |
+| `Llama4ForConditionalGeneration` | Llama 4 Scout/Maverick | `meta-llama/Llama-4-Scout-17B-16E-Instruct`, `meta-llama/Llama-4-Maverick-17B-128E-Instruct` | L |
 | `MistralForCausalLM` | Mistral | `mistralai/Mistral-7B-v0.1` | L |
 | `MixtralForCausalLM` | Mixtral | `mistralai/Mixtral-8x7B-v0.1` | L |
 | `MllamaForConditionalGeneration` | Llama 3.2 | `meta-llama/Llama-3.2-11B-Vision` | L |
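
The patch only adds the `meta-llama/Llama-4-Maverick-17B-128E-Instruct` checkpoint to the support table; it does not show usage. Below is a minimal sketch (not part of the patch) of how one might load this checkpoint with TensorRT-LLM's high-level LLM API. The `tensor_parallel_size` value and the sampling settings are assumptions chosen for illustration and should be adjusted to your hardware and TensorRT-LLM version.

```python
# Hypothetical sketch: loading the newly listed Llama 4 Maverick checkpoint
# via TensorRT-LLM's LLM API. Parallelism and sampling settings are assumed,
# not taken from the patch.
from tensorrt_llm import LLM, SamplingParams


def main():
    # Maverick is a large MoE model (17B active parameters, 128 experts);
    # multiple GPUs are assumed here via tensor parallelism.
    llm = LLM(
        model="meta-llama/Llama-4-Maverick-17B-128E-Instruct",
        tensor_parallel_size=8,
    )
    sampling_params = SamplingParams(max_tokens=64, temperature=0.8)

    outputs = llm.generate(["Hello, my name is"], sampling_params)
    for output in outputs:
        # Each request output carries the generated completions.
        print(output.outputs[0].text)


if __name__ == "__main__":
    main()
```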