Keftek

Meta Releases Llama 4 Scout and Maverick: Open MoE With 10M Context

Open-weight, natively multimodal mixture-of-experts (MoE) models with 17B active parameters each. Scout offers a 10M-token context window, while Maverick (128 experts) outperforms GPT-4o on multimodal benchmarks.
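The "17B active parameters" figure reflects MoE routing: each token is processed by only a subset of the experts plus the shared (attention/embedding) weights, so per-token compute scales with active parameters, not the model's total size. A toy calculation sketches the idea; the parameter budgets and top-1 routing below are illustrative assumptions, not Meta's published breakdown:

```python
def active_params(total_expert_params: int, num_experts: int,
                  experts_per_token: int, shared_params: int) -> int:
    """Parameters actually used in one token's forward pass."""
    per_expert = total_expert_params // num_experts
    return shared_params + experts_per_token * per_expert

# Hypothetical split: 380B of expert weights across 128 experts,
# top-1 routing, plus 14B shared parameters.
print(active_params(380_000_000_000, 128, 1, 14_000_000_000))
# → 16968750000  (~17B active, despite ~394B total)
```

The same routing logic explains how Scout and Maverick can share the 17B active-parameter count while differing greatly in expert count and total size.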

LLM Infra · Industry · Multi-Model