Madian Khabsa’s Post

Llama 4 is here! We have been working extremely hard to bring the latest version of the Llama models to the world, both in open source and in Meta AI. Here are some highlights:
📌 The Llama series has been redesigned to use a state-of-the-art mixture-of-experts (MoE) architecture and is natively trained with multimodality.
📌 Llama 4 Scout is the highest-performing small model, with 17B activated parameters across 16 experts; it achieves an industry-leading 10M+ token context window and can run on a single GPU!
📌 Llama 4 Maverick is the best multimodal model in its class and ranks 2nd on LMArena! You can learn more about it here: https://www.llama.com/
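For readers unfamiliar with the mixture-of-experts design the post highlights: an MoE layer routes each token to a small subset of expert networks, so only a fraction of the total parameters are "activated" per token (hence 17B activated parameters out of a much larger total). The toy sketch below illustrates generic top-k routing with NumPy; the dimensions and weights are illustrative and are not Llama 4's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MoE layer: a router scores all experts per token, keeps the top-k,
# and the output is the gate-weighted sum of those experts' outputs.
# Dimensions are illustrative only, not Llama 4's real configuration.
D_MODEL, N_EXPERTS, TOP_K = 8, 16, 2

W_router = rng.normal(size=(D_MODEL, N_EXPERTS))            # router weights
W_experts = rng.normal(size=(N_EXPERTS, D_MODEL, D_MODEL))  # one matrix per expert

def moe_layer(x):
    """x: (tokens, d_model) -> (tokens, d_model) via top-k expert routing."""
    logits = x @ W_router                           # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]   # indices of the top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        gates = logits[t, top[t]]
        gates = np.exp(gates - gates.max())
        gates /= gates.sum()                        # softmax over the chosen experts
        for g, e in zip(gates, top[t]):
            out[t] += g * (x[t] @ W_experts[e])     # only k of 16 experts run per token
    return out

tokens = rng.normal(size=(4, D_MODEL))
print(moe_layer(tokens).shape)  # (4, 8)
```

Per token, only `TOP_K` of the 16 experts execute, which is why activated parameters (and compute) stay small even as total capacity grows.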

Joachim Daiber

ML @ Upwork | Ex-Apple, startups

4mo

Amazing work Madian!

Hung-Hsuan Chen

Associate Professor at National Central University

4mo

Awesome work!

Kalpa Gunaratna

Staff Research Scientist & Engineer at Samsung Research America | NLP, LLMs, and Knowledge Graphs | PhD

4mo

Excellent work Madian

Gunnar W. Knutsen

Professor of early modern history at University of Bergen

4mo

Congratulations, Madian!
