
Meta Unveils Llama 4 AI Models: Challenging Industry Leaders

Technology conglomerate Meta has introduced Llama 4, its latest suite of artificial intelligence models, now powering the Meta AI assistant on the web and within WhatsApp, Messenger, and Instagram. The release includes two new models available for download from Meta and Hugging Face: Llama 4 Scout, a compact model designed to run on a single Nvidia H100 GPU, and Llama 4 Maverick, positioned to compete with advanced models such as GPT-4o and Gemini 2.0 Flash. Mark Zuckerberg, CEO of Meta, stated that Llama 4 Behemoth, currently under development, is anticipated to be the “highest performing base model globally.”

Llama 4 Scout and Maverick: Performance Benchmarks

Llama 4 Scout: Efficiency and Capabilities

Meta reports that Llama 4 Scout offers a 10-million-token context window, which serves as the model’s working memory. The company asserts that Scout surpasses Google’s Gemma 3 and Gemini 2.0 Flash-Lite models, as well as the open-source Mistral 3.1, across a wide array of established benchmarks, all while retaining the ability to run on a single Nvidia H100 GPU, underscoring its efficiency.

Llama 4 Maverick: Competing with Top-Tier Models

Similarly, Meta makes performance claims for the larger Maverick model, citing results comparable to OpenAI’s GPT-4o and Google’s Gemini 2.0 Flash. Furthermore, Meta indicates that Maverick achieves results on par with DeepSeek-V3 in coding and reasoning tasks, utilizing “less than half the active parameters,” suggesting enhanced resource utilization.

Llama 4 Behemoth: A High-Performance Contender

Behemoth’s Specifications and Anticipated Performance

Llama 4 Behemoth has 288 billion active parameters out of a total of 2 trillion. Although not yet released, Meta projects that Behemoth will outperform competitors, specifically naming GPT-4.5 and Claude Sonnet 3.7, “on multiple STEM benchmarks,” positioning it as a potentially leading model in scientific and technical domains.

Technological Innovations and Future Plans

Mixture of Experts Architecture

Meta has implemented a “mixture of experts” (MoE) architecture for the Llama 4 series. Rather than running the entire network on every input, an MoE model routes each token to a small subset of specialized sub-networks, or “experts,” so only a fraction of the total parameters is active at any given time. This is why Behemoth can have 2 trillion parameters overall while activating only 288 billion per token, reducing computational demands relative to a dense model of the same size.
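The routing idea can be illustrated with a toy sketch. This is not Meta’s implementation, just a minimal, self-contained illustration of how a gating network selects a few experts per token; all names, dimensions, and the top-1 routing choice are illustrative assumptions.

```python
import math
import random

random.seed(0)

DIM = 8        # toy hidden dimension (illustrative)
N_EXPERTS = 4  # total experts in the layer
TOP_K = 1      # experts actually activated per token

# Each expert is a toy feed-forward layer: a DIM x DIM weight matrix.
experts = [[[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(DIM)]
           for _ in range(N_EXPERTS)]
# The router scores every expert for a given token.
router = [[random.gauss(0, 0.1) for _ in range(DIM)] for _ in range(N_EXPERTS)]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def matvec(w, x):
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in w]

def moe_forward(token):
    """Route a token to its TOP_K highest-scoring experts only."""
    scores = softmax(matvec(router, token))
    top = sorted(range(N_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]
    out = [0.0] * DIM
    for i in top:  # only the chosen experts run; the rest stay idle
        y = matvec(experts[i], token)
        out = [o + scores[i] * yi for o, yi in zip(out, y)]
    return out, top

token = [random.gauss(0, 1) for _ in range(DIM)]
out, chosen = moe_forward(token)
print("experts activated:", chosen, "of", N_EXPERTS)
```

With `TOP_K = 1` of 4 experts, only a quarter of this layer’s expert parameters participate in each forward pass, which is the same mechanism that lets an MoE model report a much smaller “active parameter” count than its total.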

LlamaCon Conference: Future Directions

The company has announced plans to elaborate on future strategies for its AI models and related products at the upcoming LlamaCon conference, scheduled for April 29th. This event is expected to provide further insights into Meta’s AI development roadmap.

Open Source Designation and Licensing Considerations

Llama 4: Open Source or Open Access?

As with previous releases, Meta describes the Llama 4 suite as “open-source.” However, the Llama license has faced scrutiny over its restrictions. Notably, it requires commercial entities with more than 700 million monthly active users to obtain Meta’s authorization before deploying the models commercially. In 2023, the Open Source Initiative noted that this condition places Llama “out of the category of ‘Open Source’,” raising questions about how open the models truly are.
